This is just a quick note that I have an Op-Ed appearing in the Baltimore Sun today. I discuss how the United States has seen a slow erosion in the appreciation for and respect of science. We need to recognize this trend, and fight back by engaging with our fellow citizens on scientific topics.
This Saturday, marches in support of science will be held in hundreds of cities across the globe. The event should be an excellent opportunity to reinject science into the public consciousness.
The American Association for the Advancement of Science (AAAS), the world’s largest general scientific society, held an event on April 19 offering advice on how to advocate for science beyond the march. Here I share some of their strategies for interacting with Congress, the media, and the public.
Despite what many people think, citizens can influence Congress. In fact, in a survey of senior Congressional staffers, respondents reported that when their representative has not already arrived at a firm decision on an issue, contact from a constituent is about five times more persuasive than contact from a lobbyist.
Being influential, however, is about more than just being right. Congressional offices receive roughly 250 requests per day, so there are a few things you can do to stand out in an office that is essentially a triage unit.
- Ask for something concrete your representative can realistically deliver on.
- Explain why it is urgent.
- Make your pitch concise (< 10 minutes) and develop a one-page handout to leave after the meeting. Keep politics out of it!
- Be engaging! Tell a real story, preferably about someone who has one foot in your world, and one foot in your representative’s.
While your initial contacts with an office may be met with no response, be persistent. You can get that meeting!
Scientists are considered the most trustworthy spokespersons for science. But communicating effectively with the media requires that you do your homework and know your audience (e.g. business, technical, students).
You will want to have a well-honed, practiced elevator pitch. It should succinctly lay out the research problem, why it matters, and what the take home message is (i.e. what you can say that will lead to a longer conversation). You can always bridge back to it if you get questions you are not ready for, or if the interview otherwise is not going smoothly. Ask the reporter how they plan to frame the article. Use that as an opportunity to correct any inaccuracies.
It’s advantageous to build personal relationships with journalists. Inviting them to visit your laboratory, sending them relevant background information, connecting on social media, and just generally being cordial can help you become a trusted and go-to source.
Perhaps the most important question to ask yourself when communicating science to the public is, “Why am I doing this?” Perhaps it is to increase interest in science, or to share knowledge. Maybe you want to inspire the next generation to enter the discipline, or increase trust between scientists and the public.
Once you are clear about your purpose, abide by these tenets:
- Don’t “dumb down” your science or treat your audience like idiots. Disdain is an ineffective communication technique.
- Ditch the jargon. For example, the public has a different understanding of the phrase “positive feedback” than scientists do. Instead use something more clearly understood, like “vicious cycle.”
- Create a dialogue so that you know where your audience is at. Let them know they are being heard.
- Reverse the order of a scientific talk. Start with the conclusions, explain why the issue matters, then finish with the background details.
Be enthusiastic! Put your own face on science and demonstrate what keeps you motivated. Offer solutions, and sidestep landmines (e.g. focus on clean energy with someone who thinks climate change is a hoax).
Doing all of this on your own can be daunting and time consuming. Know the resources to make your life easier. Contact your university, institute, or relevant scientific society to collect their outreach materials. Find groups in your local community that you can partner with, like those who are already gathering an audience and where you might be permitted to speak.
There are many other available resources. Research!America holds science communication workshops that train people to better communicate medical research. Spectrum Science Communications helps “develop unique stories that create game-changing conversations to influence audiences and differentiate your brand.” AAAS is launching an advocacy toolkit, and many disciplinary organizations, like the Society for Neuroscience and American Physical Society have their own resources.
California governor Jerry Brown was a guest speaker at the American Geophysical Union’s Fall Meeting in San Francisco on December 14, 2016. A strong supporter and defender of science, Jerry Brown gave an impassioned speech regarding how California was going to stand up to the threats against science posed by the Trump administration. The governor’s spirit should serve as inspiration to scientists everywhere.
Donald Trump’s election has worried many Americans for a variety of reasons. One of those reasons – and one that was largely ignored during the campaign – is its impact on science. Given Trump’s lack of firm policy proposals and occasionally contradictory statements, there is much uncertainty in this regard. For that reason, I want to delve into what we can expect from the new Republican establishment in three key areas – science funding, climate change, and the role of science in government.
In all likelihood, the amount that the U.S. spends funding scientific research will be tightly linked to our total discretionary spending (i.e. non-military, non-entitlement). Trump has promised to dramatically increase military spending, keep entitlements fixed, and lower taxes without increasing the deficit. Discretionary spending would have to be cut under that scenario. A budget for the current fiscal year (FY 2016-17) was supposed to be passed by October 1, but Congress didn’t get it done in time. When this happens, Congress passes a continuing resolution (CR) that funds the new fiscal year at the previous year’s levels.
That puts us in a position where one of two things is likely to happen. Either the current Congress can attempt to complete its own budget by the end of the year or, if it better serves their priorities, the Republicans can decide to pass another CR and wait to start fresh in 2017.
A continuing resolution may or may not be good news for scientists. The current proposed budget contains funding increases for some scientific agencies that could be lost if it goes unpassed. On the other hand, waiting until next year introduces the risk of significant spending cuts. Some of that money would probably be returned to the states, and could be redistributed to scientists through different channels, though that is far from guaranteed. Either way, scientific grants typically last for three to five years, so expect any funding changes to take years to work their way through the system.
It is important to distinguish between science that is nonideological, like health research, and that which has become ideological, like climate change. On the latter issue, Donald Trump has famously called climate change a “hoax” invented by the Chinese to reduce American competitiveness, a statement that ignores the substantial progress China is making in reducing its own emissions.
Trump has also expressed a desire to increase usage of fossil fuels (including “clean coal”) and pull the U.S. out of the Paris Climate Agreement. While we are bound to this international treaty for at least the next four years, the President could opt to ignore its non-binding emissions targets. Failing to meet our commitments would diminish America’s moral authority and could disincentivize other nations, like India, from meeting their own targets.
America’s emissions pledges were based on a number of Obama-driven policies, like the Clean Power Plan (CPP), which directed the Environmental Protection Agency (EPA) to set limits on greenhouse gas emissions from power plants. The CPP will almost certainly be killed (expect legal challenges), but removing the federal requirement will not impede states from proceeding on their own, which many are. Furthermore, a Trump administration will be largely powerless to undo the economic forces that are leading to coal’s decline, chiefly the low price of natural gas.
Trump has expressed a desire to eliminate the EPA, but the agency will be difficult to do away with altogether, as doing so requires congressional approval and will be met by extremely strong political resistance. Nor would it help matters to head the agency with noted climate denier Myron Ebell, as has been rumored. Ebell has called for the Senate to prohibit funding for the Paris agreement and the UNFCCC.
However, the federal government is obligated under the 1970 Clean Air Act to regulate the emissions of carbon dioxide into the atmosphere. The Republicans may choose to defund the agency’s regulation efforts, an action that will almost certainly meet legal resistance from environmental groups and large swaths of the general public. While the Republicans will not be able to ignore the scientific evidence and mounting public pressure forever, any delay in implementation would be especially damaging given how far behind the curve we already are in our mitigation efforts.
Given Trump’s strong pro-fossil fuel statements, it’s possible that the Keystone XL pipeline will be approved by the U.S. State Department. Financial support for federally funded renewable energy technologies is also at risk. The Alliance of Automobile Manufacturers has already asked Trump’s transition team to roll back the requirement that cars and light-duty trucks average 54.5 miles per gallon by 2025.
A more general question is what role science will take within a Trump administration. President Obama nominated his chief science advisor John Holdren on inauguration day, signaling the position’s importance to his administration. Trump’s transition has been far less organized, and he has given little indication who his science advisor will be or what role they will serve. Even a qualified appointee could be effectively neutered if the Office of Science and Technology Policy (the office they would head) were disempowered, or if they were unable to permeate Trump’s inner circle. This position requires Senate confirmation, so it could potentially go unfilled for some time.
This would clearly be a mistake, as the next administration must be ready for future crises, like a Deepwater Horizon spill or a viral outbreak, that demand scientific literacy. It is unclear whether President Trump would prioritize the best scientific evidence over political considerations. The new administration will also have to consider whether the U.S. is to remain an active participant in international scientific enterprises like the International Thermonuclear Experimental Reactor (ITER) and whether there will be free movement of researchers. Trump’s tax proposals will answer whether he intends to incentivize private investment in basic research.
Executive agencies like the EPA and the National Oceanic and Atmospheric Administration (NOAA) are populated by career civil servants who are intentionally difficult to fire, a safeguard against political transitions. However, Trump has suggested downsizing the federal workforce by instituting a hiring freeze, reducing their job security, and cutting agency funding.
Even though Trump has expressed an interest in cutting the Department of Education, STEM education should largely be safe, especially since only about 10% of education funding comes from the federal government. Even Republicans realize that a highly educated workforce is a prerequisite for our international competitiveness.
Historically, science has been one of the few bipartisan issues. I suspect this will largely continue at the budgetary level, though the priorities may shift. I have reason to worry about federal climate mitigation efforts, but wonder whether Trump’s lack of a fully competent transition team might lead some lesser-known scientific programs to experience a kind of benign neglect. Either way, we must remain vigilant to ensure science is being represented as it should be.
Several years ago I found myself in a room with people on the forefront of the climate movement. Among their ranks were journalists, advocates, and members of nonprofit organizations. These science communicators had gathered to address an issue each of them had been grappling with – how do I find all of the information that I need and communicate it with the people that need to hear it?
The questions seemed so fundamental that I had assumed everyone in attendance already knew the answers. I didn’t, of course, because I was the outsider. As an astrophysicist, research for me is relatively straightforward. There are a limited set of journals that cover our field and a convenient web interface, NASA’s Astrophysics Data System (ADS), to search across their articles.[1] The program not only links users to all references in an article’s bibliography, but also reports which papers ended up citing that article. Smart engines can even recommend other papers to read based on your selections.
I have found tracking down information online in the realm of climate/energy policy to be more difficult. There are many more organizations doing independent research or running their own initiatives. Think tanks, NGOs, and government agencies are more likely to publish and promote on their own websites than through peer-reviewed journals. The impacts of climate change are so vast that they cut across traditional academic disciplines. They influence weather, oceans, atmospheres, ecosystems, human health, urban development, energy systems, breakthrough technologies, and many more.
When information is so widely dispersed, and we lack smart engines to find it automatically for us, what should our information collection strategy be? I don’t profess to have the “right answer” to this problem, should one even exist. But I’ve spent enough time gathering suggestions from others and trying them out for myself that I felt compelled to report some of the strategies and sources that have worked for me.
Before I begin, I want to comment that you can’t put everything together overnight. I’ve found that so much of the process is just keeping your ear to the ground. When an article I’m reading references an organization with which I’m unfamiliar, I jot it down. I visit their website, make a note about their mission and, if they have them, subscribe to their newsletter and Twitter feeds. I use Twitter lists to tag the feeds and keep them organized.
A great first source for content is Google, which offers among the best suite of tools for aggregating real-time news. Through Google News, you can personalize your news feed to return only the topics and regions you are interested in. The service allows you to specify whether you want content rarely, occasionally, sometimes, often, or always. Google Alerts goes a step further and contacts you when new information becomes available. Many news outlets offer the same capability.
If you are having difficulty deciding what’s important in the moment, the very cool newsmap may be the tool for you. Powered by Google’s search engine, newsmap visualizes the news by separating it into color-coded categories like World, National, Business, Technology, Sports, Entertainment, and Health. The color saturation reflects how old the story is, while the size shows how much it is being reported online. As with Google, you can filter by country and news source. It’s a handy way to ascertain what’s hot right now.
Over time, or perhaps through a mentor, you may discover that your field has its own news/reference engines. Lawyers gather their research through the library database LexisNexis. Climate and energy folks have the Global Reference on the Environment, Energy, and Natural Resources (GREENR). Environment & Energy Publishing reports all the top developments. The news and analysis website Responding to Climate Change (RTCC) provides the latest news regarding low carbon developments.
Another great way to be exposed to new content is through Flipboard.[2] After signing up, Flipboard presents you with an absurd number of topics to choose from. They range from the conventional (e.g. religion, technology, art) to the more specific (e.g. industrial design, startups, social justice). You select the topics that interest you and Flipboard scours the web to produce a curated magazine readable on most devices. You can also stumble upon new content using, well, StumbleUpon. It has the same idea, but rather than curating material, it randomly deposits you at relevant webpages until you press a button to “stumble” to the next one. I have found a lot of really excellent content through this service.
Because the combined readership of an article or report is likely to possess more cumulative knowledge than the authors themselves, one should never discount the value of user comments. Sites like the New York Times and Ars Technica have great comment engines where user contributions can be elevated to “reader’s picks” or “editor’s picks”. It’s a great way to sample the wisdom of the masses and be exposed to a much broader perspective.
It literally took me years to assemble the repository of references I now possess. In the world of climate and energy policy, I found that information typically arrives in one of three forms – organizational reports, raw or lightly processed data, and independent projects.
Organizational reports are usually published by issue-focused research groups. For climate and energy, there are way more than I could name here. These include the National Academy of Sciences, the United States Global Change Research Program, the Union of Concerned Scientists, the Information Technology and Innovation Foundation, Brookings, Energy Innovation, and many more.
Two of my personal favorites are the George Mason University Center for Climate Change Communication (4C) and the Yale Project on Climate Change Communication. These academic centers were created to conduct unbiased social science research on how people engage with climate change. They discovered that people are more concerned about “global warming” than “climate change.” They reported what weathercasters think about climate change and its impact on weather, and questioned whether the level of scientific consensus on climate change ought to be communicated numerically or non-numerically.
The second form information arrives in is raw or processed datasets. Government agencies like NASA and the National Oceanic and Atmospheric Administration (NOAA) are great resources here, as they have tons of images, datasets, and visualization tools that let you tell your own story from primary sources. The U.S. Energy Information Administration (EIA) and International Energy Agency (IEA) also offer tons of data to play around with.
Some groups are content to curate data in very specific ways. The Database of State Incentives for Renewables and Efficiency (DSIRE) categorizes state policies that promote renewable energy as either financial incentives or rules and regulations. Frack Track provides a self-described “geospatial policy tool” that analyzes and visualizes Pennsylvania’s new wave of gas development on the Marcellus shale. Wells, permitted sites, and locations of violations are provided on a map.
The third form is independent projects, a term that I’m admittedly using as something of a catchall. These include initiatives that aim to tell the story of climate change in unique ways. For example, for their project Atlantic Rising, three friends set out to travel the contour line one meter above sea level to see what life would be like in a flooded world. They interacted with thousands of people in 22 countries, gathering photos, film, and writings as they documented the changing lives of those along the rim.
Photographer John Weller believes the best way to protect the environment is by reminding people of nature’s visceral beauty. He spent a decade traveling to the rough waters of the Ross Sea, probably the last undamaged ocean ecosystem left on Earth. His stunning photographs of the region’s living creatures, both above and below the water, have been cataloged in the book The Last Ocean.
Finally, it is sometimes most useful to just speak to people personally. While conferences can be a great place to do this, these environments can be intimidating for newcomers to a field. There are some tricks you can employ to make this process go more smoothly, but I will reserve them for a future post.
Of course, simply having information is not enough. You must synthesize and deliver it to your audience in an effective way. This raises a whole new set of challenges that I will get into in my next post.
Notes

[1] Friends in other fields have sung the praises of similar programs like EndNote and Mendeley.
[2] And Zite, which Flipboard recently acquired.
In the last 15 years, science has experienced a revolution. The emergence of sophisticated sensor networks, digital imagery, Internet search and social media posts, and the fact that pretty much everyone is walking around with a smartphone in their pocket has enabled data collection on unprecedented scales. New supercomputers with petabytes of storage, gigabytes of memory, tens of thousands of processors, and the ability to transfer data over high speed networks permit scientists to understand that data like never before.
Research conducted under this new Big Data paradigm (aka eScience) falls into two categories – simulation and correlation. In simulations, scientists assume a model for how a system operates. By perturbing the model’s parameters and initial conditions, it becomes possible to predict outcomes under a variety of conditions. This technique has been used to study climate models, turbulent flows, nuclear science, and much more.
The second approach – correlation – involves gathering massive amounts of real data from a system, then studying it to discover hidden relationships (i.e. correlations) between measured values. One example would be studying which combination of factors like drought, temperature, per capita GDP, cell phone usage, local violence, food prices, and more affect the migratory behavior of human populations.
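The flavor of the correlation approach can be sketched in a few lines. The example below is purely illustrative: the "drought," "food price," and "migration" series are synthetic numbers generated with known relationships, standing in for the observational datasets a real study would use.

```python
import numpy as np

# Hypothetical illustration of correlation mining: which factors track a
# migration index? All series here are synthetic, with relationships we
# planted ourselves so the method has something to find.
rng = np.random.default_rng(0)
n = 500

drought = rng.normal(size=n)
food_prices = 0.6 * drought + rng.normal(scale=0.5, size=n)  # prices rise with drought
gdp = rng.normal(size=n)                                     # deliberately unrelated
migration = 0.8 * food_prices + rng.normal(scale=0.5, size=n)

# Pearson correlation of each candidate factor with the outcome
factors = {"drought": drought, "food_prices": food_prices, "gdp": gdp}
for name, series in factors.items():
    r = np.corrcoef(series, migration)[0, 1]
    print(f"{name:12s} r = {r:+.2f}")
```

Run on real data, a table like this is only a starting point: a strong correlation flags a relationship worth investigating, not a causal mechanism.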
At Johns Hopkins University (JHU) I work within a research collective known as the Institute for Data Intensive Engineering and Science (IDIES). Our group specializes in using Big Data to solve problems in engineering and the physical and biological sciences. I attended the IDIES annual symposium on October 16, 2015 and heard presentations from researchers across a range of fields. In this article, I share some of their cutting edge research.
The United States spends a staggering $3.1 trillion in health care costs per year, or about 17% of GDP. Yet approximately 30% of that amount is wasted on unnecessary tests and diagnostic costs. Scientists are currently using Big Data to find new solutions that will maximize health returns while minimizing expense.
The costs of health care are more than just financial. They also include staff time and wait periods to process test results, often in environments where every minute matters. Dr. Daniel Robinson of JHU’s Department of Applied Mathematics & Statistics is working on processing vast quantities of hospital data through novel cost-reduction models in order to ultimately suggest a set of best practices.
On a more personal level, regular medical check-ups can be time consuming, expensive, and for some patients physically impossible. Without regular monitoring, it is difficult to detect warning signs of potentially fatal diseases. For example, Dr. Robinson has studied septic shock, a critical complication of sepsis that is the 13th leading cause of death in the United States, and the #1 cause within intensive care units. A better understanding of how symptoms like altered speech, elevated pain levels, and tiredness link to the risk of septic shock could save many lives.
Realizing this potential has two components. The first is data acquisition. New wearable devices like the Apple Watch, Fitbit, BodyGuardian, wearable textiles, and many others in development will enable real-time monitoring of a person’s vital statistics. These include heart rate, circadian rhythms, steps taken per day, energy expenditure, light exposure, vocal tone, and many more. These devices can also issue app-based surveys on a regular basis to check in on one’s condition.
Second, once scientists are able to determine which health statistics are indicative of which conditions, these monitors can suggest an appropriate course of action. This kind of individualized health care has been referred to as “precision medicine.” President Obama even promoted it in his 2015 State of the Union Address, and earned a bipartisan ovation in the process. A similar system is already working in Denmark where data culled from their electronic health network is helping predict when a person’s condition is about to worsen.
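To make the idea concrete, here is a toy sketch of what such a monitor might do: fold a few vital-sign readings into a single risk score and flag patients who exceed a threshold. Every weight, baseline, and cutoff below is invented for illustration; real clinical scores are fit to outcome data, not hand-picked.

```python
import numpy as np

def risk_score(heart_rate, resp_rate, temperature_c):
    """Combine vitals into a 0-1 score. All constants are hypothetical."""
    # z-score each vital against a nominal resting baseline
    z_hr = (heart_rate - 70) / 12
    z_rr = (resp_rate - 16) / 4
    z_t = (temperature_c - 36.8) / 0.5
    # weighted sum, squashed into (0, 1) with a logistic function
    s = 0.5 * z_hr + 0.3 * z_rr + 0.2 * z_t
    return 1 / (1 + np.exp(-s))

def needs_review(score, threshold=0.8):
    """Flag a reading for clinical follow-up."""
    return score >= threshold

print(risk_score(72, 16, 36.9))    # vitals near baseline
print(risk_score(130, 30, 39.5))   # sharply elevated vitals
```

A wearable running logic like this continuously, rather than once a year at a check-up, is what makes the “precision medicine” vision plausible.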
Dr. Jung Hee Seo (JHU – Mechanical Engineering) is using Big Data to predict when somebody is about to suffer an aneurysm. Because of the vast variety of aneurysm classifications, large data sets are critical for robust predictions. Dr. Seo intends to use his results to build an automated aneurysm hemodynamics simulation and risk data hub. Dr. Hong Kai Ji (JHU – Biostatistics) is doing similar research to predict genome-wide regulatory element activities.
The development of new materials is critical to the advancement of technology. Yet one might be surprised to learn just how little we know about our materials. For example, of the 50,000 to 70,000 known inorganic compounds, we only have elastic constants for about 200, dielectric constants for 300-400, and superconductivity properties for about 1000.
This lack of knowledge almost guarantees that there are better materials out there for numerous applications, e.g. a compound that would help batteries be less corrosive while having higher energy densities. In the past, we’ve lost years simply because we didn’t know what our materials were capable of. For example, lithium iron phosphate was first synthesized in 1977, but we only learned it was useful in cathodes in 1997. Magnesium diboride was synthesized in 1952, but was only recognized as a superconductor in 2001.
Dr. Kristin Persson (UC Berkeley) and her team have been using Big Data to solve this problem in a new way. They create quantum mechanical models of a material’s structure, then probe their properties using computationally expensive simulations on supercomputers. Their work has resulted in The Materials Project. Through an online interface, researchers now have unprecedented access to the properties of tens of thousands of materials. They are also provided open analysis tools that can inspire the design of novel materials.
Another area where Big Data is playing a large role is in climate prediction. The challenge is using a combination of data points to generate forecasts for weather data across the world. For example, by measuring properties like temperature, wind speed, and humidity across the planet as a function of time, can we predict the weather in, say, Jordan?
Answering this question can be done either by using preconstructed models of climate behavior or by using statistical regression techniques. Dr. Ben Zaitchik (JHU – Earth & Planetary Sciences) and his team have attempted to answer that question by developing a web platform that allows the user to select both climate predictors and a statistical learning method (e.g. artificial neural networks, random forests, etc.) to generate a climate forecast. The application, which is fed by a massive spatial and temporal climate database, is slated to be released to the public in December.
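A stripped-down version of the statistical approach looks like the following. The platform described above supports richer learners (neural networks, random forests); plain least-squares regression stands in for them here, and the "global predictors" and "local temperature" are synthetic series generated for the sketch.

```python
import numpy as np

# Fit a linear model mapping globally measured predictors to a local
# weather variable. All data are synthetic stand-ins.
rng = np.random.default_rng(1)
n = 365  # one synthetic year of daily observations

# hypothetical global predictors
sst = rng.normal(size=n)       # sea-surface temperature anomaly
wind = rng.normal(size=n)      # regional wind-speed anomaly
humidity = rng.normal(size=n)

# synthetic "local temperature", built from the predictors plus noise
local_temp = 1.5 * sst - 0.7 * wind + 0.2 * humidity + rng.normal(scale=0.3, size=n)

X = np.column_stack([sst, wind, humidity, np.ones(n)])  # include an intercept
coef, *_ = np.linalg.lstsq(X, local_temp, rcond=None)

prediction = X @ coef
rmse = np.sqrt(np.mean((prediction - local_temp) ** 2))
print("fitted coefficients:", np.round(coef, 2))
print("in-sample RMSE:", round(rmse, 3))
```

Swapping the least-squares fit for a random forest or neural network changes the learner, not the workflow: choose predictors, fit against historical data, and forecast from new predictor values.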
Because local climate is driven by global factors, simulations at high resolution with numerous climate properties for both oceans and atmospheres can be absolutely gigantic. These are especially important since the cost of anchoring sensors to collect real ocean data can exceed tens of thousands of dollars per location.
Housing vacancy lies at the heart of Baltimore City’s problems. JHU assistant professor Tamas Budavári (Applied Mathematics & Statistics) has teamed up with the city to better understand the causes of the vacancy phenomenon. By utilizing over a hundred publicly available datasets, they have developed an amazing system of “blacklight maps” that allow users to visually inspect all aspects of the problem. By incorporating information like water, gas, and electricity consumption, postal records, parking violations, crime reports, and cell phone usage (are calls being made at 2pm or 2am?) we can begin to learn which factors correlate with vacancy, then take cost effective actions to alleviate the problem.
As Big Data proliferates, the potential for collaborative science increases in extraordinary ways. To this end, agencies like the National Institutes of Health (NIH) are pushing for data to become just as large a part of the citation network as journal articles. Their new initiative, Big Data to Knowledge (BD2K), is designed to enable biomedical research to be treated as a data-intensive digital research enterprise. If data from different research teams can be integrated, indexed, and standardized, it offers the opportunity for the entire research enterprise to become more efficient and less expensive, ultimately creating opportunities for more scientists to launch research initiatives.
My personal research uses Big Data to solve a problem caused by Big Data. In a world in which researchers have more data at their fingertips than ever before, the uncertainty caused by small sample sizes has decreased. As this so-called statistical noise drops, the dominant source of error is systematic noise. Like a scale that is improperly calibrated, systematic noise inhibits scientists from obtaining results that are both precise and accurate, regardless of how many measurements are taken.
In my dissertation, I developed a method to minimize noise in large data sets provided we have some knowledge about the distributions from which the signal and noise were drawn. By understanding the signal and noise correlations between different points in space, we can draw statistical conclusions about the most likely value of the signal given the data. The more correlations (i.e. points) that are used, the better our answer will be. However, large numbers of points require powerful computational resources. To get my answers, I needed to parallelize my operations over multiple processors in an environment with massive amounts (e.g. ~ 1TB) of memory.
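The estimation idea in that passage can be illustrated with a small Wiener-filter example: given covariance models for signal and noise, the minimum-variance linear estimate of the signal from the data is S(S + N)⁻¹d. The covariances and data below are synthetic stand-ins, and at 200 points this runs instantly; the terabyte-scale memory requirement only appears when the number of correlated points grows into the millions.

```python
import numpy as np

rng = np.random.default_rng(2)
npts = 200

# signal with smooth spatial correlations (squared-exponential covariance)
x = np.linspace(0, 10, npts)
S = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2) + 1e-8 * np.eye(npts)
N = 0.5 * np.eye(npts)  # uncorrelated (white) noise covariance

# draw one realization of signal and noisy data
signal = rng.multivariate_normal(np.zeros(npts), S)
data = signal + rng.multivariate_normal(np.zeros(npts), N)

# Wiener filter: estimate = S (S + N)^{-1} d
estimate = S @ np.linalg.solve(S + N, data)

raw_err = np.mean((data - signal) ** 2)
filt_err = np.mean((estimate - signal) ** 2)
print("mean squared error before filtering:", round(raw_err, 3))
print("mean squared error after filtering: ", round(filt_err, 3))
```

The filter exploits exactly the property described above: because nearby points are correlated, each point’s neighbors carry information about its true value, and using more correlated points improves the estimate at the cost of a larger matrix solve.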
Fortunately, our ability to process Big Data has recently taken a big step forward. Thanks to a $30 million grant from the state of Maryland, a new system called the Maryland Advanced Research Computing Center (MARCC) has just come online. This joint venture between JHU and the University of Maryland at College Park has created a collaborative research center that allows users to remotely access over 19,000 processors, 50 1TB RAM nodes with 48 cores, and 17 petabytes of storage capacity. By hosting the system under one roof, users share savings in facility costs and management, and work within a standardized environment. Turnaround time for researchers accustomed to smaller clusters will be drastically reduced. Scientists also have the option of colocating their own computing systems within the facility to reduce network transmission costs.
The era of Big Data in science, which started with the Sloan Digital Sky Survey in 2000, is now in full force. These are exciting times, and I cannot wait to see the fruits this new paradigm will bear for all of us.
The principle of American democracy is rooted in the “marketplace of ideas,” the notion that public policies are best developed through the honest and open deliberation of a wide variety of ideas. But the marketplace has become strained of late. Our national challenges have grown more complex, and the voices opining on them more numerous. From health care to energy policy to net neutrality, resolving modern problems requires more than an application of philosophy – it demands scientific literacy and an understanding of our national scientific apparatus.
Unfortunately, instead of facilitating discourse, there are many who are content to muddy the waters. One of the worst offenders is conservative radio talk show host Rush Limbaugh. During the June 22, 2011 edition of The Rush Limbaugh Show, he spoke once again on one of his “pet peeve issues,” climate change. Limbaugh, who has long rejected the consensus scientific conclusion that Earth’s climate is changing and that human beings are responsible, was offering a new explanation for climate scientists’ behavior.
“They’ve been paid,” Limbaugh argued. “Their entire lifestyles, their standard of living depends on their grants that they get to conduct the studies, and they only get the money if they come up with the right result.”
One might be willing to dismiss such an inflammatory statement as isolated bloviation from one of the media’s biggest loudmouths, if only it were an isolated incident. It is far from that. Similar statements have been made by authors, pundits, politicians, and even a handful of disgruntled scientists. In a speech to New Hampshire businessmen, former Texas governor and Republican presidential candidate Rick Perry echoed Limbaugh’s remarks, referencing “a substantial number of scientists who have manipulated data so that they will have dollars rolling in to their projects.”
Statements such as these are not only slanderous, they are dangerous. Climate change is one of the greatest global challenges of our generation. It promises to deliver a warmer climate, droughts, floods, food and water scarcity, rising sea levels, and the death of 25-50% of Earth’s species (just to name a few) if not properly mitigated.
It is for these reasons that the profoundly misleading assaults on scientists’ basic integrity are so worrisome. The need to restore public faith in our scientific institutions warrants a substantive clarification about both the roles scientists play in society and the actual manner in which their research is funded.
In general, there are two classes of scientist – public and private. Public climate scientists are employed by government institutions like NASA and the National Oceanic and Atmospheric Administration (NOAA). NASA’s premier climatologist, Dr. James Hansen, explains how public scientists are compensated, saying, “Our salaries do not depend on how much research the government funds. Government scientists get paid for working 40 hours a week, regardless of how long they work.”
Furthermore, to guard against politically motivated terminations, public scientists receive considerable protection from being fired. In such an environment, scientists have little to fear from publishing results that cut against the grain, since neither their compensation nor their job security depends on the outcome.
Private climate scientists, on the other hand, are often employed by universities and must actively seek their own research funding. One common source is America’s collection of federal science agencies. There are many, but one of the most prominent is the National Science Foundation (NSF), an agency which supports about 20% of all federally funded basic research conducted in US universities. Its funding process is typical of agencies of this kind, so it is worth examining its grant process in greater detail.
Scientists apply for research grants by first submitting a research proposal. According to NSF criteria, successful proposals must demonstrate that the prospective research is of high academic quality, has high and hopefully broad significance, and is preferably transformative. Proposals are merit-reviewed by a panel of independent experts in the field, and the top submissions receive grants to continue their work. This process is highly competitive: of the approximately 45,000 proposals received each year, the NSF funds only about 11,500 – roughly one in four.
One noteworthy observation is that a plausible alternative to the theory of human-driven climate change would satisfy all of these criteria. According to the National Academy of Sciences, between 97% and 98% of climate scientists actively publishing in the field currently agree with the conclusion that global climate change is occurring and is caused by human activity. Clearly, a plausible alternative would constitute a great scientific advancement, one which would likely have ramifications beyond climate science itself. So not only are “climate skeptics” not penalized in the grant process, but if their proposals demonstrate legitimate scientific merit, they might actually receive preferential treatment.
There are other factors that weigh in a climate skeptic’s favor. First, any scientist who can overturn a scientific paradigm in favor of a better theory (as Einstein did to Newtonian gravity with his general theory of relativity) will earn prestige and a likely place in science textbooks. This is a huge incentive to challenge the status quo. Second, if a professor has tenure, then he needn’t fear reprisal from his employer for conducting controversial research. Third, because review panels are composed of a broad selection of experts, one can expect a representative plurality of opinions among reviewers, which mitigates consensus groupthink. Fourth, scientists are skeptical by nature. They assume their knowledge is incomplete and are always working to refine it. Scientists will tell you that one of the most exciting events for them is when an experimental result completely defies theoretical expectation. It is in these moments that new truths are often revealed. Scientists yearn for these moments. They do not penalize the search for them.
The final point I’ll make about the public grant process is simple common sense. It is functionally impossible for reviewers to fund only “pro-climate change” research when the results of that research are unknown until it is conducted. And even if you suspect incoming research proposals must tacitly accept anthropogenic climate change a priori, meta-publication data gathered by Skeptical Science, an organization dedicated to explaining peer-reviewed climate change research, reveals that approximately half of climate research papers do not explicitly endorse the consensus opinion, but rather function primarily as fact-finding missions. Those missions in total have created the consensus opinion, but scientists did not have to assume it before receiving their funding.
The other method by which private scientists obtain research support is by courting private donors and corporations with a vested interest in the results. For much basic research, this process of pitching for funds is a huge hassle. As the Microsoft computer scientist and Turing Award winner Jim Gray once put it, “Sometimes you have to kiss a lot of frogs before one turns into a prince.”
Except in certain cases the prince comes to you. Mitigating climate change requires a reorganization of large sectors of our economy. Consequently, corporations that stand to suffer financially in the transition have a strong incentive to spread disinformation themselves or fund others willing to do so.
In such cases, the exact opposite of Limbaugh’s argument holds true. Scientists willing to research alternatives to anthropogenic climate change often receive funding precisely because they reject the consensus opinion. In fact, an analysis of a list of 900 papers promoted by the Global Warming Policy Foundation as supporting climate change skepticism found that nine of the ten most prolific authors had links to ExxonMobil.
As Dr. Hansen argues, “Perhaps, instead of questioning the motives of scientists, you should turn around and check the interests (motives) of the people who have pushed you to become so agitated.”
Once the public understands the true manner in which climate science is funded, it will ultimately need to ask itself which is more likely – that A) 97% of all active climate scientists have independently come together to collectively pull the wool over the world’s eyes and perpetrate the greatest scientific hoax of all time for unclear motives or B) moneyed interests like oil and coal companies who stand to lose profit in a world that addresses climate change are spreading doubt and disinformation as a means to forestall action.
Given the current state of media in the United States, the condition in which we find ourselves is not altogether surprising. Thinner margins have driven many newspapers and other news outlets to lay off dedicated science reporters. In the era of the 24-hour news cycle, ratings reign supreme, and viewers are more likely to tune into conflict and controversy than a nuanced discussion of the facts. Even when climate science is given the coverage it deserves, the media will often mistake journalistic balance for “hearing all sides of an issue.” Granting climate skeptics equal air time with members of the 97% majority is akin to presenting the opinions of an Auschwitz survivor alongside someone who argues the Holocaust never happened.
Ultimately, it will fall upon scientists to lift the haze of misunderstanding that surrounds their work. They will need to be more vocal in communicating not just the science, but the process of practicing science. Only when the public gains an understanding of the scientific process will the baseless claims of Limbaugh and his sympathizers be exposed as the myths they are.
This is the final part of a five-part series on the government’s silencing of science and the Freedom of Information Act (FOIA). Parts 1 through 4 can and should be read first:
In brief, these articles describe how scientific research gathered by the United States government is often withheld from the general public, a type of action that can quite literally put lives at risk. The Freedom of Information Act (FOIA) was passed to allow public access to these records, but both the George W. Bush and Obama administrations have so far failed to live up to the promise of the act.
But while there have been substantial challenges with gaining access to important public information, it’s not all doom and gloom. The fact that we actually have a Freedom of Information Act with an appeals process and judicial review is significant. The Act continues to have strong support in the NGO community. A FOIAonline portal has been built with the goal of eventually becoming a one-stop shop for public information. The Obama administration has taken a strong positive step at Data.gov to “increase public access to high value, machine readable datasets generated by the Executive Branch of the Federal Government.” This initiative has already saved on infrastructure costs.
And we have had disclosure successes. In 2008 the United States improved the Consumer Product Safety Act and created a searchable database for consumer information. The National Oceanic and Atmospheric Administration’s (NOAA) National Climatic Data Center and EPA have done an admirable job of reporting on historical climate variables like temperature, precipitation and drought. The US Embassy in Beijing has made electronic reports of air quality public when the Chinese government refused to do so. The federal ENERGY STAR program labels the energy footprint of appliances to aid consumers in making more energy efficient purchases.
Inside federal agencies, it would appear that some progress is being made. In 2013 the Union of Concerned Scientists (UCS) released a report entitled Grading Government Transparency, in which they examined the ability of scientists at federal agencies to speak freely about their work. They found that many agencies’ media policies “have shown significant improvement since 2008.” In particular, they note that scientists can now more easily exercise their right to express personal views, provided they make clear that they are not speaking for their agency.
This right was made considerably easier to exercise when on November 13, 2012, after an arduous 14-year journey, Congress unanimously passed the Whistleblower Protection Enhancement Act. This act, for the first time, provides specific legal protection to scientists and other federal employees who expose censorship or suppression of federal research. According to UCS’s Celia Wexler, “We hope that this law will begin a process to change the culture of federal agencies when it comes to whistleblowers. People who protect the public from unsafe drugs, tainted food, defective products, and environmental hazards should not fear for their jobs when they speak up for safety and scientific integrity.”
Since then, other steps have been taken to make it easier for the public to obtain government information. On May 9, 2013 President Obama issued an executive order making open and machine readable data the new default for government information. Citing examples like weather data and the Global Positioning System (GPS), the president argued that making federal data freely available “can help fuel entrepreneurship, innovation, and scientific discovery – all of which improve Americans’ lives.”
Then, on February 25, 2014 the US House of Representatives unanimously passed the FOIA Oversight and Implementation Act. This amendment to the Freedom of Information Act would create a single, free website from which all FOIA requests could be made. When requests are granted, federal agencies would have to release the information in an electronic and publicly accessible format. When requests are denied, the appeals process would be streamlined. The amendment also forces federal agencies to take greater responsibility for their FOIA obligations.
As we see, the system can work. But there will always be disagreements between the public and federal agencies regarding which information should be disclosed through FOIA and which should be withheld for security reasons. When public actors feel their claims have been rejected unjustly, they can always consider seeking subpoenas.
Absent that, there are other options at their disposal to extract greater value out of the information that is public. Private technology companies can offer tools for the sharing and analysis of data. Librarians can play a more prominent role in gathering and organizing documents.
When the information being disseminated is incorrect, knowledgeable scientists should take action. They can start issue blogs and connect with members of the media. Local groups like city councils rarely hear from scientists, so researchers can have an outsized impact on regional issues. As members of one of the most respected professions, scientists would do well to build relationships with congressional representatives or their science staffers. Failure to act means allowing dissembling voices to fill the vacuum.
With respect to government disclosure, as with most things, the situation is neither entirely good nor bad. But it is hard to deny that at times we Americans live in a perverse, ironic ecosystem – one in which taxpayers fund government research designed to inform and protect, only to have that same government deny us the results and claim it’s for our protection. We must continue to hold our government accountable, push for transparency where appropriate and never yield to private interests who would use our ignorance against us.
I encourage everyone to check it out by clicking here. According to the Baltimore Sun’s publishing rules, they maintain 30-day exclusivity rights over the piece, so I won’t be able to post it on my website until mid-January.
This opinion piece, which was months in the making, highlights the extreme lack of scientific expertise in the halls of Congress. It gives examples, lists negative impacts and demonstrates how having at least some scientists in charge would be beneficial for the entire country. It will come as no surprise to anyone who knows me that climate change takes center stage in my argument.
Now here’s a little inside baseball. I’ve actually been pretty successful in getting my opinion pieces published in newspapers. I had my two previous letters to the editor (one on offshore drilling and the other on Keystone XL) published in the Baltimore Sun and one of those also in the NJ Star Ledger. Basically I was batting a thousand until this article.
So then a few weeks ago I submit a version through the Baltimore Sun website and hear nothing for like two weeks. Of course I’m thinking that they chose not to run it. So one morning I just decide to rewrite the entire thing. I kept certain phrases, but it was a total reorganization and shifting of the thesis. After bouncing it off some people, I resubmitted.
About three days later I get an email from the Sun’s deputy editorial page editor. She thought my first submission was interesting and well put together, but my contact info was cut off and she couldn’t respond! When I sent the second article, she recognized it as a variation of the first, got my phone number and email and we went ahead. In the end, the version you see here is a marriage of those two drafts.
I plan to extend this into a longer form article in the near future. After all, there’s a lot more than 750 words to say about this topic. I’m still trying to figure out where I can get it published (Science? Scientific American? American Physical Society News?). I’ve never done this before, so I welcome any advice.