Category Archives: Climate

Climate, Energy, Research

Supercharge Your Internet Research with These Essential Tips

Banner Image - Working at Laptop

Several years ago I found myself in a room with people at the forefront of the climate movement. Among their ranks were journalists, advocates, and members of nonprofit organizations. These science communicators had gathered to address an issue each of them had been grappling with – how do I find all of the information I need and communicate it to the people who need to hear it?

The questions seemed so fundamental that I had assumed everyone in attendance already knew the answers. I didn’t, of course, because I was the outsider. As an astrophysicist, research for me is relatively straightforward. There is a limited set of journals that cover our field and a convenient web interface, NASA’s Astrophysics Data System (ADS), for searching across their articles. (Friends in other fields have sung the praises of similar programs like EndNote and Mendeley.) The system not only links users to all references in an article’s bibliography, but also reports which papers ended up citing that article. Smart engines can even recommend other papers to read based on your selections.

I have found tracking down information online in the realm of climate/energy policy to be more difficult. There are many more organizations doing independent research or running their own initiatives. Think tanks, NGOs, and government agencies are more likely to publish and promote on their own websites than through peer-reviewed journals. The impacts of climate change are so vast that they cut across traditional academic disciplines. They influence weather, oceans, atmospheres, ecosystems, human health, urban development, energy systems, breakthrough technologies, and many more.

When information is so widely dispersed, and we lack smart engines to find it automatically for us, what should our information collection strategy be? I don’t profess to have the “right answer” to this problem, should one even exist. But I’ve spent enough time gathering suggestions from others and trying them out for myself that I felt compelled to report some of the strategies and sources that have worked for me.

Before I begin, I want to comment that you can’t put everything together overnight. I’ve found that so much of the process is just keeping your ear to the ground. When an article I’m reading references an organization with which I’m unfamiliar, I jot it down. I visit their website, make a note about their mission and, if they have them, subscribe to their newsletter and Twitter feeds. I use Twitter lists to tag the feeds and keep them organized.

A great first source for content is Google, which offers among the best suite of tools for aggregating real-time news. Through Google News, you can personalize your news feed to return only the topics and regions you are interested in. The service allows you to specify whether you want content rarely, occasionally, sometimes, often, or always. Google Alerts goes a step further and contacts you when new information becomes available. Many news outlets offer the same capability.
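
If you like to tinker, much of this aggregation can be automated. Here’s a minimal sketch that pulls recent headlines on a topic through Google News’s RSS search feed (the URL pattern is my assumption and may change) using the open-source feedparser library:

```python
# Minimal topic tracker. The Google News RSS URL pattern is an assumption
# and may change; feedparser is a third-party library (pip install feedparser).
import feedparser

topic = "climate policy"
url = "https://news.google.com/rss/search?q=" + topic.replace(" ", "+")
feed = feedparser.parse(url)

# Print the five most recent headlines with links
for entry in feed.entries[:5]:
    print(entry.title, "-", entry.link)
```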

If you are having difficulty deciding what’s important in the moment, the very cool newsmap may be the tool for you. Powered by Google’s search engine, newsmap visualizes the news by separating it into color-coded categories like World, National, Business, Technology, Sports, Entertainment, and Health. The color saturation reflects how old the story is, while the size shows how much it is being reported online. As with Google, you can filter by country and news source. It’s a handy way to ascertain what’s hot right now.

Over time, or perhaps through a mentor, you may discover that your field has its own news/reference engines. Lawyers gather their research through the library database LexisNexis. Climate and energy folks have the Global Reference on the Environment, Energy, and Natural Resources (GREENR). Environment & Energy Publishing reports all the top developments. The news and analysis website Responding to Climate Change (RTCC) provides the latest news regarding low carbon developments.

Another great way to be exposed to new content is through Flipboard (and Zite, which it recently acquired). After signing up, Flipboard presents you with an absurd number of topics to choose from. They range from the conventional (e.g. religion, technology, art) to the more specific (e.g. industrial design, startups, social justice). You select the topics that interest you and Flipboard scours the web to produce a curated magazine readable on most devices. You can also stumble upon new content using, well, StumbleUpon. It has the same idea, but rather than curating material, it randomly deposits you at relevant webpages until you press a button to “stumble” to the next one. I have found a lot of really excellent content through this service.

Because the combined readership of an article or report is likely to possess more cumulative knowledge than the authors themselves, one should never discount the value of user comments. Sites like the New York Times and Ars Technica have great comment engines where user contributions can be elevated to “reader’s picks” or “editor’s picks”. It’s a great way to sample the wisdom of the masses and be exposed to a much broader perspective.

It literally took me years to assemble the repository of references I now possess. In the world of climate and energy policy, I found that information typically arrives in one of three forms – organizational reports, raw or lightly processed data, and independent projects.

Organizational reports are usually published by issue-focused research groups. For climate and energy, there are way more than I could name here. These include the National Academy of Sciences, the United States Global Change Research Program, the Union of Concerned Scientists, the Information Technology and Innovation Foundation, Brookings, Energy Innovation, and many more.

Two of my personal favorites are the George Mason University Center for Climate Change Communication (4C) and the Yale Project on Climate Change Communication.  These academic centers were created to conduct unbiased social science research on how people engage with climate change. They discovered that people are more concerned about “global warming” than “climate change.”  They reported what weathercasters think about climate change and its impact on weather, and questioned whether the level of scientific consensus on climate change ought to be communicated numerically or non-numerically.

The second form information arrives in is raw or processed datasets. Government agencies like NASA and the National Oceanic and Atmospheric Administration (NOAA) are great resources here, as they have tons of images, datasets, and visualization tools that let you tell your own story from primary sources. The U.S. Energy Information Administration (EIA) and International Energy Agency (IEA) also offer tons of data to play around with.

Some groups are content to curate data in very specific ways. The Database of State Incentives for Renewables and Efficiency (DSIRE) categorizes state policies that promote renewable energy as either financial incentives or rules and regulations. Frack Track provides a self-described “geospatial policy tool” that analyzes and visualizes Pennsylvania’s new wave of gas development on the Marcellus shale. Wells, permitted sites, and locations of violations are provided on a map.

The third form is independent projects, a term that I’m admittedly using as something of a catchall. These include initiatives that aim to tell the story of climate change in unique ways. For example, for their project Atlantic Rising, three friends set out to travel the 1-meter-above-sea-level contour line to see what life would be like in a flooded world. They interacted with thousands of people in 22 countries, gathering photos, film, and writings as they documented the changing lives of those along the rim.

Photographer John Weller believes the best way to protect the environment is by reminding people of nature’s visceral beauty. He spent a decade traveling to the rough waters of the Ross Sea, probably the last undamaged ocean ecosystem left on Earth. His stunning photographs of the region’s living creatures, both above and below the water, have been cataloged in the book The Last Ocean.

Finally, it is sometimes most useful to just speak to people personally. While conferences can be a great place to do this, these environments can be intimidating for newcomers to a field. There are some tricks you can employ to make this process go more smoothly, but I will reserve them for a future post.

Of course, simply having information is not enough. You must synthesize and deliver it to your audience in an effective way. This raises a whole new set of challenges that I will get into in my next post.

 

Featured image: “tech worker” by Wrote, used under CC BY-NC 2.0 / bottom of image has been cropped from original

Astrophysics, Climate

How Big Data is Transforming Science

Banner Image - Big Data

In the last 15 years, science has experienced a revolution. The emergence of sophisticated sensor networks, digital imagery, Internet search and social media posts, and the fact that pretty much everyone is walking around with a smartphone in their pocket has enabled data collection on unprecedented scales. New supercomputers with petabytes of storage, gigabytes of memory, tens of thousands of processors, and the ability to transfer data over high speed networks permit scientists to understand that data like never before.

Research conducted under this new Big Data paradigm (aka eScience) falls into two categories – simulation and correlation. In simulations, scientists assume a model for how a system operates. By perturbing the model’s parameters and initial conditions, it becomes possible to predict outcomes under a variety of conditions. This technique has been used to study climate models, turbulent flows, nuclear science, and much more.
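
To make the simulation idea concrete, here’s a toy sketch (not any real climate code): a zero-dimensional energy-balance model whose parameters we perturb thousands of times to see how the predicted equilibrium temperature spreads out. The model and parameter ranges below are illustrative only.

```python
# Toy "simulation" study: perturb the parameters of a zero-dimensional
# energy-balance model and look at the spread of predicted temperatures.
import numpy as np

rng = np.random.default_rng(42)

def equilibrium_temperature(solar_constant, albedo, emissivity):
    """Equilibrium surface temperature (K) of a simple energy-balance model."""
    sigma = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
    absorbed = solar_constant * (1.0 - albedo) / 4.0
    return (absorbed / (emissivity * sigma)) ** 0.25

n_runs = 10_000
S = rng.normal(1361.0, 1.0, n_runs)          # solar constant, W/m^2
albedo = rng.normal(0.30, 0.01, n_runs)      # planetary albedo
emissivity = rng.normal(0.61, 0.02, n_runs)  # effective emissivity

temps = equilibrium_temperature(S, albedo, emissivity)
print(f"mean = {temps.mean():.1f} K, spread = {temps.std():.2f} K")
```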

The second approach – correlation – involves gathering massive amounts of real data from a system, then studying it to discover hidden relationships (i.e. correlations) between measured values. One example would be studying which combination of factors like drought, temperature, per capita GDP, cell phone usage, local violence, food prices, and more affect the migratory behavior of human populations.
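
And here’s the correlation side in miniature. The dataset below is entirely synthetic and the column names are invented, but it shows the basic move: assemble candidate factors into one table, then rank them by their association with the outcome.

```python
# Toy "correlation" study on a synthetic table; columns are invented.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "drought_index": rng.normal(0, 1, n),
    "food_price_index": rng.normal(100, 15, n),
    "per_capita_gdp": rng.lognormal(8, 0.5, n),
})
# Fabricate an outcome that depends on drought and food prices
df["migration_rate"] = (
    0.6 * df["drought_index"]
    + 0.02 * (df["food_price_index"] - 100)
    + rng.normal(0, 0.5, n)
)

# Rank the factors by strength of linear association with the outcome
print(df.corr()["migration_rate"].sort_values(ascending=False))
```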

At Johns Hopkins University (JHU) I work within a research collective known as the Institute for Data Intensive Engineering and Science (IDIES).  Our group specializes in using Big Data to solve problems in engineering and the physical and biological sciences. I attended the IDIES annual symposium on October 16, 2015 and heard presentations from researchers across a range of fields. In this article, I share some of their cutting-edge research.

 

HEALTH

The United States spends a staggering $3.1 trillion in health care costs per year, or about 17% of GDP. Yet approximately 30% of that amount is wasted on unnecessary tests and diagnostic costs. Scientists are currently using Big Data to find new solutions that will maximize health returns while minimizing expense.

The costs of health care are more than just financial. They also include staff time and wait periods to process test results, often in environments where every minute matters. Dr. Daniel Robinson of JHU’s Department of Applied Mathematics & Statistics is working on processing vast quantities of hospital data through novel cost-reduction models in order to ultimately suggest a set of best practices.

On a more personal level, regular medical check-ups can be time consuming, expensive, and for some patients physically impossible. Without regular monitoring, it is difficult to detect warning signs of potentially fatal diseases. For example, Dr. Robinson has studied septic shock, a critical complication of sepsis that is the 13th leading cause of death in the United States, and the #1 cause within intensive care units. A better understanding of how symptoms like altered speech, elevated pain levels, and tiredness link to the risk of septic shock could save many lives.

Realizing this potential has two components. The first is data acquisition. New wearable devices like the Apple Watch, Fitbit, BodyGuardian, wearable textiles, and many others in development will enable real-time monitoring of a person’s vital statistics. These include heart rate, circadian rhythms, steps taken per day, energy expenditure, light exposure, vocal tone, and many more. These devices can also issue app-based surveys on a regular basis to check in on one’s condition.

Second, once scientists are able to determine which health statistics are indicative of which conditions, these monitors can suggest an appropriate course of action. This kind of individualized health care has been referred to as “precision medicine.” President Obama even promoted it in his 2015 State of the Union Address, and earned a bipartisan ovation in the process. A similar system is already working in Denmark where data culled from their electronic health network is helping predict when a person’s condition is about to worsen.
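
As a toy illustration of that second step, here’s how one might link monitored vitals to a risk label with a simple classifier. Everything below is synthetic and invented for illustration; real clinical models demand far more rigor and validation.

```python
# Toy classifier linking monitored vitals to a synthetic "at risk" label.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 2000
heart_rate = rng.normal(75, 12, n)      # beats per minute
resp_rate = rng.normal(16, 3, n)        # breaths per minute
temperature = rng.normal(37.0, 0.6, n)  # degrees Celsius

# Synthetic label loosely tied to elevated vitals
risk = (0.08 * (heart_rate - 75) + 0.3 * (resp_rate - 16)
        + 1.5 * (temperature - 37.0))
y = (risk + rng.normal(0, 1, n) > 1.0).astype(int)
X = np.column_stack([heart_rate, resp_rate, temperature])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```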

Dr. Jung Hee Seo (JHU – Mechanical Engineering) is using Big Data to predict when somebody is about to suffer an aneurysm. Because of the vast variety of aneurysm classifications, large data sets are critical for robust predictions. Dr. Seo intends to use his results to build an automated aneurysm hemodynamics simulation and risk data hub. Dr. Hong Kai Ji (JHU – Biostatistics) is doing similar research to predict genome-wide regulatory element activities.

 

MATERIALS SCIENCE

The development of new materials is critical to the advancement of technology. Yet one might be surprised to learn just how little we know about our materials. For example, of the 50,000 to 70,000 known inorganic compounds, we only have elastic constants for about 200, dielectric constants for 300-400, and superconductivity properties for about 1,000.

This lack of knowledge almost guarantees that there are better materials out there for numerous applications, e.g. a compound that would help batteries be less corrosive while having higher energy densities. In the past, we’ve lost years simply because we didn’t know what our materials were capable of. For example, lithium iron phosphate was first synthesized in 1977, but we only learned it was useful in cathodes in 1997. Magnesium diboride was synthesized in 1952, but was only recognized as a superconductor in 2001.

Dr. Kristin Persson (UC Berkeley) and her team have been using Big Data to solve this problem in a new way. They create quantum mechanical models of a material’s structure, then probe their properties using computationally expensive simulations on supercomputers. Their work has resulted in The Materials Project.  Through an online interface, researchers now have unprecedented access to the properties of tens of thousands of materials. They are also provided open analysis tools that can inspire the design of novel materials.
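
The spirit of a resource like this is materials screening: query a table of computed properties for candidates that meet a target profile. Here’s a toy sketch; the rows and thresholds are placeholders, not actual Materials Project data or its real API.

```python
# Toy materials screen over a hand-made property table (placeholder values).
import pandas as pd

materials = pd.DataFrame([
    {"formula": "LiFePO4", "band_gap_eV": 3.7, "density_g_cm3": 3.6},
    {"formula": "MgB2",    "band_gap_eV": 0.0, "density_g_cm3": 2.6},
    {"formula": "SiO2",    "band_gap_eV": 9.0, "density_g_cm3": 2.2},
])

# Screen for wide-gap, lightweight candidates
candidates = materials[(materials["band_gap_eV"] > 3.0)
                       & (materials["density_g_cm3"] < 4.0)]
print(candidates)
```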

 

CLIMATE

Another area where Big Data is playing a large role is in climate prediction. The challenge is using a combination of data points to generate forecasts for weather data across the world. For example, by measuring properties like temperature, wind speed, and humidity across the planet as a function of time, can we predict the weather in, say, Jordan?

Answering this question can be done either by using preconstructed models of climate behavior or by using statistical regression techniques. Dr. Ben Zaitchik (JHU – Earth & Planetary Sciences) and his team have attempted to answer that question by developing a web platform that allows the user to select both climate predictors and a statistical learning method (e.g. artificial neural networks, random forests, etc.) to generate a climate forecast. The application, which is fed by a massive spatial and temporal climate database, is slated to be released to the public in December.
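
For flavor, here’s what the statistical-learning route can look like in miniature. The data is synthetic and the features are my own guesses, not those of Dr. Zaitchik’s platform.

```python
# Toy statistical-learning forecast: map (synthetic) predictors to a local
# climate variable with a random forest. Features and data are invented.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(7)
n = 1000
X = np.column_stack([
    rng.normal(288, 5, n),   # temperature (K)
    rng.normal(5, 2, n),     # wind speed (m/s)
    rng.uniform(0, 1, n),    # relative humidity
])
# Synthetic target: "precipitation" driven by humidity and warmth
y = 10 * X[:, 2] + 0.2 * (X[:, 0] - 288) + rng.normal(0, 1, n)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X[:800], y[:800])
print("held-out R^2:", round(model.score(X[800:], y[800:]), 2))
```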

Because local climate is driven by global factors, simulations at high resolution with numerous climate properties for both oceans and atmospheres can be absolutely gigantic. These are especially important since the cost of anchoring sensors to collect real ocean data can exceed tens of thousands of dollars per location.

 

URBAN HOUSING

Housing vacancy lies at the heart of Baltimore City’s problems. JHU assistant professor Tamas Budavári (Applied Mathematics & Statistics) has teamed up with the city to better understand the causes of the vacancy phenomenon. By utilizing over a hundred publicly available datasets, they have developed an amazing system of “blacklight maps” that allow users to visually inspect all aspects of the problem. By incorporating information like water, gas, and electricity consumption, postal records, parking violations, crime reports, and cell phone usage (are calls being made at 2pm or 2am?) we can begin to learn which factors correlate with vacancy, then take cost effective actions to alleviate the problem.
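
The data-fusion step at the heart of such a system can be sketched simply: join per-parcel indicators from separate datasets and see which track vacancy. All names and values below are invented.

```python
# Toy version of the data-fusion step behind the "blacklight maps".
import pandas as pd

water = pd.DataFrame({"parcel_id": [1, 2, 3, 4],
                      "gallons_per_month": [3200, 40, 2900, 15]})
postal = pd.DataFrame({"parcel_id": [1, 2, 3, 4],
                       "mail_returned": [0, 1, 0, 1]})
vacancy = pd.DataFrame({"parcel_id": [1, 2, 3, 4],
                        "is_vacant": [0, 1, 0, 1]})

merged = water.merge(postal, on="parcel_id").merge(vacancy, on="parcel_id")
# Near-zero water use plus returned mail is a strong vacancy signal
print(merged.groupby("is_vacant")[["gallons_per_month", "mail_returned"]].mean())
```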

 

WHAT’S NEXT?

As Big Data proliferates, the potential for collaborative science increases in extraordinary ways. To this end, agencies like the National Institutes of Health (NIH) are pushing for data to become just as large a part of the citation network as journal articles. Their new initiative, Big Data to Knowledge (BD2K), is designed to enable biomedical research to be treated as a data-intensive digital research enterprise.  If data from different research teams can be integrated, indexed, and standardized, it offers the opportunity for the entire research enterprise to become more efficient and less expensive, ultimately creating opportunities for more scientists to launch research initiatives.

My personal research uses Big Data to solve a problem caused by Big Data. In a world in which researchers have more data at their fingertips than ever before, the uncertainty caused by small sample sizes has decreased.  As this so-called statistical noise drops, the dominant source of error becomes systematic noise. Like a scale that is improperly calibrated, systematic noise inhibits scientists from obtaining results that are both precise and accurate, regardless of how many measurements are taken.

In my dissertation, I developed a method to minimize noise in large data sets provided we have some knowledge about the distributions from which the signal and noise were drawn. By understanding the signal and noise correlations between different points in space, we can draw statistical conclusions about the most likely value of the signal given the data. The more correlations (i.e. points) that are used, the better our answer will be. However, large numbers of points require powerful computational resources. To get my answers, I needed to parallelize my operations over multiple processors in an environment with massive amounts (e.g. ~ 1TB) of memory.
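
For the statistically inclined, the estimator I’m describing is in the spirit of a Wiener filter: given assumed signal and noise covariance matrices S and N, the most probable signal given data d (for Gaussian fields) is S(S + N)⁻¹d. Here’s a toy version; my actual pipeline differs in scale and detail.

```python
# Toy covariance-weighted signal estimate (Wiener-filter style).
import numpy as np

rng = np.random.default_rng(3)
npts = 200
x = np.arange(npts)

# Assumed covariances: a smooth, spatially correlated signal; white noise
S = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 10.0) ** 2) + 1e-8 * np.eye(npts)
N = 0.5 * np.eye(npts)

# Draw one realization of signal plus noise
signal = rng.multivariate_normal(np.zeros(npts), S)
data = signal + rng.multivariate_normal(np.zeros(npts), N)

# Most probable signal given the data: s_hat = S (S + N)^{-1} d
s_hat = S @ np.linalg.solve(S + N, data)
print("rms error before:", np.std(data - signal).round(3),
      "after:", np.std(s_hat - signal).round(3))
```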

Fortunately, our ability to process Big Data has recently taken a big step forward. Thanks to a $30 million grant from the state of Maryland, a new system called the Maryland Advanced Research Computing Center (MARCC) has just come online. This joint venture between JHU and the University of Maryland at College Park has created a collaborative research center that allows users to remotely access over 19,000 processors, fifty 1 TB RAM nodes with 48 cores each, and 17 petabytes of storage capacity. By hosting the system under one roof, users share savings in facility costs and management, and work within a standardized environment. Turnaround time for researchers accustomed to smaller clusters will be drastically reduced. Scientists also have the option of colocating their own computing systems within the facility to reduce network transmission costs.

The era of Big Data in science, which started with the Sloan Digital Sky Survey in 2000, is now in full force. These are exciting times, and I cannot wait to see the fruits this new paradigm will bear for all of us.

 

Featured image: “server rack zoomed in” by CWCS Managed Hosting, used under CC BY 2.0 / image of server has been slightly windowed, “big data” words added

Climate, Politics

Debunking the Notion That Climate Scientists Are Just in it for the Money

Banner Image - Money on Earth

The principle of American democracy is rooted in the “marketplace of ideas,” a notion that public policies are best developed through the honest and open deliberation of a wide variety of ideas. But the “marketplace” has been strained of late. Our national challenges have grown more complex and the voices opining on them more numerous. From health care to energy policy to net neutrality, resolving modern problems requires more than an application of philosophy – it demands scientific literacy and an understanding of our national scientific apparatus.

Unfortunately, instead of facilitating discourse there are many who are content to muddy the waters. One of the worst offenders is conservative radio talk show host Rush Limbaugh. During his June 22, 2011 edition of The Rush Limbaugh Show he spoke once again on one of his “pet peeve issues,” climate change. Limbaugh, who has long rejected the consensus scientific conclusion that Earth’s climate is changing and that human beings are responsible, was offering a new explanation for climate scientists’ behavior.

“They’ve been paid,” Limbaugh argued. “Their entire lifestyles, their standard of living depends on their grants that they get to conduct the studies, and they only get the money if they come up with the right result.”

One might be willing to dismiss such an inflammatory statement as isolated bloviation from one of media’s biggest loudmouths, if only it were an isolated incident. It is far from that. Similar statements have been made by authors, pundits, politicians, and even a handful of disgruntled scientists. In a speech to New Hampshire businessmen, former Texas governor and Republican presidential candidate Rick Perry echoed Limbaugh’s remarks referencing “a substantial number of scientists who have manipulated data so that they will have dollars rolling in to their projects.”

Statements such as these are not only slanderous, they are dangerous. Climate change is one of the greatest global challenges of our generation. It promises to deliver a warmer climate, droughts, floods, food and water scarcity, rising sea levels, and the death of 25-50% of Earth’s species (just to name a few) if not properly mitigated.

It is for these reasons that the profoundly misleading assaults on scientists’ basic integrity are so worrisome. The need to restore public faith in our scientific institutions warrants a substantive clarification about both the roles scientists play in society and the actual manner in which their research is funded.

In general, there are two classes of scientist – public and private. Public climate scientists are employed by government institutions like NASA and the National Oceanic and Atmospheric Administration (NOAA). NASA’s premier climatologist, Dr. James Hansen, explains how public scientists are compensated saying, “Our salaries do not depend on how much research the government funds. Government scientists get paid for working 40 hours a week, regardless of how long they work.”

Furthermore, to guard against politically motivated terminations, public scientists receive considerable protection from being fired. In such an environment scientists have little to fear from publishing results that cut across the grain, since neither their compensation nor their job security depends on it.

Private climate scientists, on the other hand, are often employed by universities and must actively seek their own research funding.  One common source is America’s collection of federal science agencies. There are many, but one of the most prominent is the National Science Foundation, an agency which supports about 20% of all federally funded basic research conducted in US universities.  Its funding process is typical of agencies of this kind, so it is worth examining its appropriations process in greater detail.

Scientists apply for research grants by first submitting a research proposal.  According to NSF criteria, successful proposals must demonstrate that their prospective research be of high academic quality, have high and hopefully broad significance, and preferably be transformative.  Proposals are merit-reviewed by a panel of independent experts in the field and the top submissions receive grants to continue their work.  This process is highly competitive.  Of the approximately 45,000 proposals received each year, the NSF only funds about 11,500.

One noteworthy observation is that a plausible alternative to the theory of human-driven climate change satisfies all of these criteria.  According to the National Academy of Sciences, between 97% and 98% of climate scientists actively publishing in the field currently agree with the conclusion that global climate change is occurring and is caused by human activity. Clearly, a plausible alternative would constitute a great scientific advancement, one which would likely have ramifications beyond climate science itself.  So not only are “climate skeptics” not penalized in the grant process, if their proposals demonstrate legitimate scientific merit they might actually receive preferential treatment.

There are other factors that weigh in a climate skeptic’s favor. First, any scientist who can debunk a scientific paradigm (as Einstein did with his general theory of relativity) in favor of a better theory will earn prestige and a likely place for his name in science textbooks.  This is a huge incentive to challenge the status quo.  Second, if a professor has tenure, then he needn’t fear reprisal from his employer for conducting controversial research.  Third, because review panels are comprised of a broad selection of experts, one can expect a representative plurality of opinions to be held by appropriators, which mitigates consensus groupthink.  Fourth, scientists are skeptical by nature.  They assume their knowledge is incomplete and are always acting to refine it. Scientists will tell you that one of the most exciting events for them is when an experimental result completely defies theoretical expectation.  It is in these moments that new truths are often revealed.  Scientists yearn for these moments. They do not penalize the search for them.

The final point I’ll make about the public grant process is simple common sense.  It’s functionally impossible for allocators to only fund “pro-climate change” research when the results of that research are unknown until it is conducted.  And even if you suspect incoming research proposals must tacitly accept anthropogenic global climate change a priori, meta-publication data gathered by Skeptical Science, an organization dedicated to explaining peer-reviewed climate change research, reveals that approximately half of climate research papers do not explicitly endorse the consensus opinion, but rather function primarily as fact-finding missions.  Those missions in total have created the consensus opinion, but scientists did not have to assume it before receiving their funding.

The other method by which private scientists obtain research support is by courting private donors and corporations who have a vested interest in it.  For lots of basic research, this process of pitching for funds is a huge hassle.  As the Microsoft computer scientist and Turing Award winner Jim Gray once put it, “Sometimes you have to kiss a lot of frogs before one turns into a prince.”

Except in certain cases the prince comes to you. Mitigating climate change requires a reorganization of large sectors of our economy. Consequently, corporations that stand to suffer financially in the transition have a strong incentive to spread disinformation themselves or fund others willing to do so.

In such cases, the exact opposite of Limbaugh’s argument is proven true. Scientists willing to research alternatives to anthropogenic climate change often receive funding because they reject the consensus opinion. In fact, an analysis of the 900 papers the Global Warming Policy Foundation cited in support of climate change skepticism found that nine of the ten most prolific authors were linked to ExxonMobil.

As Dr. Hansen argues, “Perhaps, instead of questioning the motives of scientists, you should turn around and check the interests (motives) of the people who have pushed you to become so agitated.”

Once the public understands the true manner in which climate science is funded, it will ultimately need to ask itself which is more likely – that A) 97% of all active climate scientists have independently come together to collectively pull the wool over the world’s eyes and perpetrate the greatest scientific hoax of all time for unclear motives or B) moneyed interests like oil and coal companies who stand to lose profit in a world that addresses climate change are spreading doubt and disinformation as a means to forestall action.

Given the current state of media in the United States, the condition in which we find ourselves is not altogether surprising. Thinner margins have driven many newspapers and other news outlets to lay off dedicated science reporters. In the era of the 24-hour news cycle, ratings reign supreme and viewers are more likely to tune into conflict and controversy than a nuanced discussion of the facts. Even when climate science is given the coverage it deserves, the media will often mistake journalistic balance for “hearing all sides of an issue.” Granting climate skeptics equal air time with members of the 97% majority is akin to presenting the opinions of an Auschwitz survivor alongside someone who argues the Holocaust never happened.

Ultimately, it will fall upon scientists to lift the haze of misunderstanding that surrounds their work. They will need to be more vocal in communicating not just the science, but the process of practicing science. Only when the public gains an understanding of the scientific process will the baseless claim of Limbaugh and his sympathizers be exposed as the myth that it is.

 

Featured image: “Dollar Sign in Space – Illustration” by DonkeyHotey, used under CC BY 2.0 / slightly modified and black borders added to original

Climate, Energy, Politics

FOIA – We Are Making Progress

This is the final part of a 5-part series on the government’s silence of science and the Freedom of Information Act (FOIA).  Parts 1 through 4 can and should be read first:

Part 1: The Kingston Disaster
Part 2: The Government’s Silence of Science
Part 3: Freedom of Information Act to the Rescue?
Part 4: The Obama Failure

In brief, these articles describe how scientific research gathered by the United States government is often withheld from the general public, a type of action that can quite literally put lives at risk.  The Freedom of Information Act (FOIA) was passed to allow public access to these records, but both the George W. Bush and Obama administrations have so far failed to live up to the promise of the act.

But while there have been substantial challenges with gaining access to important public information, it’s not all doom and gloom.  The fact that we actually have a Freedom of Information Act with an appeals process and judicial review is significant.  The Act continues to have strong support in the NGO community.  A FOIAonline portal has been built with the goal of eventually becoming a one-stop shop for public information.  The Obama administration has taken a strong positive step at Data.gov to “increase public access to high value, machine readable datasets generated by the Executive Branch of the Federal Government.”  This initiative has already saved on infrastructure costs.

And we have had disclosure successes.  In 2008 the United States improved the Consumer Product Safety Act and created a searchable database for consumer information.  The National Oceanic and Atmospheric Administration’s (NOAA) National Climatic Data Center and EPA have done an admirable job of reporting on historical climate variables like temperature, precipitation and drought.  The US Embassy in Beijing has made electronic reports of air quality public when the Chinese government refused to do so.  The federal ENERGY STAR program labels the energy footprint of appliances to aid consumers in making more energy efficient purchases.

Inside federal agencies, it would appear that some progress is being made.  In 2013 the Union of Concerned Scientists (UCS) released a report entitled Grading Government Transparency in which they examined the ability of scientists at federal agencies to speak freely about their work.  They found that many agencies’ media policies “have shown significant improvement since 2008.”  In particular they note that scientists can now more easily apply their right to express personal views provided they make clear that they are not speaking for their agency.

This right was made considerably easier to exercise when on November 13, 2012, after an arduous 14-year journey, Congress unanimously passed the Whistleblower Protection Enhancement Act.  This act, for the first time, provides specific legal protection to scientists and other federal employees who expose censorship or suppression of federal research.  According to Celia Wexler of UCS, “We hope that this law will begin a process to change the culture of federal agencies when it comes to whistleblowers. People who protect the public from unsafe drugs, tainted food, defective products, and environmental hazards should not fear for their jobs when they speak up for safety and scientific integrity.”

Since then, other steps have been taken to make it easier for the public to obtain government information.  On May 9, 2013 President Obama issued an executive order making open and machine readable data the new default for government information.  Citing examples like weather data and the Global Positioning System (GPS), the president argued that making federal data freely available “can help fuel entrepreneurship, innovation, and scientific discovery – all of which improve Americans’ lives.”

Then, on February 25, 2014 the US House of Representatives unanimously passed the FOIA Oversight and Implementation Act.  This amendment to the Freedom of Information Act would create a single, free website from which all FOIA requests could be made.  When requests are granted, federal agencies would have to release the information in an electronic and publicly accessible format.  When requests are denied, the appeals process would be streamlined.  The amendment also forces federal agencies to take greater responsibility for their FOIA obligations.

As we see, the system can work.  But there will always be disagreements between the public and federal agencies regarding which information should be disclosed through FOIA and which should be withheld for security reasons.  When public actors feel their claims have been rejected unjustly, they can always consider seeking subpoenas.

Absent that, there are other options at their disposal to extract greater value out of the information that is public.  Private technology companies can offer tools for the sharing and analysis of data.  Librarians can play a more prominent role in gathering and organizing documents.

When the information being disseminated is incorrect, knowledgeable scientists should take action.  They can start issue blogs and connect with members of the media.  Local groups like city councils rarely hear from scientists, so researchers can have an outsized impact on regional issues.  As members of one of the most respected professions, scientists would do well to build relationships with congressional representatives or their science staffers.  Failure to act means allowing dissembling voices to fill the vacuum.

With respect to government disclosure, as with most things, the situation is neither entirely good nor bad.  But it is hard to deny that at times we Americans live in a perverse, ironic ecosystem – one in which taxpayers fund government research designed to inform and protect, only to have that same government deny us the results and claim it’s for our protection.  We must continue to hold our government accountable, push for transparency where appropriate and never yield to private interests who would use our ignorance against us.

Climate, Politics

My Baltimore Sun Op-Ed Has Been Published

I encourage everyone to check it out by clicking here. According to the Baltimore Sun’s publishing rules, they maintain 30-day exclusivity rights over the piece, so I won’t be able to post it on my website until mid-January.

This opinion piece, which was months in the making, highlights the extreme lack of scientific expertise in the halls of Congress. It gives examples, lists negative impacts and demonstrates how having at least some scientists in charge would be beneficial for the entire country. It will come as no surprise to anyone who knows me that climate change takes center stage in my argument.

Now here’s a little inside baseball. I’ve actually been pretty successful in getting my opinion pieces published in newspapers. I had my two previous letters to the editor (one on offshore drilling and the other on Keystone XL) published in the Baltimore Sun and one of those also in the NJ Star Ledger. Basically I was batting a thousand until this article.

So then a few weeks ago I submit a version through the Baltimore Sun website and hear nothing for like two weeks. Of course I’m thinking that they chose not to run it. So one morning I just decide to rewrite the entire thing. I kept certain phrases, but it was a total reorganization and shifting of the thesis. After bouncing it off some people, I resubmitted.

About three days later I get an email from the Sun’s deputy editorial page editor. She thought my first submission was interesting and well put together, but my contact info was cut off and she couldn’t respond! When I sent the second article, she recognized it as a variation of the first, got my phone number and email and we went ahead. In the end, the version you see here is a marriage of those two drafts.

I plan to extend this into a longer form article in the near future. After all, there’s a lot more than 750 words to say about this topic. I’m still trying to figure out where I can get it published (Science? Scientific American? American Physical Society News?). I’ve never done this before, so I welcome any advice.

Climate, Energy

Johns Hopkins Feels the Power with Its Cogeneration Plant

Tucked away at the bottom of a small hill in a distant corner of Johns Hopkins University’s Homewood campus is a large brick building.  Metal pipes protrude horizontally from its side before diving perpendicularly into the ground.  Its tall, curved top windows, rooftop smokestack and mysterious purpose are vaguely reminiscent of the factory from the classic 1971 film Willy Wonka and the Chocolate Factory.  And much like the Wonka factory, no student ever goes in and no student ever comes out.

This building is the Homewood Power Plant, the facility responsible for providing electricity, heating, and cooling to the Homewood campus in central Baltimore City.  As part of Earth Week @ Johns Hopkins, the university’s Department of Facilities Management granted me a walkthrough to learn exactly what happens within.

The JHU Power Plant plays the same role as an electric utility’s generation station.  Fuel goes in and electricity comes out.  BGE, Maryland’s electricity provider, achieves 35% efficiency in this process.  What this means is that for every 100 units of fuel energy that go in, 35 units of electricity come out.  The remaining 65 units are expelled as waste heat through the Chesapeake Bay Cooling Tower.

Principles of thermodynamics and engineering limitations make it difficult to achieve higher efficiencies.  This is unless, of course, you manage to reclaim that waste heat for something useful.  This is where Homewood’s Cogeneration Facility comes into play.  Cogeneration (also known as combined heat and power) is a process in which both electricity and useful heat (steam) are produced simultaneously.  The 65 units of waste heat, which would otherwise be discarded, are diverted to a waste heat recovery system.


Here are the basics.  Waste heat in the form of steam exits the primary generator at temperatures around 1000 °F.  The recovery system takes that steam and pumps it around the campus, eventually taking the form of building heat, hot water, and energy to feed Homewood campus’s four chilling plants.  All in all, the waste heat recovery system is able to wring about 45 extra units of steam energy from the 65 units of waste, leading to an overall plant efficiency of 75-85%, approximately double what BGE could provide.  (For you wonks out there, the total electric capacity of the system is 3.8 × 10⁷ kWh per year and total steam capacity is 210 × 10⁶ pounds per year.)
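
To make that arithmetic explicit, here is the overall efficiency as a worked equation, using the unit counts from above (the 45 steam units are the approximate recovery figure):

```latex
\eta_{\text{overall}}
  = \frac{E_{\text{electricity}} + E_{\text{recovered steam}}}{E_{\text{fuel}}}
  = \frac{35 + 45}{100}
  = 0.80 \approx 80\%
```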


“Cogen uses less energy to make electricity and heat and our process produces less greenhouse gases,” said Ed Kirk, a university energy engineer.  “Cogen reduces our energy use in one big chunk.”

The 4.2MW Power Plant, which opened in June 2010, is part of JHU’s plan to reduce its greenhouse gas emissions by 50% in the next 12 years.  “Our approach has us looking at energy efficiency, energy conservation, more sustainable energy choices and renewable energy,” added Mr. Kirk.


If this system seems like a no-brainer, then why doesn’t BGE do it itself?  The problem is that piping steam over long distances is an incredibly inefficient operation.  Much of the steam’s heat will be lost in the process.  That’s not to say utilities don’t do it.  It’s just that circumstances have to be right.  For example, Con Edison, New York City’s utility provider, uses a collection of cogeneration plants to heat around 100,000 densely packed buildings in Manhattan.  This explains the familiar image of steam rising through city sewer grates.  It also explains why cogeneration plants must be closely located to end-users.

Closer proximity also lessens transmission and distribution losses over power lines and provides greater security in unstable energy markets.   The high energy requirements and dense arrangements of college campuses, hospitals, military bases, etc. make them attractive candidates for this type of technology.

The Homewood cogeneration unit joins a small handful of others in the region.  Two have been installed at Johns Hopkins Hospital and a couple others are at the University of Maryland at College Park and Mercy Medical in central Baltimore.

The limiting factor prohibiting everyone from installing cogen plants is cost.  The Homewood campus was fortunate in that it had a large, unused space directly above one of its on-site combustion turbines.  The facility cost $7.5 million to install and will pay for itself in energy savings in about seven years.


Despite the plant’s net energy savings, this project was not always economically viable.  It was Maryland’s deregulation of electric utilities in 1999 that turned the tide.

“When electricity was deregulated, electricity prices rose,” said Craig Macomber, Chief Engineer at the Power Plant.  “At the same time, natural gas prices were falling.  These two conditions made our project feasible.”


The Homewood campus remains reliant on electricity purchased from BGE.  According to Mr. Macomber, when all things are considered the total amount of Homewood’s energy generated by the power plant is 20% in the summer and 30% in the winter.

Increased energy efficiency will lower the university’s carbon footprint by 8,650 metric tons per year, important for mitigating global climate change.  This falls in line with the goals of the JHU Office of Sustainability, whose stated mission is “to make Johns Hopkins University a showpiece of environmental leadership by demonstrating smart, sensible, and creative actions that promote the vision of sustainability.”


Climate, Energy, Politics

Climate and Energy Primary Sources

There are tons of organizations that have done research related to energy, climate and policy.  Over the years I’ve aggregated a (non-comprehensive) list of those agencies.  If you would like to suggest additions, either post them in the comments or tweet them to me @mspecian and I’ll update the list.

Advanced Energy Economy

Alliance for Climate Protection

American Council for an Energy Efficient Economy

American Energy Innovation Council

American Security Project

American Wind Energy Association

Better Buildings Neighborhood Program from DOE

Bloomberg

Bloomberg New Energy Finance

Bureau of Labor Statistics

Carbon Tracker Initiative

Center for American Progress

Center for Climate Strategies

Center for Investigative Reporting

Chinese Renewable Energy Industries Association

Clean Energy Ministerial (CEM)

Climate Nexus

Clinton Foundation

CLIVAR

Consultative Group on International Agricultural Research – Climate Change, Agriculture, and Food Security

Cooperative Institute for Climate and Satellites

E3G – Change Elements for Sustainable Development

EcoGeek.org

Economics and Equity for the Environment (E3)

Economic Outlook Group

Energy Information Administration

Energy Self Reliant States

Environmental Defense Fund

Environmental Protection Agency

Environmental Research Letters

Environment America

EUMETSAT

European Wind Energy Association

Federal Energy Regulatory Commission

Forecast the Facts

Friends of the Earth UK

German Association of Energy and Water Industries (BDEW)

Global CCS Institute

Global Warming Policy Foundation

Google Earth Engine

Green Scissors Project

Greenwire

GTM Research (an energy consultancy)

The Guardian

Hart Research

Institute for Local Self Reliance

Institute of Public and Environmental Affairs (in Beijing)

Insurance Information Institute

International Council on Clean Transportation

International Research Institute for Climate and Society

InVEST – Integrated Valuation of Ecosystem Services & Tradeoffs

ITIF

Lawrence Berkeley National Laboratory

League of Conservation Voters

Major Economies Forum

MIT Joint Program on the Science and Policy of Global Change

Munich Re’s Geo Risks Research

MyCity+20

National Center for Atmospheric Research

National Drought Mitigation Center

National Latino Coalition on Climate Change

National Oceanic and Atmospheric Administration (NOAA) – National Climatic Data Center

National Renewable Energy Laboratory

National Round Table on the Environment and the Economy

National Wildlife Federation

Natural Resources Defense Council

The Nature Conservancy

North American Electric Reliability Corporation

Northeast Energy Efficiency Partnership

Pecan Street – R&D on advanced technologies, advanced energy systems, and human interactions with them

Reuters

Rocky Mountain Institute

Safe Climate Campaign

Scott Polar Research Institute at Cambridge University

Sierra Club

Solar Energy Industries Association

Southwest Climate Change Network

Surface Ocean Lower Atmosphere Study (SOLAS)

Union of Concerned Scientists

United Nations Environment Programme (UNEP)

University Corporation for Atmospheric Research

US Climate Change Science Program

US Defense Department

US Energy Information Administration

US Global Change Research Program

US Historical Climate Network (USHCN)

National Snow and Ice Data Center (NSIDC)

US Transportation Department – Pipeline and Hazardous Materials Safety Administration

Visual Carbon

World Climate Research Programme

World Resources and Environmental Law

World Resources Institute

Zero Emissions Platform

Journals

Environmental Research Letters

Journal of Climate

Journal of Geophysics Research

Geophysical Research Letters

Nature Geoscience

Astrophysics, Climate, Energy, Personal, Politics

Articles Archive Added

I recognize that my website is sorta oddball. “Serious” articles on topics like green development in Africa, sustainability and climate are interspersed with professional wrestling results, games and personal photography. This motley assortment of content precludes this site from being a pure issues blog. While I have considered going in that direction, I built mikespecian.com to be a reflection of me along multiple dimensions. So for now I intend to keep it as is.

With one exception. I have added a link entitled Articles to my main menu. This will be the one-stop shop for everything I have written and will continue to write on topics such as climate, energy, politics and science in general. Thank you all for reading!

Climate, Energy, Politics

My Silver Bullet for Solving the Energy Crisis

In the course of traveling through life, I occasionally intersect with others as passionate as I am about our world’s climate and energy crisis.  I love to pick people’s brains and most of the time I can’t stop myself from asking them, “If you had one silver bullet policy in your pocket that you could implement today, what would it be?”

I have received responses ranging from “sign the Kyoto Protocol” (which I perceive as small beer) to “remove corporate money from politics” (which, while probably the correct answer, is wholly unrealistic).

Through these discussions, I believe I have settled (at least for today) on an answer of my own: “promote international development through green growth.”  At a time when economic concerns drown out calls for foreign aid, I’m reminded of the saying, “The cleanest power plant is the one you never have to build.”  And nowhere is the need for new power as acute as in the developing world.

For some, a Third World green intervention seems like a misallocation of limited resources.  Why not just let them build a bunch of coal plants?  For others (me included), this need provides real opportunity.  In locations where firewood is the primary sustainable resource, intelligent green investment can be sustainable in its own way – through profitability.

But with hundreds of international initiatives underway to support green growth, it’s easy to be paralyzed by the sheer number of options.  What are the key strategies?  Who’s doing what well?  Where is there room for improvement?

In the United States, we look to Silicon Valley as the model of an innovation ecosystem.  It is there that raw talent, research capability, and venture capital’s business-building power converge to create the planet’s premier environment for the generation of new products and wealth.  While Silicon Valley itself has shown little interest in the developing world, their model remains a gold standard and its strategies are easily transferable.

Nurturing talent must start with education.  The status quo of having one professor teaching standard courses to 1000 students will not get the job done.  Training students in the basics is key, but education needs to become less abstract and more vocational.  Let brewing beer be a study in chemistry.  Let cows be a study in biology.  If HP cannot offer copying equipment to parts of Africa due to a lack of qualified technicians, as was recently the case, teach technology to match the need.

Then, for research to be effective the world must work together.  China and the United States are behemoths, and science agencies like the US’s National Science Foundation offer much in the way of support.  Africa, however, is challenged by having 45 separate, smaller science foundations.  Regional agencies must be formed to bring these groups together.  If Rwanda relies solely upon its own scientists, it’s going to miss 99% of knowledge generated elsewhere.

Consider General Electric’s ecomagination, an enterprise they describe on their website as “GE’s commitment to imagine and build innovative solutions to today’s environmental challenges while driving economic growth.”  Thus far, their research has proven capable of meeting global needs like lowering carbon emissions, increasing energy efficiency, developing/deploying wind and solar, and maximizing water conservation.  GE possesses massive resources, benefits from economies of scale and has a global presence.  There’s still plenty of room for improvement, from geothermal investments in Indonesia to new public transport systems in Central America and Asia.

But while technology is the glue between green and growth, solving the R&D problem alone doesn’t mean you have a competitive product.  It certainly doesn’t guarantee a valid business model, nor is it necessarily scalable.  For instance, a company the size of GE is not optimized to sell solar panels to villages one at a time.

So while nations like Burundi will seldom outperform the science team of a company like GE, that shouldn’t be their role.  Developing nations are much better positioned to understand their own needs, constraints and goals.  Perhaps they can host franchises that spin-off First World tech to deploy on village-sized scales.  Then, the smaller region’s needs can spur local innovations of First World “big box” technologies.

For example, to process coffee, beans must be washed, hulled, polished, sorted, etc.  A developing nation relying on its own technology will be priced out of the market by big box technology that scales.  But since the final coffee product depends keenly on the details of the processing method, innovations of big tech at local sites can provide an end product neither the First nor Third Worlds could have achieved entirely on their own.

However, research and business can only do so much.  If conditions on the ground are not fertile for green growth, roots won’t take hold.  Electricity cannot be transported if the government fails to maintain electrical wires.  If the state heavily subsidizes coal or oil, green technologies competitive in a free market won’t survive in a rigged one.  Without patent protection and sharing of intellectual property, tech transfer will not occur.  Agencies like the World Bank can be coaxed into giving their assistance, but they rarely lead.  The groundwork must first be laid by gathering global support for investment, e.g. by connecting principal investigators in neighboring countries or by getting the World Bank to fund distributed solar (perhaps by crowdsourcing) in developing markets.

Many of these issues will be discussed in June at the Rio+20 Conference in Rio de Janeiro, Brazil.  If representatives can figure out how to link regional science foundations, introduce researchers to businesses (venture capital-style) and direct First World technology to Third World innovations, this might be the silver bullet most worth firing.