The Personal Website of Mike Specian

Category Archives: Climate

Buildings, Climate, Energy, Research

Energy Efficiency Can Save Lives

Mike Specian – May 1, 2020

In 2017 during Hurricane Irma, a tree branch hit a transformer and knocked out the main air conditioning system for a nursing home in Hollywood, Florida. There were portable air conditioners onsite, but they proved insufficient as indoor temperatures rose to a sweltering 100°F over the course of a three-day outage. Ultimately, 12 residents tragically lost their lives to the extreme heat.

Sadly, the conditions that precipitated this disaster are all too common, and are poised to become more so. Both the number of heatwaves in American cities and their duration have been increasing for decades due to climate change. In New York City alone, extreme heat claims 120 lives annually, and 80% of those deaths occur in people’s own homes.

To examine this issue, I co-authored a study in the journal Building and Environment with Drs. Kaiyu Sun and Tianzhen Hong, researchers at Lawrence Berkeley National Laboratory. Among the many options buildings have to improve thermal survivability during heat waves, power outages, and associated events, could energy efficiency play a role in keeping occupants safe? According to our results, the answer is yes.

It is well established that energy efficient buildings can save customers money, reduce greenhouse gas emissions, improve health and well-being, and make occupants more productive. Yet the extent to which efficiency impacts our ability to adapt to, withstand, and rapidly recover from heat-related disasters – i.e., to become more resilient – has been less well studied. And many of the studies that do examine the issue consider only hypothetical buildings.

However, in the case of the Hollywood nursing home, we had a real-world example of a failed building. Weather data during the outage were available, as was a measured indoor temperature. Dr. Sun gathered the actual building data (e.g., floor plans, building components, renovation history) through publicly available records and recreated the nursing home in EnergyPlus modeling software.

3D model of The Rehabilitation Center at Hollywood Hills

Detailed floor plans of the first (upper) and second (lower) floors. Patient rooms are colored based on the number of occupants.

We introduced a variety of energy conservation measures, such as increased insulation, shielding the windows and roof with aluminum foil, applying a cool roof coating, reducing air infiltration, adding exterior shading, turning off miscellaneous electrical loads (e.g., lighting), and adding natural ventilation.

We discovered that most measures would have reduced the indoor heat index, a metric that accounts for both temperature and humidity, thereby enhancing thermal survivability. In addition, we found the most effective measure – simply opening the windows – would have cost nothing at all. However, some efficiency improvements were less beneficial than others, and one actually would have had a negative impact on resilience. Moreover, if we placed the same nursing home in a heat wave in a different city, the set of most effective measures would have changed.
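For reference, the heat index can be approximated with the National Weather Service's Rothfusz regression. The sketch below is a simplified version of that formula (the NWS applies additional adjustments at low humidity and near the regression's boundaries, which are omitted here); it is illustrative only and not the calculation used in the study itself.

```python
# Approximate NWS heat index (Rothfusz regression) in degrees Fahrenheit.
# Valid roughly for T >= 80 F and RH >= 40%; the NWS applies further
# adjustments near the edges of that range, omitted in this sketch.
def heat_index(temp_f, rel_humidity_pct):
    T, RH = temp_f, rel_humidity_pct
    return (-42.379 + 2.04901523 * T + 10.14333127 * RH
            - 0.22475541 * T * RH - 6.83783e-3 * T**2
            - 5.481717e-2 * RH**2 + 1.22874e-3 * T**2 * RH
            + 8.5282e-4 * T * RH**2 - 1.99e-6 * T**2 * RH**2)

# At 96 F and 65% relative humidity the air "feels like" about 121 F,
# well inside NOAA's "danger" zone.
print(round(heat_index(96, 65)))  # -> 121
```

Humidity is the reason the metric matters: the same 96°F afternoon at 40% relative humidity yields a heat index near 101°F, a substantially lower hazard level.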

Temperature comparison between one of the hottest patient rooms (solid line) and the outdoor environment (dotted line). Vertical lines represent the onset and end of the cooling outage incident.

Box plot containing the temperatures of all patient rooms over the course of the outage. Heat hazard classifications follow the National Oceanic and Atmospheric Administration. Occupants in rooms in the “danger” zone would likely experience “heat cramps and heat exhaustion,” with “heatstroke probable with continued activity.”

I’ll quote the paper itself:

Our analysis generated three high-level takeaways. First, energy efficiency is not uniformly beneficial for resilience, as different efficiency characteristics convey different resilience impacts. In particular, we found that reduced air infiltration—a staple of modern energy efficiency practices—actually made it more difficult for the nursing home to expel excess heat when indoor air temperature was higher than it was outdoors. And it would have, on its own, increased the heat index beyond the status quo. Second, the effectiveness of specific energy efficiency measures varied as a function of circumstance. By transplanting the Florida nursing home to Chicago and San Francisco during real heatwaves, we found that the value of individual measures varied as a function of multiple parameters, including climate zone, outdoor temperature, length of air conditioning outage, insolation, and local building codes. Third, the most effective efficiency measures were also the least expensive to implement. This encouraging result indicates that low- to no-cost measures could potentially be deployed in buildings in near-real time to enhance passive survivability by allowing residents to shelter-in-place.

The vertical axis represents the percentage of all room-hours with temperatures falling within each hazard classification zone. Our best estimate for what actually happened inside the nursing home is shown in the left column. The most successful individual measure, natural ventilation, reduced the percentage of room-hours in the “danger” zone from 32.3% down to 1.2%. Tightening the building envelope (right column) increased it to 36.4%.

I should caution that while many of these efficiency measures reduced the danger for occupants, they did not on their own make the building safe. Some additional form of energy storage was needed for that, though even 8 hours of chilled-water cooling capacity would have been sufficient to keep most room-hours within the “safe” zone for the majority of the outage.

I encourage those interested in expanding the value of energy efficiency, and those eager for new pathways to enhance resilience, to refer to this study. It demonstrates not only a real connection between energy efficient buildings and thermal survivability, but also that this nexus is nuanced and ripe for further exploration.

Buildings, Climate, Energy

What To Prioritize – Retrofits or New Construction?

Mike Specian – July 3, 2019

Buildings in the United States consume about 40% of all energy and 75% of all electricity. Attempts to decarbonize our economy necessarily run through buildings. However, we have limited resources and what to prioritize is not always obvious. We could try to retrofit existing, inefficient homes. Or we could focus on new construction that is built efficient from the ground up. To explore this issue further, I recently moderated a debate on the following resolution:

When the two come into conflict, the federal government ought to prioritize resources for retrofit programs over new construction programs.

While debaters on both sides of the resolution agreed on the importance of improving the energy efficiency of both existing and new buildings, the competition of ideas led to a lively discussion about our nation’s research priorities and relationship with industry. The question is, undoubtedly, a complicated and multifaceted one. Therefore, we invited a representative from each side to share abridged versions of their arguments. Rhett will advocate for retrofits, and Newton for new construction.

If we could only choose one, should we focus on retrofit programs or new construction programs?

Rhett: The answer is retrofits. There are 118 million existing homes in the U.S., plus another 5-6 million commercial buildings. Over half those buildings are at least 40 years old, and they are generally very inefficient. You can save 20-30% of energy usage through simple interventions, and well beyond 50% if you improve the envelope. They represent a significantly larger opportunity for energy savings.
Newton: I hate to disagree, but the answer is new construction. We acknowledge that there are many, many more existing buildings than new buildings. However, there are about 1 million new homes built each year. And 33% of the homes that will exist in 2060 have yet to be built. If we do everything we can to make new construction more efficient, we will have addressed 33% of the entire market right there.

But we’re talking 124 million existing buildings compared to only 1 million new buildings per year. The opportunity presented by retrofit programs seems pretty overwhelming, no?

Newton: It’s not just about the number of buildings. We have to consider which type of buildings we can most effectively impact. Getting energy efficiency into new construction is far easier than into existing homes.
Rhett: I agree that getting into existing homes is more difficult, but that doesn’t diminish the opportunity. Much more funding is still put into remodeling and simple energy equipment replacements each year, and we can build on that.
Newton: But the real question is where will those solutions come from? Where will they be developed, honed, and readied for the existing home market? That is much more challenging, and I would submit that all the innovations that spill into the retrofit market are coming from progress in new construction.

Why do you think the majority of innovations are being developed in new construction?

Newton: There are two reasons. The first is customer demand. New efficient buildings have energy bills that are a fraction of those in existing homes. They are also 2-3 times more comfortable and provide greater health benefits. Once people experience that contrast, it drives demand. The second is economies of scale. It’s easier to get innovations into new construction, so this is what drives the market for energy efficient technologies. Afterwards they work their way downstream into the retrofit market.
Rhett: Except we generally still see insufficient innovation in new construction. The sector has so many actors like architects, builders, manufacturers and others, and they are just not well-integrated. Where there is room for improvement – and where other industries are further ahead right now – is automation and prefabrication. This can be done either on-site or off-site. These innovations haven’t taken off in new construction, and we shouldn’t have to wait until they do.
Newton: I couldn’t agree more about the benefits of automation, but let’s look at a great example of where it works. In Sweden, 85% of new homes are constructed off-site. This market helped improve the plans, machinery, digitized technology, and automation expertise that make their new construction so effective. This automation infrastructure then transferred to retrofits through a program called Energiesprong, which is now a world influencer in the mass improvement of existing homes. But it wouldn’t be where it is if not for new construction.

How do health and equity fit in?

Rhett: It’s incumbent on the government to make sure these retrofit opportunities are available to everyone and easy to install at a reasonable cost. A greater percentage of the older, draftier homes have higher energy bills and, unfortunately, are occupied by individuals with lower incomes who don’t always have the ability to pay for improvements. Or they might be in a tenant-based situation where the owner is just not paying for them.
Newton: I fully agree on that. I’m just making a point that the solutions to achieve better health and equity will exist because the new home market enables the scale and the technology development. Then, the improvements you need are there faster than if you started with existing homes.
Rhett: But there’s an immediacy to this issue. If you only focus on new construction, you’re going to have either really high energy bills or you’re going to have people who are really suffering because they can’t afford to turn on their heat or their air conditioning. It’s on the government to come up with ideas and options that are there for everyone and not just for those who can afford a new home.
Newton: I don’t disagree with any of that. It’s only when they come into conflict and you have to make choices that you should opt for the infrastructure, the skills, and the installation expertise that you’ll get in the new home industry and then you can translate it over to existing buildings.

Where can the federal government have the greater impact?

Newton: To echo a point Rhett made earlier, we have about 100,000 contractors in the U.S. who work on new and existing buildings. That level of fragmentation makes it impossible to innovate and develop new solutions for industry. We have found that publicly-owned builders invest less than 0.1% in innovation R&D, as compared to 4% for non-agricultural corporations. The only way we get innovation is through high-performance product manufacturers. The Department of Energy’s Building America program fills a huge gap in developing innovations, validating them in the field, and building consumer interest. Given the absence of investment in a fragmented industry, what we do in our nation’s new construction programs is vitally important.
Rhett: As Newton alluded to, industry does to some degree put money into product manufacturing because they want to continue to sell upscale versions of their technologies. But very few, if any, are putting money into tackling existing buildings in a wholesale manner. Right now cities are dealing with energy, environmental, and equity challenges. They realize they need to address problems in buildings that people are currently living and working in. The federal government is in a unique position to aggregate the interest in this area. It can push academia, the national labs, and industry to focus their ingenuity on retrofits. Together they can help retrofits be quick, attractive, easy to deploy, and affordable. That’s just not something that will happen on its own.
Buildings, Climate, Energy

Buildings – A New Hope to Solve Climate Change

Mike Specian – November 24, 2018

Addressing climate change requires two approaches – mitigation of emissions, and adaptation to its impacts. In this PechaKucha talk, delivered as part of the American Association for the Advancement of Science’s Visualizing Science Policy 20×20 event, I lay out the case for how buildings are a critical – if sometimes forgotten – part of the solution. I invite you to watch the talk on YouTube, or read the transcript below.

********

Last September during Hurricane Irma, a tree branch hit a transformer and knocked out power to the air conditioning system of a nursing home in Hollywood, Florida. There were portable air conditioners on site, but they were insufficient, and temperatures rose to a sweltering 110 degrees Fahrenheit. By the time emergency responders realized the scope of the problem, 12 residents had tragically lost their lives.

It was situations like these that compelled me a few years ago to set aside my career as an astrophysicist and devote my attention towards the phenomenon that’s making extreme events like Hurricane Irma more intense – and that’s climate change.

Through this AAAS Fellowship I’ve had the privilege of working with the Department of Energy’s Building Technologies Office. And when I told people I’d be going to BTO they’d say…”Why? I thought you were concerned about climate change.” And I’d have to tell them that in the United States buildings consume about 40% of all energy and 75% of all electricity. So if BTO could achieve its mission of making building technologies more energy efficient, not only could we create jobs and save tens of billions of dollars a year for Americans, we could also cut out a significant chunk of our greenhouse gas emissions, and begin to mitigate this massive problem.

So I’m going to put my salesman’s hat on for a second and sell you on two energy efficiency success stories. Number 1! This [pointing towards slide] is what refrigerators used to look like – bulky, ugly, expensive energy hogs. But through advances in technology refrigerators have more available space, yet only consume 1/4 the energy, cost 1/3 the price, and allow you to watch cable news right on the refrigerator door!

Number 2! Clothes dryers, which consume about 1% of energy in the U.S., largely by heating and evaporating water. But right now researchers at Oak Ridge National Laboratory are working on ultrasonic drying technology in which a rapidly vibrating membrane atomizes the water, which can then be siphoned off as a cool mist. If we can bring this to scale you will be able to dry your clothes in half the time with 1/5 the energy. And I’ve already got the slogan: The Ultrasonic Clothes Dryer – Taking your sock drawer, to Mach 4.

But mitigating emissions goes beyond just using less energy. It’s about using the right kind of energy. Every so often I’ll run into a young idealist who will say, “We need to go 100% renewable energy! More wind! More solar!” But our electric grid must balance generation and demand in real time. And while I admire the idealism, how do we meet demand when the sun stops shining and the wind stops blowing?

There is a new device that’s made its way into about half of all buildings, and that number is rising. That device is the smart electricity meter. And what’s unique about it is that it enables utilities to send signals to buildings.

I want you to imagine the hottest day of the year. People are getting off work, driving home, and what’s one of the first things they do when they walk in the door? They turn on their air conditioners at the same time. These tend to be the highest demand hours of the year, and the grid has to be overbuilt to accommodate them. It would be like building a 100-lane highway just to accommodate Thanksgiving Day traffic. It’s great for a few hours per year, but then we have to pay to build and maintain all that infrastructure that most of the time is being underutilized. And the more lanes of the highway we drive on, the higher the toll – or in this case the price of electricity – gets for everybody.

Now smart meters allow utilities to send signals to buildings that are like, “Hey, we’re about to have a really expensive event on our hands. If you are willing, we will pay you to reduce your demand.” And literally with the instantaneous flip of a switch, buildings help the grid balance, including instances when variable renewable energy like solar and wind suddenly become unavailable. This is known as demand response.
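The arithmetic behind a demand response event can be sketched in a few lines. In this toy calculation, every number (the load profile, the event window, the 20% shed fraction) is invented for illustration; it is not data from any real utility program.

```python
# Toy demand-response calculation: a utility signals buildings to curtail
# load during a peak window, and enrolled buildings shed a fraction of
# their demand for those hours.

def apply_demand_response(hourly_load_kw, event_hours, curtail_fraction):
    """Return the load profile after enrolled buildings shed load."""
    return [load * (1 - curtail_fraction) if hour in event_hours else load
            for hour, load in enumerate(hourly_load_kw)]

# An invented summer-day system load peaking in late afternoon (kW, hours 0-23).
baseline = [40, 38, 37, 36, 38, 45, 55, 65, 70, 75, 80, 85,
            90, 95, 100, 105, 110, 115, 108, 95, 80, 65, 55, 45]

# Event called for 4-7pm (hours 16-18), with 20% curtailment.
curtailed = apply_demand_response(baseline, {16, 17, 18}, 0.20)

print(max(baseline))   # -> 115
print(max(curtailed))  # -> 105  (the peak moves to the 3pm shoulder hour)
```

In the highway analogy, those ten kilowatts of shaved peak are lanes the grid never has to build or maintain.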

Another way to help the grid balance is by storing excess solar and wind energy, then dispatching it later as needed. Yet going 100% renewable requires a ginormous amount of storage. We can get some of it from grid-scale pumped hydroelectric energy, and some of it from electrochemical batteries.

But there’s another way to store energy – in a building’s thermal mass. So imagine that you take a liquid material and embed it in the bricks that make up the walls of your building. It’s a hot, sunny day, so using available solar energy, the grid instructs your building to turn on its air conditioning at 2pm. The liquid material freezes, and the AC shuts off at 5pm. The building then acts like a giant cooler, keeping occupants comfortable without having to consume electricity at the worst part of the day.
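Why thermal mass acts like a giant cooler can be seen in a minimal first-order RC (lumped-capacitance) model of a building envelope. This is a toy sketch with invented parameters, not a building-simulation result: indoor temperature relaxes toward outdoor temperature with time constant tau = R × C, so adding thermal mass (raising C) slows the warm-up.

```python
# Minimal first-order RC building model: indoor temperature relaxes toward
# outdoor temperature with time constant tau = R * C, where R is envelope
# thermal resistance and C is thermal mass. Parameters are illustrative.

def simulate_indoor_temp(outdoor_temps_f, start_temp_f, r_k_per_w, c_j_per_k,
                         dt_s=3600.0):
    """Forward-Euler step through a list of hourly outdoor temperatures."""
    temps, t_in = [], start_temp_f
    for t_out in outdoor_temps_f:
        t_in += dt_s * (t_out - t_in) / (r_k_per_w * c_j_per_k)
        temps.append(t_in)
    return temps

outdoor = [100.0] * 10  # ten hours of 100 F outdoor air during an outage

# Same envelope resistance, 10x difference in thermal mass.
light = simulate_indoor_temp(outdoor, 75.0, 0.005, 1e6)   # lightweight
heavy = simulate_indoor_temp(outdoor, 75.0, 0.005, 1e7)   # high thermal mass

# The heavyweight building is still around 88 F after ten hours, while the
# lightweight one has essentially reached the outdoor temperature.
```

Pre-chilling the mass (the frozen liquid in the bricks) effectively lowers the starting temperature, buying even more hours of safe sheltering before the indoors catches up with the outdoors.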

And while all of this is fantastic, even if we could go zero carbon tomorrow, so much inertia has been built up in Earth’s system that global climate conditions would continue to deteriorate for decades to come. That means more extreme weather events, and more prolonged power outages.

Now it would be great if everyone could evacuate to safe locations, but for a variety of reasons that remains impractical or impossible for far too many people. That means we need ways to help people shelter-in-place safely. And if you need buildings to maintain safe thermal conditions longer and with less energy, two of the most valuable assets are high-quality walls and windows. Combine that with network connectivity and smarter controls, and buildings will eventually be able to prepare themselves thermally and electrically when adverse conditions can be predicted ahead of time. And unlike centralized power plants or even solar panels, energy efficiency and demand response can be deployed absolutely anywhere.

Now look, I fully acknowledge that there are other resiliency strategies out there. Utilities must continue to harden our electrical distribution system, and communities should have up-to-date climate and disaster preparedness plans. But as long as climate change remains a wicked problem, every one of us, in our own capacities, is going to have to do what we can. Then maybe, collectively, we’ll get to the point where tragedies like the one in that Hollywood, Florida nursing home never have to happen again.

Astrophysics, Climate, Politics

‘Denigration of Science’ Op-Ed in Today’s Baltimore Sun

Mike Specian – April 20, 2017

This is just a quick note that I have an Op-Ed appearing in the Baltimore Sun today. I discuss how the United States has seen a slow erosion in the appreciation for and respect of science. We need to recognize this trend, and fight back by engaging with our fellow citizens on scientific topics.

Astrophysics, Climate, Politics, Research

10 Ways You Can Be a Better Advocate for Science

Mike Specian – April 20, 2017

This Saturday, marches in support of science will be held in hundreds of cities across the globe. The event should be an excellent opportunity to reinject science back into the public consciousness.

The American Association for the Advancement of Science (AAAS), the world’s largest general scientific society, held an event on April 19 offering advice on how to advocate for science beyond the march. Here I share some of their strategies for interacting with Congress, the media, and the public.


CONGRESS

Despite what many people think, citizens can influence Congress. In fact, a survey of those in positions of authority within Congressional offices reported that when their representative has not already arrived at a firm decision on an issue, contact from a constituent is about five times more persuasive than from a lobbyist.

Being influential, however, is about more than just being right. Congressional offices receive roughly 250 requests per day, so there are a few things you can do to stand out in an office that is essentially a triage unit.

  • Ask for something concrete your representative can realistically deliver on.
  • Explain why it is urgent.
  • Make your pitch concise (< 10 minutes) and develop a one-page handout to leave after the meeting.
  • Keep politics out of it!
  • Be engaging! Tell a real story, preferably about someone who has one foot in your world, and one foot in your representative’s.

While your initial contacts with an office may be met with no response, be persistent. You can get that meeting!

MEDIA

Scientists are considered the most trustworthy spokespersons for science. But communicating effectively with the media requires that you do your homework and know your audience (e.g. business, technical, students).

You will want to have a well-honed, practiced elevator pitch. It should succinctly lay out the research problem, why it matters, and what the take home message is (i.e. what you can say that will lead to a longer conversation). You can always bridge back to it if you get questions you are not ready for, or if the interview otherwise is not going smoothly. Ask the reporter how they plan to frame the article. Use that as an opportunity to correct any inaccuracies.

It’s advantageous to build personal relationships with journalists. Inviting them to visit your laboratory, sending them relevant background information, connecting on social media, and just generally being cordial can help you become a trusted and go-to source.

PUBLIC

Perhaps the most important question to ask yourself when communicating science to the public is, “Why am I doing this?” Perhaps it is to increase interest in science, or to share knowledge. Maybe you want to inspire the next generation to enter the discipline, or increase trust between scientists and the public.

Once you are clear about your purpose, abide by these tenets:

  • Don’t “dumb down” your science or treat your audience like idiots. Disdain is an ineffective communication technique.
  • Ditch the jargon. For example, the public has a different understanding of the phrase “positive feedback” than scientists do. Instead use something more clearly understood, like “vicious cycle.”
  • Create a dialogue so that you know where your audience is at. Let them know they are being heard.
  • Reverse the order of a scientific talk. Start with the conclusions, explain why the issue matters, then finish with the background details.

IN CONCLUSION

Be enthusiastic! Put your own face on science and demonstrate what keeps you motivated. Offer solutions, and sidestep landmines (e.g. focus on clean energy with someone who thinks climate change is a hoax).

Doing all of this on your own can be daunting and time consuming. Know the resources to make your life easier. Contact your university, institute, or relevant scientific society to collect their outreach materials. Find groups in your local community that you can partner with, like those who are already gathering an audience and where you might be permitted to speak.

There are many other available resources. Research!America holds science communication workshops that train people to better communicate medical research. Spectrum Science Communications helps “develop unique stories that create game-changing conversations to influence audiences and differentiate your brand.” AAAS is launching an advocacy toolkit, and many disciplinary organizations, like the Society for Neuroscience and American Physical Society have their own resources.

Climate, Energy, Politics

Jerry Brown Spits Hot Fire at Meeting of the American Geophysical Union

Mike Specian – January 5, 2017

California governor Jerry Brown was a guest speaker at the American Geophysical Union’s Fall Meeting in San Francisco on December 14, 2016.  A strong supporter and defender of science, Jerry Brown gave an impassioned speech regarding how California was going to stand up to the threats against science posed by the Trump administration.  The governor’s spirit should serve as inspiration to scientists everywhere.

Here are some notable quotes from the address:

Often when you’re moving along at a tepid pace, you’re not going to get there.  When someone [read: Trump] comes along and says, ‘Let’s blow it all up!’ sometimes it wakes us up.  Some people need a heart attack to stop smoking.  Well maybe we just got a heart attack!

 

In California we have the scientists; we have the lawyers, and we’re ready to defend.

 

If Trump turns off the satellites, California will launch its own damn satellites.  We’re going to collect that data!

 

If they start deleting [climate] databases, we’ve got a lot of databases in California; we can take a few more.

 

Our new Secretary of Energy would come to California and say, ‘Come to Texas because we have all the jobs in Texas.’  Well Rick, I’ve got some news for you.  California’s growing a hell of a lot faster than Texas.  And we’ve got more sun than you have oil!  And we’re going to use it!
Astrophysics, Climate, Energy, Politics

What a Trump Presidency Means for Science

Mike Specian – November 15, 2016

Donald Trump’s election has worried many Americans for a variety of reasons. One of those reasons – and one that was largely ignored during the campaign – is its impact on science. Given Trump’s lack of firm policy proposals and occasionally contradictory statements, there is much uncertainty in this regard. For that reason, I want to delve into what we can expect from the new Republican establishment in three key areas – science funding, climate change, and the role of science in government.

In all likelihood, the amount that the U.S. spends funding scientific research will be tightly linked to our total discretionary spending (i.e. non-military, non-entitlement).  Trump has promised to dramatically increase military spending, keep entitlements fixed, and lower taxes without increasing the deficit.  Discretionary spending would have to be cut under that scenario. While a budget for the current fiscal year (FY 2016-17) was supposed to be passed by October 1, Congress didn’t get it done in time. When this happens, Congress passes a continuing resolution (CR) that extends funding at the previous year’s levels.

That puts us in a position where one of two things is likely to happen. Either the current Congress can attempt to complete its own budget by the end of the year or, if it better serves their priorities, the Republicans can decide to pass another CR and wait to start fresh in 2017.

A continuing resolution may or may not be good news for scientists. The current proposed budget contains funding increases for some scientific agencies that could be lost if it goes unpassed. On the other hand, waiting until next year introduces the risk of significant spending cuts. Some of that money would probably be returned to the states, and could be redistributed to scientists through different channels, though that is far from guaranteed. Either way, scientific grants typically last for three to five years, so expect any funding changes to take years to work their way through the system.

It is important to distinguish between science that is nonideological, like health research, and that which has become ideological, like climate change. On the latter issue, Donald Trump has famously called climate change a “hoax” invented by the Chinese to reduce American competitiveness, a statement that ignores the substantial progress China is making in reducing its own emissions.

Trump has also expressed a desire to increase the use of fossil fuels (including “clean coal”) and to pull the U.S. out of the Paris Climate Agreement. While we are bound to this international treaty for at least the next four years, the President could opt to ignore its non-binding emissions targets. Failing to meet our commitments would diminish America’s moral authority and could disincentivize other nations, like India, from meeting their own targets.

America’s emissions pledges were based on a number of Obama-driven policies, like the Clean Power Plan (CPP), which directed the Environmental Protection Agency (EPA) to set limits on greenhouse gas emissions from power plants.  The CPP will almost certainly be killed (expect legal challenges), but removing the federal requirement will not impede states from proceeding on their own, which many are.  Furthermore, a Trump administration will be largely powerless to undo the economic forces that are leading to coal’s decline, chiefly the low price of natural gas.

Trump has expressed a desire to eliminate the EPA, but the agency will be difficult to do away with altogether, as this requires congressional approval and will be met by extremely strong political resistance.  Heading the agency with noted climate denier Myron Ebell, as has been rumored, will not help matters, though.  Ebell has called for the Senate to prohibit funding for the Paris agreement and the United Nations Framework Convention on Climate Change (UNFCCC).

However, under the 1970 Clean Air Act, as interpreted by the Supreme Court in Massachusetts v. EPA (2007), the federal government is obligated to regulate emissions of carbon dioxide into the atmosphere. The Republicans may choose to defund the agency’s regulation efforts, an action that will almost certainly meet legal resistance from environmental groups and large swaths of the general public. While the Republicans will not be able to ignore the scientific evidence and mounting public pressure forever, any delay in implementation would be especially damaging given how far behind the curve we already are in our mitigation efforts.

Given Trump’s strong pro-fossil fuel statements, it’s possible that the Keystone XL pipeline will be approved by the U.S. State Department.  Financial support for federally funded renewable energy technologies is also at risk.  The Alliance of Automobile Manufacturers has already asked Trump’s transition team to roll back the fuel-efficiency standard requiring cars and light-duty trucks to average 54.5 miles per gallon by 2025.

A more general question is what role science will take within a Trump administration. President Obama named his chief science advisor, John Holdren, before even taking office, signaling the position’s importance to his administration. Trump’s transition has been far less organized, and he has given little indication of who his science advisor will be or what role they will serve. Even a qualified appointee could be effectively neutered if the Office of Science and Technology Policy (the office they would head) were disempowered, or if they were unable to penetrate Trump’s inner circle.  The position requires Senate confirmation, so it could potentially go unfilled for some time.

Leaving it unfilled would clearly be a mistake, as the next administration must be ready for disasters like Deepwater Horizon or viral outbreaks, which demand a scientifically literate response. It is unclear whether President Trump would prioritize the best scientific evidence over political considerations. The new administration will also have to decide whether the U.S. will remain an active participant in international scientific enterprises like the International Thermonuclear Experimental Reactor (ITER) and whether researchers will be able to move freely across borders. Trump’s tax proposals will reveal whether he intends to incentivize private investment in basic research.

Executive agencies like the EPA and the National Oceanic and Atmospheric Administration (NOAA) are staffed by career civil servants, many of whom are institutionally difficult to fire, a protection designed to insulate them from political transitions.  However, Trump has suggested downsizing the federal workforce by instituting a hiring freeze, reducing job security, and cutting agency funding.

Even though Trump has expressed an interest in cutting the Department of Education, STEM education should largely be safe, especially since only about 10% of education funding comes from the federal government. Even Republicans realize that a highly educated workforce is a prerequisite for our international competitiveness.

Historically, science has been one of the few bipartisan issues. I suspect this will largely continue at the budgetary level, though the priorities may shift. I have reason to worry about federal climate mitigation efforts, but wonder whether Trump’s lack of a fully competent transition team might lead some lesser-known scientific programs to experience a kind of benign neglect. Either way, we must remain vigilant to ensure science is being represented as it should be.

 


Supercharge Your Internet Research with These Essential Tips

Mike Specian – November 9, 2015

Several years ago I found myself in a room with people on the forefront of the climate movement. Among their ranks were journalists, advocates, and members of nonprofit organizations. These science communicators had gathered to address an issue each of them had been grappling with – how do I find all of the information I need and communicate it to the people who need to hear it?

The questions seemed so fundamental that I had assumed everyone in attendance already knew the answers. I didn’t, of course, because I was the outsider. As an astrophysicist, research for me is relatively straightforward. A limited set of journals covers our field, and a convenient web interface, NASA’s Astrophysics Data System (ADS), searches across their articles. (Friends in other fields have sung the praises of similar programs like EndNote and Mendeley.) ADS not only links users to all the references in an article’s bibliography, but also reports which papers ended up citing that article. Smart engines can even recommend other papers to read based on your selections.

I have found tracking down information online in the realm of climate/energy policy to be more difficult. There are many more organizations doing independent research or running their own initiatives. Think tanks, NGOs, and government agencies are more likely to publish and promote on their own websites than through peer-reviewed journals. The impacts of climate change are so vast that they cut across traditional academic disciplines. They influence weather, oceans, atmospheres, ecosystems, human health, urban development, energy systems, breakthrough technologies, and many more.

When information is so widely dispersed, and we lack smart engines to find them automatically for us, what should our information collection strategy be? I don’t profess to have the “right answer” to this problem, should one even exist. But I’ve spent enough time gathering suggestions from others and trying them out for myself that I felt compelled to report some of the strategies and sources that have worked for me.

Before I begin, I want to comment that you can’t put everything together overnight. I’ve found that so much of the process is just keeping your ear to the ground. When an article I’m reading references an organization with which I’m unfamiliar, I jot it down. I visit their website, make a note about their mission and, if they have them, subscribe to their newsletter and Twitter feeds. I use Twitter lists to tag the feeds and keep them organized.

A great first source for content is Google, which offers among the best suite of tools for aggregating real-time news. Through Google News, you can personalize your news feed to return only the topics and regions you are interested in. The service allows you to specify whether you want content rarely, occasionally, sometimes, often, or always. Google Alerts goes a step further and contacts you when new information becomes available. Many news outlets offer the same capability.

If you are having difficulty deciding what’s important in the moment, the very cool newsmap may be the tool for you. Powered by Google’s search engine, newsmap visualizes the news by separating it into color-coded categories like World, National, Business, Technology, Sports, Entertainment, and Health. The color saturation reflects how old the story is, while the size shows how much it is being reported online. As with Google, you can filter by country and news source. It’s a handy way to ascertain what’s hot right now.

Over time, or perhaps through a mentor, you may discover that your field has its own news/reference engines. Lawyers gather their research through the library database LexisNexis. Climate and energy folks have the Global Reference on the Environment, Energy, and Natural Resources (GREENR). Environment & Energy Publishing reports all the top developments. The news and analysis website Responding to Climate Change (RTCC) provides the latest news regarding low carbon developments.

Another great way to be exposed to new content is through Flipboard and Zite, which it recently acquired. After signing up, Flipboard presents you with an absurd number of topics to choose from. They range from the conventional (e.g. religion, technology, art) to the more specific (e.g. industrial design, startups, social justice). You select the topics that interest you and Flipboard scours the web to produce a curated magazine readable on most devices. You can also stumble upon new content using, well, StumbleUpon. The idea is the same, but rather than curating material, it randomly deposits you at relevant webpages until you press a button to “stumble” to the next one.  I have found a lot of really excellent content through this service.

Because the combined readership of an article or report is likely to possess more cumulative knowledge than the authors themselves, one should never discount the value of user comments. Sites like the New York Times and Ars Technica have great comment engines where user contributions can be elevated to “reader’s picks” or “editor’s picks”. It’s a great way to sample the wisdom of the masses and be exposed to a much broader perspective.

It literally took me years to assemble the repository of references I now possess. In the world of climate and energy policy, I found that information typically arrives in one of three forms – organizational reports, raw or lightly processed data, and independent projects.

Organizational reports are usually published by issue-focused research groups. For climate and energy, there are way more than I could name here. These include the National Academy of Sciences, the United States Global Change Research Program, the Union of Concerned Scientists, the Information Technology and Innovation Foundation, Brookings, Energy Innovation, and many more.

Two of my personal favorites are the George Mason University Center for Climate Change Communication (4C) and the Yale Project on Climate Change Communication.  These academic centers were created to conduct unbiased social science research on how people engage with climate change. They discovered that people are more concerned about “global warming” than “climate change.”  They reported what weathercasters think about climate change and its impact on weather, and questioned whether the level of scientific consensus on climate change ought to be communicated numerically or non-numerically.

The second form information arrives in is raw or processed datasets. Government agencies like NASA and the National Oceanic and Atmospheric Administration (NOAA) are great resources here, as they have tons of images, datasets, and visualization tools that let you tell your own story from primary sources. The U.S. Energy Information Administration (EIA) and International Energy Agency (IEA) also offer tons of data to play around with.

Some groups are content to curate data in very specific ways. The Database of State Incentives for Renewables and Efficiency (DSIRE) categorizes state policies that promote renewable energy as either financial incentives or rules and regulations. Frack Track provides a self-described “geospatial policy tool” that analyzes and visualizes Pennsylvania’s new wave of gas development on the Marcellus shale. Wells, permitted sites, and locations of violations are provided on a map.

The third form is independent projects, a term that I’m admittedly using as something of a catchall. These include initiatives that aim to tell the story of climate change in unique ways. For example, for their project Atlantic Rising, three friends set out to travel the 1-meter-above-sea-level contour line to see what life would be like in a flooded world. They interacted with thousands of people in 22 countries, gathering photos, film, and writing as they documented the changing lives of those along the rim.

Photographer John Weller believes the best way to protect the environment is by reminding people of nature’s visceral beauty. He spent a decade traveling to the rough waters of the Ross Sea, probably the last undamaged ocean ecosystem left on Earth. His stunning photographs of the region’s living creatures, both above and below the water, have been cataloged in the book The Last Ocean.

Finally, it is sometimes most useful to just speak to people personally. While conferences can be a great place to do this, these environments can be intimidating for newcomers to a field. There are some tricks you can employ to make this process go more smoothly, but I will reserve them for a future post.

Of course, simply having information is not enough. You must synthesize and deliver it to your audience in an effective way. This raises a whole new set of challenges that I will get into in my next post.

 

Featured image: “tech worker” by Wrote, used under CC BY-NC 2.0 / bottom of image has been cropped from original


How Big Data is Transforming Science

Mike Specian – October 18, 2015

In the last 15 years, science has experienced a revolution. The emergence of sophisticated sensor networks, digital imagery, Internet search and social media posts, and the fact that pretty much everyone is walking around with a smartphone in their pocket has enabled data collection on unprecedented scales. New supercomputers with petabytes of storage, gigabytes of memory, tens of thousands of processors, and the ability to transfer data over high speed networks permit scientists to understand that data like never before.

Research conducted under this new Big Data paradigm (aka eScience) falls into two categories – simulation and correlation. In simulations, scientists assume a model for how a system operates. By perturbing the model’s parameters and initial conditions, it becomes possible to predict outcomes under a variety of conditions. This technique has been used to study climate models, turbulent flows, nuclear science, and much more.

The second approach – correlation – involves gathering massive amounts of real data from a system, then studying it to discover hidden relationships (i.e. correlations) between measured values. One example would be studying how a combination of factors like drought, temperature, per capita GDP, cell phone usage, local violence, and food prices affects the migratory behavior of human populations.
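To make the correlation paradigm concrete, here is a minimal sketch of screening candidate factors against an outcome. The factor names and all numbers are invented for illustration, not real migration data:

```python
import numpy as np

# Synthetic illustration: which measured variables correlate with a
# population's migration rate? Here migration is driven mostly by drought.
rng = np.random.default_rng(0)
n = 1000
drought = rng.normal(size=n)
food_price = rng.normal(size=n)
temperature = rng.normal(size=n)
migration = 0.8 * drought + 0.1 * rng.normal(size=n)

# Screen each candidate factor against the outcome.
factors = {"drought": drought, "food_price": food_price, "temperature": temperature}
for name, x in factors.items():
    r = np.corrcoef(x, migration)[0, 1]
    print(f"{name}: r = {r:+.2f}")
```

Real eScience pipelines run this kind of screen over thousands of variables with far subtler statistics, but the core move – compute associations, then investigate the strong ones – is the same.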

At Johns Hopkins University (JHU) I work within a research collective known as the Institute for Data Intensive Engineering and Science (IDIES).  Our group specializes in using Big Data to solve problems in engineering and the physical and biological sciences. I attended the IDIES annual symposium on October 16, 2015, and heard presentations from researchers across a range of fields. In this article, I share some of their cutting-edge research.

 

HEALTH

The United States spends a staggering $3.1 trillion in health care costs per year, or about 17% of GDP. Yet approximately 30% of that amount is wasted on unnecessary tests and diagnostic costs. Scientists are currently using Big Data to find new solutions that will maximize health returns while minimizing expense.

The costs of health care are more than just financial. They also include staff time and wait periods to process test results, often in environments where every minute matters. Dr. Daniel Robinson of JHU’s Department of Applied Mathematics & Statistics is working on processing vast quantities of hospital data through novel cost-reduction models in order to ultimately suggest a set of best practices.

On a more personal level, regular medical check-ups can be time consuming, expensive, and for some patients physically impossible. Without regular monitoring, it is difficult to detect warning signs of potentially fatal diseases. For example, Dr. Robinson has studied septic shock, a critical complication of sepsis that is the 13th leading cause of death in the United States and the #1 cause within intensive care units. A better understanding of how symptoms like altered speech, elevated pain levels, and tiredness link to the risk of septic shock could save many lives.

Realizing this potential has two components. The first is data acquisition. New wearable devices like the Apple Watch, Fitbit, BodyGuardian, wearable textiles, and many others in development will enable real-time monitoring of a person’s vital statistics. These include heart rate, circadian rhythms, steps taken per day, energy expenditure, light exposure, vocal tone, and many more. These devices can also issue app-based surveys on a regular basis to check in on one’s condition.

Second, once scientists are able to determine which health statistics are indicative of which conditions, these monitors can suggest an appropriate course of action. This kind of individualized health care has been referred to as “precision medicine.” President Obama even promoted it in his 2015 State of the Union Address, and earned a bipartisan ovation in the process. A similar system is already working in Denmark where data culled from their electronic health network is helping predict when a person’s condition is about to worsen.

Dr. Jung Hee Seo (JHU – Mechanical Engineering) is using Big Data to predict when somebody is about to suffer an aneurysm. Because of the vast variety of aneurysm classifications, large data sets are critical for robust predictions. Dr. Seo intends to use his results to build an automated aneurysm hemodynamics simulation and risk data hub. Dr. Hong Kai Ji (JHU – Biostatistics) is doing similar research to predict genome-wide regulatory element activities.

 

MATERIALS SCIENCE

The development of new materials is critical to the advancement of technology. Yet one might be surprised to learn just how little we know about our materials. For example, of the 50,000 to 70,000 known inorganic compounds, we have elastic constants for only about 200, dielectric constants for 300-400, and superconductivity properties for about 1,000.

This lack of knowledge almost guarantees that there are better materials out there for numerous applications, e.g. a compound that would help batteries be less corrosive while having higher energy densities. In the past, we’ve lost years simply because we didn’t know what our materials were capable of. For example, lithium iron phosphate was first synthesized in 1977, but we only learned it was useful in cathodes in 1997. Magnesium diboride was synthesized in 1952, but was only recognized as a superconductor in 2001.

Dr. Kristin Persson (UC Berkeley) and her team have been using Big Data to solve this problem in a new way. They create quantum mechanical models of a material’s structure, then probe its properties using computationally expensive simulations on supercomputers. Their work has resulted in The Materials Project.  Through an online interface, researchers now have unprecedented access to the properties of tens of thousands of materials. They are also provided open analysis tools that can inspire the design of novel materials.

 

CLIMATE

Another area where Big Data is playing a large role is in climate prediction. The challenge is using a combination of data points to generate forecasts for weather data across the world. For example, by measuring properties like temperature, wind speed, and humidity across the planet as a function of time, can we predict the weather in, say, Jordan?

Answering this question can be done either by using preconstructed models of climate behavior or by using statistical regression techniques. Dr. Ben Zaitchik (JHU – Earth & Planetary Sciences) and his team have attempted to answer that question by developing a web platform that allows the user to select both climate predictors and a statistical learning method (e.g. artificial neural networks, random forests, etc.) to generate a climate forecast. The application, which is fed by a massive spatial and temporal climate database, is slated to be released to the public in December.
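As a toy illustration of the regression route, the sketch below uses ordinary least squares as a stand-in for the fancier statistical learners (neural networks, random forests) mentioned above; all predictors, weights, and numbers are synthetic:

```python
import numpy as np

# Toy sketch: forecast a local variable (say, temperature in Jordan) from
# global predictors, using a linear model fit by least squares.
rng = np.random.default_rng(1)
n = 500
predictors = rng.normal(size=(n, 3))   # e.g. global temperature, wind, humidity indices
true_w = np.array([1.5, -0.7, 0.3])    # invented "true" influence of each predictor
local_temp = predictors @ true_w + 0.1 * rng.normal(size=n)

# Train on the first 400 samples, forecast the held-out 100.
X_train, y_train = predictors[:400], local_temp[:400]
X_test, y_test = predictors[400:], local_temp[400:]
w, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)
forecast = X_test @ w
rmse = np.sqrt(np.mean((forecast - y_test) ** 2))
print(f"recovered weights: {w.round(2)}, out-of-sample RMSE: {rmse:.3f}")
```

A real forecasting platform would swap in richer learners and vastly larger spatio-temporal predictor sets, but the train-then-forecast structure is the same.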

Because local climate is driven by global factors, simulations at high resolution with numerous climate properties for both oceans and atmospheres can be absolutely gigantic. Such simulations are especially important because the cost of anchoring sensors to collect real ocean data can exceed tens of thousands of dollars per location.

 

URBAN HOUSING

Housing vacancy lies at the heart of Baltimore City’s problems. JHU assistant professor Tamas Budavári (Applied Mathematics & Statistics) has teamed up with the city to better understand the causes of the vacancy phenomenon. By utilizing over a hundred publicly available datasets, they have developed an amazing system of “blacklight maps” that allow users to visually inspect all aspects of the problem. By incorporating information like water, gas, and electricity consumption, postal records, parking violations, crime reports, and cell phone usage (are calls being made at 2pm or 2am?) we can begin to learn which factors correlate with vacancy, then take cost effective actions to alleviate the problem.
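As a toy sketch of the data-fusion idea behind these maps (all parcel IDs, field names, and thresholds below are invented; the real system draws on over a hundred datasets):

```python
# Hypothetical per-parcel records from three independent city datasets.
water_use = {"P1": 0.0, "P2": 12.3, "P3": 0.1, "P4": 15.0}   # kGal/month
mail_delivered = {"P1": False, "P2": True, "P3": False, "P4": True}
vacant = {"P1": True, "P2": False, "P3": True, "P4": False}  # ground truth

# Fuse the datasets on the shared parcel ID.
fused = {
    pid: {"water": water_use[pid], "mail": mail_delivered[pid], "vacant": vacant[pid]}
    for pid in vacant
}

# A parcel with near-zero water use and no mail delivery is a strong
# vacancy signal; flag those parcels for inspection.
def flag(rec):
    return rec["water"] < 1.0 and not rec["mail"]

flags = {pid: flag(rec) for pid, rec in fused.items()}
print(flags)
```

A real analysis would replace the hand-set threshold with statistical models fit against ground-truth vacancy records, which is exactly where the hundred-plus datasets earn their keep.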

 

WHAT’S NEXT?

As Big Data proliferates, the potential for collaborative science increases in extraordinary ways. To this end, agencies like the National Institutes of Health (NIH) are pushing for data to become just as large a part of the citation network as journal articles. Their new initiative, Big Data to Knowledge (BD2K), is designed to enable biomedical research to be treated as a data-intensive digital research enterprise.  If data from different research teams can be integrated, indexed, and standardized, it offers the opportunity for the entire research enterprise to become more efficient and less expensive, ultimately creating opportunities for more scientists to launch research initiatives.

My personal research uses Big Data to solve a problem caused by Big Data. In a world in which researchers have more data at their fingertips than ever before, the uncertainty caused by small sample sizes has decreased.  As this so-called statistical noise drops, the dominant source of error becomes systematic noise. Like a scale that is improperly calibrated, systematic noise inhibits scientists from obtaining results that are both precise and accurate, regardless of how many measurements are taken.

In my dissertation, I developed a method to minimize noise in large data sets provided we have some knowledge about the distributions from which the signal and noise were drawn. By understanding the signal and noise correlations between different points in space, we can draw statistical conclusions about the most likely value of the signal given the data. The more correlations (i.e. points) that are used, the better our answer will be. However, large numbers of points require powerful computational resources. To get my answers, I needed to parallelize my operations over multiple processors in an environment with massive amounts (e.g. ~ 1TB) of memory.
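This kind of covariance-based estimation is, in essence, a classic Wiener filter. Below is a minimal numpy sketch under the assumption of Gaussian signal and noise with known covariances; the grid size, length scale, and noise level are all invented for illustration:

```python
import numpy as np

# Given data d = s + n, with signal covariance S and noise covariance N
# assumed known, the minimum-variance estimate is  s_hat = S (S + N)^{-1} d.
rng = np.random.default_rng(2)
npts = 200
x = np.arange(npts)

# Assumed covariances: a smooth, spatially correlated signal; white noise.
# (The tiny diagonal jitter keeps S numerically positive-definite.)
S = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 10.0) ** 2) + 1e-8 * np.eye(npts)
N = 0.5 * np.eye(npts)

# One realization of signal-plus-noise data.
signal = rng.multivariate_normal(np.zeros(npts), S)
data = signal + rng.multivariate_normal(np.zeros(npts), N)

# Wiener estimate of the signal given the data.
s_hat = S @ np.linalg.solve(S + N, data)

err_raw = np.sqrt(np.mean((data - signal) ** 2))        # RMSE of raw data
err_filtered = np.sqrt(np.mean((s_hat - signal) ** 2))  # RMSE after filtering
print(f"raw RMSE {err_raw:.3f} -> filtered RMSE {err_filtered:.3f}")
```

At 200 points this runs instantly; the memory and parallelization demands mentioned above arise because the covariance matrices in real problems are orders of magnitude larger.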

Fortunately, our ability to process Big Data has recently taken a big step forward. Thanks to a $30 million grant from the state of Maryland, a new system called the Maryland Advanced Research Computing Center (MARCC) has just come online. This joint venture between JHU and the University of Maryland at College Park has created a collaborative research center that allows users to remotely access over 19,000 processors, 50 1TB RAM nodes with 48 cores, and 17 petabytes of storage capacity. By hosting the system under one roof, users share savings in facility costs and management, and work within a standardized environment. Turnaround time for researchers accustomed to smaller clusters will be drastically reduced. Scientists also have the option of colocating their own computing systems within the facility to reduce network transmission costs.

The era of Big Data in science, which started with the Sloan Digital Sky Survey in 2000, is now in full force. These are exciting times, and I cannot wait to see the fruits this new paradigm will bear for all of us.

 

Featured image: “server rack zoomed in” by CWCS Managed Hosting, used under CC BY 2.0 / image of server has been slightly windowed, “big data” words added


Debunking the Notion That Climate Scientists Are Just in it for the Money

Mike Specian – September 14, 2015

The principle of American democracy is rooted in the “marketplace of ideas,” the notion that public policies are best developed through honest and open deliberation over a wide variety of ideas. But the marketplace has become strained of late. Our national challenges have grown more complex and the voices opining on them more numerous. From health care to energy policy to net neutrality, resolving modern problems requires more than an application of philosophy – it demands scientific literacy and an understanding of our national scientific apparatus.

Unfortunately, instead of facilitating discourse there are many who are content to muddy the waters. One of the worst offenders is conservative radio talk show host Rush Limbaugh. During his June 22, 2011 edition of The Rush Limbaugh Show he spoke once again on one of his “pet peeve issues,” climate change. Limbaugh, who has long rejected the consensus scientific conclusion that Earth’s climate is changing and that human beings are responsible, was offering a new explanation for climate scientists’ behavior.

“They’ve been paid,” Limbaugh argued. “Their entire lifestyles, their standard of living depends on their grants that they get to conduct the studies, and they only get the money if they come up with the right result.”

One might be willing to dismiss such an inflammatory statement as isolated bloviation from one of media’s biggest loudmouths, if only it were an isolated incident. It is far from that. Similar statements have been made by authors, pundits, politicians, and even a handful of disgruntled scientists. In a speech to New Hampshire businessmen, former Texas governor and Republican presidential candidate Rick Perry echoed Limbaugh’s remarks referencing “a substantial number of scientists who have manipulated data so that they will have dollars rolling in to their projects.”

Statements such as these are not only slanderous, they are dangerous. Climate change is one of the greatest global challenges of our generation. It promises to deliver a warmer climate, droughts, floods, food and water scarcity, rising sea levels, and the death of 25-50% of Earth’s species (just to name a few) if not properly mitigated.

It is for these reasons that the profoundly misleading assaults on scientists’ basic integrity are so worrisome. The need to restore public faith in our scientific institutions warrants a substantive clarification about both the roles scientists play in society and the actual manner in which their research is funded.

In general, there are two classes of scientist – public and private. Public climate scientists are employed by government institutions like NASA and the National Oceanic and Atmospheric Administration (NOAA). NASA’s premier climatologist, Dr. James Hansen, explains how public scientists are compensated: “Our salaries do not depend on how much research the government funds. Government scientists get paid for working 40 hours a week, regardless of how long they work.”

Furthermore, to guard against politically motivated terminations, public scientists receive considerable protection from being fired. In such an environment scientists have little to fear from publishing results that cut across the grain, since neither their compensation nor their job security depends on it.

Private climate scientists, on the other hand, are often employed by universities and must actively seek their own research funding.  One common source is America’s collection of federal science agencies. There are many, but one of the most prominent is the National Science Foundation, an agency which supports about 20% of all federally funded basic research conducted in US universities.  Its funding process is typical of agencies of this kind, so it is worth examining its appropriations process in greater detail.

Scientists apply for research grants by first submitting a research proposal.  According to NSF criteria, successful proposals must demonstrate that their prospective research be of high academic quality, have high and hopefully broad significance, and preferably be transformative.  Proposals are merit-reviewed by a panel of independent experts in the field and the top submissions receive grants to continue their work.  This process is highly competitive.  Of the approximately 45,000 proposals received each year, the NSF only funds about 11,500.

One noteworthy observation is that a plausible alternative to the theory of human-driven climate change would satisfy all of these criteria.  According to the National Academy of Sciences, between 97% and 98% of climate scientists actively publishing in the field currently agree with the conclusion that global climate change is occurring and is caused by human activity. Clearly, a plausible alternative would constitute a great scientific advancement, one that would likely have ramifications beyond climate science itself.  So not only are “climate skeptics” not penalized in the grant process, but if their proposals demonstrate legitimate scientific merit they might actually receive preferential treatment.

There are other factors that weigh in a climate skeptic’s favor. First, any scientist who can debunk a scientific paradigm in favor of a better theory (as Einstein did with his general theory of relativity) will earn prestige and a likely place in the science textbooks.  This is a huge incentive to challenge the status quo.  Second, if a professor has tenure, he needn’t fear reprisal from his employer for conducting controversial research.  Third, because review panels are composed of a broad selection of experts, one can expect a representative plurality of opinions among appropriators, which mitigates consensus groupthink.  Fourth, scientists are skeptical by nature.  They assume their knowledge is incomplete and are always acting to refine it. Scientists will tell you that one of the most exciting events for them is when an experimental result completely defies theoretical expectation.  It is in these moments that new truths are often revealed.  Scientists yearn for these moments. They do not penalize the search for them.

The final point I’ll make about the public grant process is simple common sense.  It’s functionally impossible for allocators to fund only “pro-climate change” research when the results of that research are unknown until it is conducted.  And even if you suspect incoming research proposals must tacitly accept anthropogenic global climate change a priori, meta-publication data gathered by Skeptical Science, an organization dedicated to explaining peer-reviewed climate change research, reveals that approximately half of climate research papers do not explicitly endorse the consensus opinion, but rather function primarily as fact-finding missions.  Those missions in total have created the consensus opinion, but scientists did not have to assume it before receiving their funding.

The other method by which scientists obtain research support is by courting private donors and corporations who have a vested interest in the work.  For much basic research, this process of pitching for funds is a huge hassle.  As the Microsoft computer scientist and Turing Award winner Jim Gray once put it, “Sometimes you have to kiss a lot of frogs before one turns into a prince.”

Except in certain cases, the prince comes to you. Mitigating climate change requires a reorganization of large sectors of our economy. Consequently, corporations that stand to suffer financially in the transition have a strong incentive to spread disinformation themselves or to fund others willing to do so.

In such cases, the exact opposite of Limbaugh’s argument holds true. Scientists willing to research alternatives to anthropogenic climate change often receive funding because they reject the consensus opinion. In fact, an analysis of a list of 900 papers promoted by the Global Warming Policy Foundation as supporting climate skepticism found that nine of the ten most prolific authors were linked to ExxonMobil.

As Dr. Hansen argues, “Perhaps, instead of questioning the motives of scientists, you should turn around and check the interests (motives) of the people who have pushed you to become so agitated.”

Once the public understands the true manner in which climate science is funded, it will ultimately need to ask itself which is more likely: A) that 97% of all active climate scientists have independently come together to collectively pull the wool over the world’s eyes and perpetrate the greatest scientific hoax of all time for unclear motives, or B) that moneyed interests like oil and coal companies, which stand to lose profits in a world that addresses climate change, are spreading doubt and disinformation as a means to forestall action.

Given the current state of media in the United States, the condition in which we find ourselves is not altogether surprising. Thinner margins have driven many newspapers and other news outlets to lay off dedicated science reporters. In the era of the 24-hour news cycle, ratings reign supreme, and viewers are more likely to tune into conflict and controversy than a nuanced discussion of the facts. Even when climate science is given the coverage it deserves, the media will often mistake journalistic balance for “hearing all sides of an issue.” Granting climate skeptics equal air time with members of the 97% majority is akin to presenting the opinions of an Auschwitz survivor alongside someone who argues the Holocaust never happened.

Ultimately, it will fall upon scientists to lift the haze of misunderstanding that surrounds their work. They will need to be more vocal in communicating not just the science, but the process of practicing science. Only when the public gains an understanding of the scientific process will the baseless claim of Limbaugh and his sympathizers be exposed as the myth that it is.


Featured image: “Dollar Sign in Space – Illustration” by DonkeyHotey, used under CC BY 2.0 / slightly modified and black borders added to original
