Astrophysics

Is the Universe a Giant Fractal?


Fractals are objects that look the same on all scales.  I’m sure many of you have seen pictures or videos of fractals, but if you haven’t or if you would like a reminder, check out this visual representation posted on YouTube.  As a cosmologist who has studied the large scale structure of the Universe, I find the question of whether the Universe is itself a giant fractal pretty interesting.

Before we can dive deeper into this question, some background information is required.  The prevailing conclusion in cosmology is that the Universe originated in a Big Bang from which all matter and energy were set in motion.  Though it was initially very close to uniform, tiny quantum perturbations made certain sections of the Universe slightly more dense than others.  As gravity directed matter into these overdense regions, structure slowly began to form.  After billions of years this structure evolved into a massive collection of filaments and voids.  The following video from the Millennium simulation displays a model of that structure on different length scales.

As the video shows, the Universe does appear somewhat similar on all scales except the smallest.  That the Universe fails to be a fractal at small scales should be obvious.  After all, there are no galaxy-sized objects that look like glaciers, trees or chipmunks.  Therefore, if the Universe does possess fractal-like properties, they must break down at some point.  Above those scales, does the Universe look like a fractal?  If so, does that fractal go on forever?  If not, where does it cut off?  Why?  How do we know?

These are the questions I investigate in this post.  Fair warning: this is about to get pretty wonky.  Those valiant enough to proceed are encouraged to put on their math caps.


One way cosmologists quantify structure is through a statistic known as the two-point correlation function (2PCF).  The 2PCF measures the probability of finding two galaxies separated by distance r beyond what’s expected through random chance.

In three dimensions the two-point correlation function is often approximated as a power law,

(1)   \begin{equation*}  \xi(r) \propto r^{-\gamma}, \end{equation*}

where \gamma is a parameter whose value depends upon the particular distribution of galaxies. In two dimensions the 2PCF w(\theta) is a function of angle,

(2)   \begin{equation*} w(\theta) \propto \theta^{-(\gamma-1)}. \end{equation*}

Note that if we add the number of Euclidean dimensions[1] to the exponent of the 2PCF we obtain the same number, 3-\gamma. This is known as the codimension. It turns out that if you have a random process with a power law correlation function, when you project it into lower dimensions the codimension does not change.

To put more substance behind this, let’s consider the two-point galaxy correlation function in greater depth.  To compute its value at any r we populate a simulated volume with uniformly distributed[2] random points.  We count the number of pairs of points separated by each distance r and use the results to populate a so-called randoms-randoms histogram.  We do the same for the galaxies to generate a data-data histogram.  The ratio of these histograms, which is a measure of probability above and beyond what one would expect through random chance, is the 2PCF.[3]

As an example consider a three-dimensional Universe in which all the galaxies lie along a straight line.  We limit our focus to galaxies separated by a distance r by imagining a spherical shell of radius r centered on one of them.  The only data-data points would lie across the shell from each other, perhaps located at opposite poles.  The number of galaxy pairs would scale as 2\lambda\, dr where \lambda is the linear galaxy density.  The random points could lie anywhere within the spherical shell, contributing to a much greater number of pairs.  The number of these pairs would scale as 4\pi r^2 \rho\, dr where \rho is the volume density of the randoms.[4]  The correlation function would then go as

(3)   \begin{equation*} \xi=\frac{2\lambda \, dr}{4\pi r^2 \rho \, dr} \propto \frac{1}{r^2}=\left( \frac{1}{r} \right)^{\gamma=2}. \end{equation*}

By a similar argument, if all the mass in the Universe lay on a plane, then the number of data-data pairs would go as 2\pi r \sigma\,dr where \sigma is the galaxy area density.[5]  In this case the correlation function would go as

(4)   \begin{equation*} \xi=\frac{2\pi r \sigma\,dr}{4\pi r^2 \rho \, dr } \propto \frac{1}{r}= \left( \frac{1}{r} \right)^{\gamma=1}. \end{equation*}

The codimension of the linear Universe is 3-\gamma=1.  The codimension of the planar Universe is 2.
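These scaling arguments are easy to check numerically. The sketch below is a toy example of my own (the box size, point counts and bin edges are arbitrary choices, and numpy is assumed): it scatters "galaxies" along a line inside a cubic box, builds the data-data and randoms-randoms histograms described above, and recovers \gamma \approx 2 from the DD/RR ratio.

```python
import numpy as np

rng = np.random.default_rng(1)
N, L = 1000, 100.0

# "Galaxies" confined to a straight line through a cubic box of side L
galaxies = np.zeros((N, 3))
galaxies[:, 0] = rng.uniform(0, L, N)

# Uniform random points filling the full volume
randoms = rng.uniform(0, L, (N, 3))

def pair_counts(points, bins):
    """Histogram of pairwise separations (the DD or RR histogram)."""
    diff = points[:, None, :] - points[None, :, :]
    d = np.sqrt((diff ** 2).sum(axis=-1))
    d = d[np.triu_indices(len(points), k=1)]   # each pair counted once
    return np.histogram(d, bins=bins)[0]

bins = np.array([2.0, 4.0, 8.0, 16.0])   # separation bins doubling in r
dd = pair_counts(galaxies, bins)
rr = pair_counts(randoms, bins)
xi = dd / rr                              # simple DD/RR estimator

# For xi ∝ r^-gamma, each doubling bin changes xi by 2^-gamma
gammas = -np.log2(xi[1:] / xi[:-1])
print(gammas)    # both estimates should land near 2
```

With the randoms filling the volume and the galaxies confined to a line, each doubling of r cuts DD/RR by roughly a factor of four, which is exactly the r^{-2} behavior of equation 3.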

The reason this matters is that a random process (like the distribution of galaxies) with a power law correlation function has a lot in common with fractals.[6]  To see how, let’s examine the concept of dimensionality a bit more rigorously.

Imagine intersecting familiar geometric objects with a sphere and then doubling the radius of the sphere. What happens? If the object is a line, the length of the line inside the sphere will double. This means it increases by a factor of 2^{D=1}. If the object is a flat plane, the area of the plane inside the sphere will quadruple. This means it increases by a factor of 2^{D=2}. In these examples the exponent D tells you the dimensionality of the object. A line is 1-dimensional. A plane is 2-dimensional.[7]
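A quick numerical sketch (again a toy example of my own, with arbitrary point counts and radii) makes the doubling argument concrete: sample points uniformly along a line and across a plane, then count how many fall inside spheres of radius R and 2R.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Points uniformly distributed on a line and on a plane through the origin
t = rng.uniform(-10, 10, n)
line = np.column_stack([t, np.zeros(n), np.zeros(n)])
plane = np.column_stack([rng.uniform(-10, 10, (n, 2)), np.zeros(n)])

def inside(points, R):
    """Number of points inside a sphere of radius R centered on the origin."""
    return np.sum((points ** 2).sum(axis=1) < R ** 2)

R = 2.0
print(inside(line, 2 * R) / inside(line, R))    # ~2 = 2^1: a line is 1-dimensional
print(inside(plane, 2 * R) / inside(plane, R))  # ~4 = 2^2: a plane is 2-dimensional
```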

While lines and planes are relatively simple objects, the boundaries of fractals are not.  In fact, the length around a fractal shape depends upon how fine a ruler one uses.  For example, consider the images of the United Kingdom’s coastline below.  The shoreline appears jagged on all scales and can be approximated to be a fractal.  As the resolution of the ruler increases, so too does the length of the coastline.  And because fractals have infinitely dense structure, the closer you look the longer the edge gets.  For this reason the edges of pure fractals are often considered infinite in length.

[Image: the coastline of Great Britain measured at three successively finer resolutions]
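The coastline effect can be reproduced in code with the Koch curve, a textbook stand-in for a fractal shoreline. This sketch is my own illustration (numpy assumed): each refinement level corresponds to measuring with a ruler one third as long, and the measured length grows by a factor of 4/3 every time.

```python
import numpy as np

def koch(points, depth):
    """Replace each segment of a polyline with the 4-segment Koch motif."""
    if depth == 0:
        return points
    rot = np.array([[0.5, -np.sqrt(3) / 2], [np.sqrt(3) / 2, 0.5]])  # 60° rotation
    out = [points[0]]
    for a, b in zip(points[:-1], points[1:]):
        d = (b - a) / 3
        out += [a + d, a + d + rot @ d, a + 2 * d, b]
    return koch(np.array(out), depth - 1)

base = np.array([[0.0, 0.0], [1.0, 0.0]])
lengths = []
for n in range(6):
    pts = koch(base, n)
    lengths.append(np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1)))
    print(n, round(lengths[-1], 3))   # grows as (4/3)^n with no upper bound
```

The "coastline" never converges to a finite length: the finer the ruler, the longer the shore, exactly as with Britain above.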

When you intersect a fractal with a sphere and double its radius, the spatial content of the fractal doesn’t necessarily double or quadruple – it increases by a factor 2^{F_D} where F_D is known as the fractal dimension. And unlike in Euclidean geometry, the fractal dimension does not need to be an integer.[8]

It is somewhat comforting that F_D=1 for a straight line and F_D=2 for a flat plane, i.e. for simple cases the Euclidean and fractal dimensions are identical.  But if a line is somewhat curved, it will have a fractal dimension close to but greater than 1.  If a line is so tangled that it almost maps out an entire area, it will have a fractal dimension close to but less than 2. A similar logic applies to surfaces.  A slightly curved surface will have a fractal dimension somewhat larger than 2 while a surface so folded that it practically maps out the entire volume will have a fractal dimension somewhat smaller than 3.

The essential connection between these examples is that the codimension 3-\gamma and the fractal dimension F_D are actually measuring the same thing. A linear Universe has a codimension of 1 and the fractal dimension of a straight line is F_D=1. A planar Universe has a codimension of 2 and the fractal dimension of a plane is F_D=2.

This relationship is nontrivial. Dimensionality is a measure of how the spatial extent of a geometric form scales within a volume. The codimension is a measure of how objects are distributed relative to a purely random distribution. They are fundamentally different things, yet in the context of power law 2PCF they wind up being equal.

And while these are just the edge cases, this conclusion holds equally well for 1<\gamma<2.  In other words, if we know the two-point correlation function, we know the fractal structure of the Universe!

So if the Universe is indeed a fractal, what is its mass? The answer depends upon the radius R of the sphere within which we measure it. For a sphere centered on position x_0 we might use an equation like this,

(5)   \begin{equation*} m_R(x_0) = \int \rho(x) W_R(x_0-x) \, d^Nx, \end{equation*}

where \rho(x) is the density at position x, W_R(x_0-x) is a top-hat window function[9] and N is the dimensionality of the space being integrated over.[10] To find the average fractal mass within a radius R we would average m_R(x) over many positions.

Regardless of the particulars of the density function \rho(x), the mass of a fractal is proportional to the radius raised to the power of the fractal dimension, m_R \propto R^{F_D}.  The mass density of a fractal therefore scales as

(6)   \begin{equation*}  \rho\left(R \right)=\frac{m_R}{V} \propto \frac{R^{F_D}}{R^3}=R^{F_D-3}. \end{equation*}

Experiments have shown that in our Universe,

(7)   \begin{equation*}  \xi\left(r\right) \propto \left( \frac{1}{r}\right)^{\gamma=1.8}. \end{equation*}

We might naively conclude from this that the fractal dimension of all space is F_D=1.2.  This lands close to the truth but misses an important point. When F_D<3, we have F_D-3<0. It therefore follows from equation 6 that as R \rightarrow \infty, \rho \rightarrow 0. In other words, the mean density of a fractal with F_D<3 is zero.
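A concrete toy example (my own, not part of the original argument) shows this vanishing density in action. The Cantor set is a fractal with F_D = \log 2 / \log 3 \approx 0.63; counting its points within spheres of growing radius shows the "mass" growing only as R^{F_D}, so the density m_R/R^3 plummets as R grows.

```python
import numpy as np

def cantor_points(n):
    """Left endpoints of the level-n Cantor set, a fractal with F_D = log2/log3."""
    pts = np.array([0.0])
    for _ in range(n):
        pts = np.concatenate([pts / 3, pts / 3 + 2 / 3])
    return pts

pts = cantor_points(12)                  # 2^12 = 4096 points in [0, 1]
for k in range(4, 0, -1):                # radii increasing by factors of 3
    R = 3.0 ** -k
    mass = np.sum(pts < R)               # "mass" within radius R of the origin
    print(f"R={R:.4f}  mass={mass}  density={mass / R**3:.0f}")
# Each tripling of R only doubles the mass (2 = 3^0.63),
# so the density mass/R^3 falls steeply as R grows.
```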

Our Universe has a nonzero density \rho, so something doesn’t quite fit. The explanation lies in the definition of the two-point correlation function. Recall that the 2PCF quantifies the probability of finding galaxies above what’s expected through random chance. If we represent the density of the Universe as the sum of a background component \rho_{bg} and a perturbative component \rho_{fr} above and beyond that background, we have

(8)   \begin{equation*} \rho(\mathbf{x})=\rho_{bg}+\rho_{fr}(\mathbf{x}). \end{equation*}

The density of the Universe is not what exhibits fractal properties. Rather, it is the density \rho_{fr}(\mathbf{x}) atop the background that does. Because \rho_{fr}(\mathbf{x}) is a perturbation from the mean, it has an expected value of zero when averaged over all space,

(9)   \begin{equation*} \langle \rho_{fr} \rangle_{\mathbf{x}} = 0, \end{equation*}

and thus satisfies the requirement that the mean density go to zero as R \rightarrow \infty.

I close with the following conclusion – the Universe does behave like a fractal as long as its two-point correlation function follows a power law relationship. Where the 2PCF fails to be modeled by equation 1, the equality between the codimension and the fractal dimension no longer holds and the rest of the argument breaks down.[11]

 

Featured image: “Stardust Memories” by Anua22a, used under CC BY-NC-SA 2.0 / Cropped from original
Britain fractal coastline image: originals (left, middle and right) made by Avsa, mixed by Acadac, used under CC BY-SA 3.0

Notes

1. The Universe possesses 3 Euclidean, or topological dimensions.  This is another way of saying we live in a three-dimensional Universe – up/down, left/right, in/out.  We distinguish between Euclidean and fractal dimensionality since the latter can take non-integer values and more accurately describes fractals’ more complicated geometric properties.
2. In this context “uniformly distributed” means the random points must have the same distribution the observed galaxies would in the absence of large scale structure. The geometry of the survey must therefore be taken into account. No random points may be placed in locations where galaxies could not be observed. If the number of observed galaxies decreases with distance as with magnitude-limited surveys, so too must the number of randoms.
3. For more on this see Landy, S. D., & Szalay, A. S. 1993, Astrophysical Journal, 412, 64.
4. 4\pi r^2 is the surface area of a sphere.  When multiplied by the infinitesimal thickness dr it becomes the volume of a very thin spherical shell.
5. A plane (of galaxies in this instance) intersected with a spherical shell creates a circular ring.  The circumference of that ring is 2\pi r.  When multiplied by the ring’s thickness dr we get the area of the ring.
6. Though it might seem counterintuitive, the distribution of galaxies is considered to be a random process.  That is, there could be an infinite number of different Universes that each have the same 2PCF.  This is analogous to many people rolling a die a large number of times.  Each person will roll numbers 1 through 6 in a different order even though the probability of rolling each number is identical for all of them.  In fact, simulating the positions of galaxies is sometimes referred to as rolling the dice.
7. I have taken the radius of the sphere to increase by a factor of 2, but note that the argument works for any factor f, e.g. changing the radius of the sphere from R to fR scales the area of the intersecting plane by f^{D=2}.
8. The fractal dimension is also a measure of the complexity of a fractal’s boundary. There are formal definitions of F_D, but those are omitted here.
9. The top-hat window function equals 1 when x is within a distance R of x_0 and equals zero otherwise. It exists to limit the integration to the interior of the sphere.
10. For conventional three-dimensional objects N=3. When integrating over surfaces we use N=2.
11. The approximation of the 2PCF as a power law works well for intermediate length scales. At small separations (e.g. the size of galaxies) the growth of structure is governed by factors far more complicated than simple gravity: supernovae, shockwaves, tidal forces, accretion disks, etc. At large separations, parcels of matter are so distant that they have not yet had time to affect each other.
Personal

Bicycle Tragedy Hits Close to Home

On Saturday afternoon a cyclist was struck and killed immediately across the street from my house. As someone who cycles almost every day through the exact same location, this has affected me greatly. It is worth noting that my neighborhood is normally safe for joggers and cyclists. It’s rare for 15 minutes to pass without at least one passing by. We have wide streets and a newly paved, dedicated bike lane. And still this.

A few minutes ago I went across the street to the site of the crash to join about 40 other cyclists who had arrived for a vigil. The only sound to break the silence was the occasional sniffle. Then, as if to bring everything that was wrong about this situation into focus, a car traveling 20mph over the speed limit raced past our location.

A man immediately screamed, “Slow down! You are part of the problem!” This worked him up enough to continue. “Each day, every day, all of us go through this! All of us know how close we’ve all come to being hit.”

I couldn’t agree more.

If anything is to come from this tragedy (aside from the sensationalism that it was a bishop who struck a bike-maker), it should be a discussion about the relationship between cyclists and drivers on city roads. More often than not I hear drivers complain about the aggressive nature of cyclists. They ride too fast. They veer into driving lanes. They ride on streets with little to no shoulder. They don’t care about cars.

Cyclists, however, are risking their lives whenever they climb onto a bicycle. We contend with shoulders that are often too narrow and in disrepair. Even those in good condition are often littered with broken bottles, slippery pebbles and roadkill.  Cars whiz by at terrifying speeds, sometimes coming within a few inches of clipping me.  Were I to swerve at just the wrong time to avoid a hazard like a slotted sewer grate or fruit fallen from a tree, I could die.

I have had cars drive across a bike lane to make a turn without even realizing I was there. Within the last month I almost crashed into a car door because a parked driver neglected to check her mirror before opening it into a bike lane.

I can’t say whether I’m in the minority, but I cannot recall ever having a conversation about the relationship between cars and bicycles in Driver’s Ed. This seems such a shame because the two sides so frequently seem to be at odds with one another. Motorists complain that cyclists are entitled and reckless. Cyclists complain that motorists are dangerous and oblivious.

Earlier today I was speaking with a very nice woman who lives just down the road from me. As discussion turned to the accident she commented about another local road, “I just can’t believe cyclists drive up Falls Road. It’s so dangerous. They shouldn’t do that.”

I replied, “I bike Falls all the time. If you look at the signage, it’s a dedicated bike route.”

She said, “But it’s a two lane road with such a small shoulder. And there are so many turns. It’s too risky for bikes to be there.”

“Not if cars are going the speed limit. And cyclists try to avoid major roads when possible. They aren’t exactly fun to be on. But in this case, if you want to get beyond the beltway, you have maybe 4 roads total and 2 of them are so dangerous they shouldn’t even be attempted.”

I’m pretty sure this was news to her.

I will continue to ride my bike. Since I own no car, I have little other option. But every day I do so I am placing my life in the hands of drivers who may have no idea what being on a bike is like.  I can only hope that the hundreds of candles and flowers adorning a lonely brick fence on Roland Avenue this freezing cold night will start to tell that story.

Pro Wrestling

Can Anyone Stop World Wrestling Entertainment?


I was asked the following question on Quora:

Will there ever be a company that can dethrone World Wrestling Entertainment as the worldwide leader in the wrestling industry?

It’s not impossible, but it is unlikely.

WWE possesses a number of institutional advantages that effectively preclude encroachment from competitors.  It is a publicly-traded, billion-dollar company with a sizable amount of cash on hand.  It retains the ability to hire the best performers and staff.  Its revenue streams are relatively (for wrestling) diversified in that they profit from television rights fees, merchandise, movies, DVDs, books, magazines, its website, pay-per-views and the WWE Network.  It would be difficult for any company starting from scratch to surpass that.

WWE is so entrenched as the “worldwide leader in sports entertainment” that when people think of pro wrestling, they think of WWE.  This is similar to people thinking of UFC when they think of MMA.  This is not a matter of WWE being a subset of pro wrestling – for many fans they are one and the same.  The last major competitor to WWE, WCW, went out of business in 2001.  For fans 18 and younger, this means WWE is pretty much the only wrestling company they’ve ever known.  Loyalty and familiarity will make it challenging for another company to usurp them.

Changing that mindset will take a tremendous amount of time, effort and money, all of which are unlikely to materialize.  As an example of this power, realize that WWE is able to leverage the infrastructure of cities to run their events.  If they need police escorts to get their buses from one place to another, they can get it.

WWE also possesses the most comprehensive pro wrestling video library in history.  In addition to its own content, it also owns the footage from World Championship Wrestling (WCW), Extreme Championship Wrestling (ECW), the American Wrestling Association (AWA), World Class Championship Wrestling (WCCW) and others.  This allows it to leverage and monetize the entire history of the business.  No company will ever be able to match that institutional advantage.

Some might argue that WCW almost unseated WWE and another company might be able to do so again.  However, the competition from WCW was unique and unlikely to be duplicated.  WCW started as part of the old territory system in which pro wrestling was mostly local to each region.  When Ted Turner’s WCW became the primary territory for the National Wrestling Alliance (NWA) it already had a huge, well-established fan base.  It entered as an equal to the then-WWF, which offered it a unique position that no other company will ever be able to hold again.

I would further contend that the only reason WCW even came close to unseating WWE was that it attracted “fad-fans” interested in the novelty of the New World Order (nWo), i.e. a group of WWE wrestlers invading WCW.  These fans spiked viewership numbers during the late 1990’s but have not returned since.

Even if a smaller promotion like Total Nonstop Action (TNA), Ring of Honor (ROH), Evolve or Dragon Gate USA (DGUSA) is able to grow its own fanbase, history has shown these will likely remain niche products designed for a very specific wrestling-centric (as opposed to a more general entertainment-centric) audience.  Top stars from those promotions will consistently migrate to WWE for greater money and international exposure.  Fans of these smaller companies tend to also be fans of WWE, so in a sense they cannot even be considered true competitors.

The only plausible scenario in which WWE disappears from prominence is a hostile takeover or a complete buy-out.  A company like Disney, for example, could decide that pro wrestling fits into its business model and make WWE an offer too good to refuse, especially if WWE’s product begins to flounder significantly.  At this point WWE would be absorbed into a larger conglomerate whose new owner could do with it as it wishes.

However, in the near term, I would rate the chance of Vince McMahon, who is WWE’s primary stockholder, agreeing to cede control of his life’s work to someone outside of his own family as next to nil.  For his family to do so also seems unlikely given how deeply entwined their lives are with their business.

Personal

Bills Beat the Packers and I Learn How to Use Vine


The Buffalo Bills defeated the Green Bay Packers on Sunday in what many are calling the team’s biggest win in a decade.  Green Bay is considered a Super Bowl favorite.  They had been riding a 5-game winning streak and had won 9 of their last 10.  The Bills absolutely needed this game to keep hopes of ending their 14-year playoff drought alive.

They showed up in a big way.  Aaron Rodgers, considered by many to be the league’s best quarterback, was harassed by the Buffalo defense all afternoon and suffered through a career-worst day.  On the game’s penultimate series the Packers had the ball deep in Buffalo territory down 6.  They needed to score a touchdown in less than 2 minutes with no time outs.  Instead, Bills defensive end Mario Williams bumrushed Rodgers, sacked him and stripped the ball.  The referee called a safety on the field, and then this happened:

 

This historic victory also marks my very first Vine video!  Vine is designed to be used on your smartphone or tablet.  It isn’t optimized to accept video recorded in other places, like mine was (thanks Flipcam).  I learned that to import an edited video from programs like Adobe Premiere or Final Cut Pro one needs to output to 480 x 480 at 30 fps with mono sound.  The video must then be imported into your phone’s camera/video roll (I did this by emailing it to myself) from which it can then be uploaded to Vine.
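If you’d rather script the conversion than fiddle with export dialogs, a tool like ffmpeg can produce the same format. This command is only a sketch (the filenames are placeholders, I used a video editor rather than ffmpeg myself, and a non-square source would need cropping first to avoid distortion):

```shell
# Re-encode an edited clip into Vine's expected format:
# 480x480 video at 30 fps with mono audio
ffmpeg -i edited_clip.mov \
       -vf "scale=480:480" \
       -r 30 \
       -ac 1 \
       vine_ready.mp4
```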

By the way, this video was shot at The Rockwell, a bar in Fells Point, Baltimore.  It hosts the Charm City Bills Backers by exclusively airing their game on the big screen.  It’s almost always a fun time.

Astrophysics, Personal

My Thesis Research

[Image: signal with and without noise]

A lot of people ask me to describe my thesis research.  I used to give a complicated answer about using covariance matrices to perform a Karhunen-Loève transform to blah blah blah, but now I just say, “I clean cosmic data.”  Today I created a graphic that illustrates the essence of what I’m trying to do.

Notice how both the signal and noise have “structure”?  My research attempts to uncover those structures and use them to eliminate the noise.  The problem is that taking away noise also takes away signal.  So we need a way to “fill in the gaps.”

Now imagine doing this not for images, but for MASSIVE data sets…and you don’t get to know what the letters are beforehand. Solving this problem to high accuracy is a challenge.
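For the curious, here is a heavily simplified sketch of the kind of mode-based cleaning a Karhunen-Loève transform performs. This is a toy example of my own (a random low-rank “signal” plus white noise, using numpy), not the actual thesis pipeline: keeping only the leading modes of the decomposition strips most of the noise while preserving most of the signal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "sky": a low-rank signal (structured) buried in white noise (unstructured)
u = rng.normal(size=(100, 3))
v = rng.normal(size=(3, 100))
signal = u @ v                            # rank-3 structure
noisy = signal + 0.5 * rng.normal(size=signal.shape)

# KL/PCA-style cleaning: decompose and keep only the 3 leading modes
U, s, Vt = np.linalg.svd(noisy, full_matrices=False)
cleaned = (U[:, :3] * s[:3]) @ Vt[:3, :]

err_before = np.linalg.norm(noisy - signal)
err_after = np.linalg.norm(cleaned - signal)
print(err_before, err_after)   # truncating the noisy modes shrinks the error
```

The hard parts of the real problem – unknown signal structure, gaps, and enormous data volumes – are exactly what this toy glosses over.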

Pro Wrestling

Do “Smart Marks” Help or Hurt the Wrestling Business?

This post was inspired by a question that was asked on Quora.

In some respects smart marks help the business and in others they hurt it. Their best quality is passion. They love wrestling and continuously invest their time and money to support it at all levels. They often comprise a majority of the audience at indy shows, so much so that many smaller promotions might not exist without them. These shows are critical to the business since they provide a stage upon which young wrestlers can hone their craft.

Smart marks can often be taken as “a given.” Whether the business is hot or cold, they will always be there. On one hand, their constant baseline level of support prevents the floor from falling out of the pro wrestling business. On the other hand, this means they can be more easily taken advantage of, since companies like WWE know they will buy whatever is put out.

However, there are some smart marks whose behavior can detract from the quality of a show. Some become a distraction when they act as if they’re part of the event. For example, a handful with a vendetta against a performer can chant “boring” loud enough to ruin the experience for others.

They can also become jaded to the extent that nothing pleases them anymore. When the same fans attend a company’s shows over and over again, as with the TNA Impact Zone, their reactions die out because they’ve seen everything before. Once a company breaks free of them, as TNA did by leaving the Impact Zone, the shows can become more vibrant.

Other smart marks believe that good wrestling depends on how many spots wrestlers can work into a match. They think that if a wrestler can’t perform a flipping powerbomb into a twisting springboard DDT 5 seconds into the match, he’s garbage. This mentality diminishes the impact of every move and makes each one mean less. (As a counterexample, see CM Punk’s piledriver on John Cena during their #1 Contender’s match for the WWE Championship at Wrestlemania 29, on the February 25, 2013 episode of Raw.)

I once heard a story about two ROH wrestlers working a frenetic style during a show. When they came backstage they encountered George South who suggested that they slow down and make individual moves matter more. The wrestlers said they couldn’t do that because the fans would chew them up for it. In effect, the reactions of a small group of fans had directly influenced (arguably negatively) the styles of wrestlers in the nation’s third largest promotion.

So with smart marks you get a mixed bag. At their best, they are a critical component of the audience, one that supports young wrestlers and is usually willing to part with its money to keep the business alive. At their worst, they can think that what they like matters more than anything else and that any wrestler who doesn’t meet their conception of greatness isn’t worth the time of day.

Climate, Energy, Politics

FOIA – We Are Making Progress

This is the final part of a 5 part series on the government’s silence of science and the Freedom of Information Act (FOIA).  Parts 1 through 4 can and should be read first:

Part 1: The Kingston Disaster
Part 2: The Government’s Silence of Science
Part 3: Freedom of Information Act to the Rescue?
Part 4: The Obama Failure

In brief, these articles describe how scientific research gathered by the United States government is often withheld from the general public, a type of action that can quite literally put lives at risk.  The Freedom of Information Act (FOIA) was passed to allow public access to these records, but both the George W. Bush and Obama administrations have so far failed to live up to the promise of the act.

But while there have been substantial challenges with gaining access to important public information, it’s not all doom and gloom.  The fact that we actually have a Freedom of Information Act with an appeals process and judicial review is significant.  The Act continues to have strong support in the NGO community.  A FOIAonline portal has been built with the goal of eventually becoming a one-stop shop for public information.  The Obama administration has taken a strong positive step at Data.gov to “increase public access to high value, machine readable datasets generated by the Executive Branch of the Federal Government.”  This initiative has already saved on infrastructure costs.

And we have had disclosure successes.  In 2008 the United States improved the Consumer Product Safety Act and created a searchable database for consumer information.  The National Oceanic and Atmospheric Administration’s (NOAA) National Climatic Data Center and EPA have done an admirable job of reporting on historical climate variables like temperature, precipitation and drought.  The US Embassy in Beijing has made electronic reports of air quality public when the Chinese government refused to do so.  The federal ENERGY STAR program labels the energy footprint of appliances to aid consumers in making more energy efficient purchases.

Inside federal agencies, it would appear that some progress is being made.  In 2013 the Union of Concerned Scientists (UCS) released a report entitled Grading Government Transparency in which they examined the ability of scientists at federal agencies to speak freely about their work.  They found that many agencies’ media policies “have shown significant improvement since 2008.”  In particular they note that scientists can now more easily exercise their right to express personal views provided they make clear that they are not speaking for their agency.

This right was made considerably easier to exercise when on November 13, 2012, after an arduous 14-year journey, Congress unanimously passed the Whistleblower Protection Enhancement Act.  The act, for the first time, provides specific legal protection to scientists and other federal employees who expose censorship or suppression of federal research.  According to Celia Wexler of the Union of Concerned Scientists, “We hope that this law will begin a process to change the culture of federal agencies when it comes to whistleblowers. People who protect the public from unsafe drugs, tainted food, defective products, and environmental hazards should not fear for their jobs when they speak up for safety and scientific integrity.”

Since then, other steps have been taken to make it easier for the public to obtain government information.  On May 9, 2013 President Obama issued an executive order making open and machine readable data the new default for government information.  Citing examples like weather data and the Global Positioning System (GPS), the president argued that making federal data freely available “can help fuel entrepreneurship, innovation, and scientific discovery – all of which improve Americans’ lives.”

Then, on February 25, 2014 the US House of Representatives unanimously passed the FOIA Oversight and Implementation Act.  This amendment to the Freedom of Information Act would create a single, free website from which all FOIA requests could be made.  When requests are granted, federal agencies would have to release the information in an electronic and publicly accessible format.  When requests are denied, the appeals process would be streamlined.  The amendment would also require federal agencies to take greater responsibility for their FOIA obligations.

As we see, the system can work.  But there will always be disagreements between the public and federal agencies regarding which information should be disclosed through FOIA and which should be withheld for security reasons.  When public actors feel their requests have been rejected unjustly, they can pursue the matter in court.

Absent that, there are other options at their disposal to extract greater value out of the information that is public.  Private technology companies can offer tools for the sharing and analysis of data.  Librarians can play a more prominent role in gathering and organizing documents.

When the information being disseminated is incorrect, knowledgeable scientists should take action.  They can start issue blogs and connect with members of the media.  Local groups like city councils rarely hear from scientists, so researchers can have an outsized impact on regional issues.  As members of one of the most respected professions, scientists would do well to build relationships with congressional representatives or their science staffers.  Failure to act means allowing dissembling voices to fill the vacuum.

With respect to government disclosure, as with most things, the situation is neither entirely good nor bad.  But it is hard to deny that at times we Americans live in a perverse, ironic ecosystem – one in which taxpayers fund government research designed to inform and protect, only to have that same government deny us the results and claim it’s for our protection.  We must continue to hold our government accountable, push for transparency where appropriate and never yield to private interests who would use our ignorance against us.

Energy, Politics

The Obama Failure

This is Part 4 of a 5-part series on the government’s silence of science and the Freedom of Information Act (FOIA). Parts 1, 2 and 3 can and should be read first:

Part 1: The Kingston Disaster
Part 2: The Government’s Silence of Science
Part 3: Freedom of Information Act to the Rescue?

In brief, these articles describe how scientific research gathered by the United States government is often withheld from the general public, a type of action that can quite literally put lives at risk. The Freedom of Information Act (FOIA) was passed to allow public access to these records, but we discovered that for a number of reasons, the George W. Bush administration was overly eager to deny such requests.

Many of those critical of the Bush administration’s handling of FOIA requests hoped that the situation would improve under the Obama administration. In fact, one of the new President’s first actions in office was to issue the following instruction, essentially reversing the Ashcroft Memo:

All agencies should adopt a presumption in favor of disclosure, in order to renew their commitment to the principles embodied in FOIA, and to usher in a new era of open Government.

This memo was part of Obama’s Open Government Initiative, “committed to creating an unprecedented level of openness in Government.” Yet surprisingly, government transparency barely improved from the Bush administration and, according to some journalists, got worse: a full 30% of journalists surveyed gave the administration a grade of poor to very poor. Recently, OpenTheGovernment.org released an assessment saying that many sophisticated users of FOIA remain tremendously disappointed with the law’s implementation.

There are between 600,000 and 650,000 FOIA requests per year. While less busy agencies can respond within a few weeks, larger agencies like the Department of Defense are flooded with more requests than they have the resources to meet. In these cases, FOIA compliance has effectively become an unfunded mandate. The situation is worsened by the fact that many agency employees are overworked, undertrained, and generally unclear about their obligations under FOIA.

A Bloomberg News investigation last year set out to test the quality of current FOIA compliance. A team of reporters submitted the same FOIA request, for the travel expenses of top agency officials, to 57 agencies. Nineteen of 20 cabinet-level agencies did not comply within the mandated 20-day window. Even “well past that legal deadline,” about half of the agencies had still not fulfilled the request.

In some cases information is available, but in a form that limits its usefulness. Despite Obama’s promise of an online data repository, many information requests still need to be made in person. A significant number of records remain incomplete or redacted. Often data is locked in formats like PDFs rather than convenient, machine-readable tables. Nonuniformity abounds: at the SEC there are different record systems in every department.

There is no uniform method to submit a FOIA request. Some agencies accept submissions by e-mail and others by fax. Some ask the user to complete a web form. Requests to the IRS must actually be sent by post.

Despite President Obama’s vow to “restore science to its rightful place,” scientists who wish to reach out to the public about their research findings are routinely blocked by public affairs offices, removing the power of interpretation from data that rarely speaks for itself. They are often denied the right to review, prior to publication, the final versions of reports to which their names are attached or to which their research contributed. Even their ability to obtain access to drafts and revisions of such reports is limited.

The need for scientists to comment on their research is exemplified by the Safe Drinking Water Act. Even though this act requires water utilities to “directly” issue customers water quality reports, the reports are often so technical as to be practically useless. (A water utility proposal to issue the reports only online would further disenfranchise those without Internet access.)

Even if adequate information is ultimately disclosed, delays can limit its usefulness. An environmental assessment of TransCanada’s controversial Keystone XL pipeline was criticized by many as giving insufficient consideration to its effects on the climate. The report’s integrity was further compromised when it was discovered that the authors had not only been previously employed by TransCanada, but had published a similarly positive assessment of a Peruvian liquefied natural gas pipeline that has since racked up an abysmal environmental and social track record.

These and other concerns were meant to be addressed during a 45-day public comment period, but the State Department (which commissioned the report and has final say on the pipeline’s approval) declined to release those comments, even though doing so is routine at other agencies. A FOIA request was submitted, but with an approval decision expected in the “near term,” any delay in meeting the request limits the public’s ability to meaningfully influence the outcome.

In the fifth and final part of this series I describe how it’s not all doom and gloom! I will outline some of FOIA’s successes and highlight improvements that offer hope for the future.

Energy, Politics

Freedom of Information Act to the Rescue?

This is Part 3 of a 5-part series on the government’s silence of science and the Freedom of Information Act (FOIA).  Parts 1 and 2 should be read first and can be found here:

Part 1: The Kingston Disaster
Part 2: The Government’s Silence of Science

In brief, these articles describe the circumstances surrounding the rupturing of a coal fly ash containment pond in Roane County, Tennessee.  Government sponsored research that reported the health and environmental risks of such ponds was buried, redacted or otherwise hidden from public view.

Problems such as these were meant to be addressed by the Freedom of Information Act (FOIA).  Enacted in 1966, FOIA (one of the so-called sunshine laws) grants the public the legal right to request information from the federal government.  It “provides that any person has a right, enforceable in court, to obtain access to federal agency records, except to the extent that such records (or portions of them) are protected from public disclosure.”

The spirit of FOIA embodies the essence of our American democracy.  We hold that a representative government by the people can, through its collective capacity, understand and prescribe solutions to threats against us.  We hold that a representative government for the people will utilize such knowledge for the security of its citizenry.  We hold that a representative government of the people will be served by the sacred trust we bestow upon our elected leaders.

Instead, we find that our government often defaults to the interests of a select few, frequently under the guise of security.  The precedent was codified in the 2001 Ashcroft Memo in which the then-Attorney General reassured agencies that their deliberations would remain confidential so long as they were “safeguarding our national security, enhancing the effectiveness of our law enforcement agencies, protecting sensitive business information and, not least, preserving personal privacy.”

Attorney General Ashcroft concluded, “When you carefully consider FOIA requests and decide to withhold records, in whole or in part, you can be assured that the Department of Justice will defend your decisions unless they lack a sound legal basis.”

Of course, there are many sensitive issues for which government secrecy is in the national interest.  But the Ashcroft Memo established a sweeping protection for agencies to withhold data from the general public as long as they could make some argument about how disclosure would jeopardize law enforcement effectiveness, security, business or privacy.  Given that nearly every issue of import touches at least one of these four categories, FOIA requirements could essentially be ignored at the government’s discretion.

The Bush administration took full advantage of this latitude.  When a 2004 EPA study recommended that hydrofracking fluids, which are injected into the ground during the shale gas extraction process, be regulated under the 1974 Safe Drinking Water Act, then-Vice President Cheney intervened.  Using the business provision of the Ashcroft memo, Cheney had the study redacted by claiming it revealed “trade secrets.”

This secrecy has consequences.  When leaks and spills contaminate local streams and water supplies, scientists are limited in assessing the impacts.  Without knowledge of the leaks’ chemical compositions, regulation is difficult to justify and contamination is hard, if not impossible, to detect.  All of this serves to reduce the gas industry’s accountability for harms it might cause.  The attendant ambiguity made it easier to pass a provision in the 2005 Energy Policy Act that explicitly exempted fracking fluids from the Safe Drinking Water Act.

It is easy to imagine other circumstances in which the “trade secrets” clause could prove dangerous.  If a train, truck or barge carrying hazardous, but classified, materials were to crash, the secrecy exemption could put first responders at grave risk.

Sometimes, the government decides that even admitting records exist will damage national security or lead to stigmatization.  In such cases agencies issue the so-called “Glomar response,” which allows them “to neither confirm nor deny” (read: ignore) FOIA requests.  The Department of Justice, the agency responsible for FOIA enforcement, has broadly supported this right on numerous occasions.

While about 70 countries have their own forms of FOIA, many are plagued by similar issues.  Ireland allows easier access to documents, but many remain unsigned, which reduces accountability.  Israel does have an appeals process, but appeals can take years and there are no real penalties for non-compliance.  Even in the European Union, which tends to be more open, the scope of the right remains unclear, partly because of governments’ unwillingness or outright failure to clarify the issue.

In part 4 of this series, we will examine how the widespread hope offered by President Obama’s Open Government Initiative has largely gone unmet.