Friday, January 25, 2013

Dark Energy: Too Big To Fail

[1/31/13 update: Many thanks to David Wiltshire for clarifying some points in the comments, and for addressing critics of his ideas. -EC]

According to today’s mainstream science, about 71% of the Cosmos is made of something that scientists have never detected, directly or indirectly. It’s “dark energy,” first described in the late 1990s as a repulsive or anti-gravitational property of empty space. By measuring supernovae in distant galaxies, astronomers inferred that not only is the universe getting larger, but the expansion is getting faster with time. Physicists started speculating about what might be responsible for the acceleration, and the newly minted concept of dark energy quickly became the accepted explanation. In fact, today it’s built into the standard cosmological model of a universe that expands at an ever-accelerating rate.

Dark energy has also gone mainstream: As viewers of popular-science TV programs know well, in the very distant future, the repulsive effects of dark energy may become so dominant that they tear apart galaxies, stars, planets — eventually even the atoms that currently make up your body — in a highly dramatic “big rip” that marks the end of the universe.

From the outset, the concept of dark energy seemed to me like a bad idea. We had a set of observations we didn’t fully understand, and in response, we externalized the problem onto some unseen but supposedly real agent “out there” that must be to blame. It didn’t sound like the best approach. It seemed very possible that we misinterpreted the supernova measurements, so why the rush to invent a brand new form of energy (which would need to comprise three-quarters of the universe!) to explain this interpretation? Surely, no one can say that all other alternative explanations had been exhausted. Inventing dark energy was a bit like losing your homework assignment, and then promptly deciding that your family must have an invisible and undetectable dog who ate it.

It turns out that in 2007, David Wiltshire of New Zealand’s University of Canterbury explained how the apparently accelerating expansion of the universe could be an illusion due to our perspective in the Milky Way. According to Einstein’s general theory of relativity, a clock can run at different rates depending on the local strength of gravity; near a massive star, for example, a clock will run more slowly than an identical clock positioned far away. From the perspective of an observer close to the star, everything (including the nearby clock) will appear to be running normally, while the distant clock seems to run fast; to an observer near the clock in deep space, meanwhile, local time will appear to pass normally, but from that perspective, the gravitationally bound clock seems to run slow. Gravity’s effect on time is universally accepted. If your GPS receiver didn’t account for general relativity, your navigation directions would lose accuracy within minutes.
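The size of the GPS effect is easy to estimate. Here is a toy Python calculation, using standard textbook values for Earth’s gravitational parameter and the GPS orbit (real GPS corrections also handle orbital eccentricity and other details, so treat this as a back-of-the-envelope sketch, not the actual algorithm):

```python
GM = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
c  = 299792458.0      # speed of light, m/s
r_ground = 6.371e6    # clock at sea level: distance from Earth's center, m
r_gps    = 2.6571e7   # GPS orbital radius, m
v_gps    = 3874.0     # GPS orbital speed, m/s

# Weaker gravity in orbit makes the satellite clock run fast (general
# relativity); its orbital motion makes it run slow (special relativity).
grav_shift = (GM / r_ground - GM / r_gps) / c**2
vel_shift  = v_gps**2 / (2 * c**2)

net_us_per_day = (grav_shift - vel_shift) * 86400 * 1e6
print(round(net_us_per_day, 1))   # about 38 microseconds per day fast
```

Thirty-eight microseconds doesn’t sound like much, but light covers about 300 meters in a microsecond, so an uncorrected receiver would accumulate kilometers of error per day.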

Wiltshire’s insight was that light from the supernovae traverses huge stretches of space where there is little to no matter, so clocks there would run fast as judged by observers near galaxies, such as ourselves. When astronomers study supernovae on the other side of these voids, the measurements are affected by the fact that the voids are expanding faster than the space closer to us (as judged here on Earth). According to Wiltshire’s calculations, if you lived in one of these voids, you’d judge the universe to be about 18 billion years old — several billion years older than the age we measure here, where matter and gravity slow both clocks and the expansion of space. The end result is that the Milky Way resides in a “Hubble bubble,” a local region beyond which all of the rest of the universe appears to be expanding away, at an accelerating rate. Similarly, if you lived in a galaxy on the other side of a large void, the Milky Way would appear to be accelerating away from you. You would be living in a Hubble bubble there, too.

The first time I read about Wiltshire’s insight, my reaction was disbelief: “Surely they must have thought of that!” But apparently, they hadn’t. The equations that cosmologists use to study the universe’s large-scale structure are extremely difficult to work with, so some simplifying assumptions have been made in order to construct workable models. One of these is to describe space everywhere as a uniformly expanding fluid. This assumption has been a part of the mathematics of cosmology (in the so-called Friedmann equations) since 1922, before we even knew there were other galaxies beyond our own. Because the full equations could not otherwise be solved, the Friedmann solutions have been part of the bedrock of cosmology ever since. However, we now know that the large-scale structure of the universe is not at all uniform; there are walls of galaxy clusters separated by enormous voids, so if Einstein is correct, the universe is not a uniformly expanding fluid. Time flows, and therefore expansion happens, faster in some places than in others.

Wiltshire argues that a necessarily uneven expansion of the universe previously hadn’t been considered because (1) physicists are used to considering relativistic effects on time only in the extreme conditions of black holes and particle-accelerator experiments, and (2) they have assumed that such effects in intergalactic space would be extremely weak. They are indeed weak, but Wiltshire has done the math to show how the cumulative effects become huge over billions of years and across vast distances of space.

According to Google Scholar, Wiltshire’s original paper has been cited 123 times to date; it’s certainly not being ignored. There appear to be few or no papers that refute the concept of non-uniform expansion due to relativistic effects. In fact, most of the papers build upon the idea.

So why, in 2013, is “dark energy” still a thing? Beats me! In his excellent book The Trouble With Physics, Lee Smolin describes how modern physics can unwittingly construct opaque, monstrous theories that become intellectual dead-ends with no practical uses, but which nevertheless persist for sociological and financial reasons. First, a few innocent working assumptions are laid down, and then models are built on those assumptions, and then others build upon those models, and so on — until there’s a mini-industry of researchers working in the field and collecting funding, with the original assumptions rarely re-examined. This is what has happened with dark energy: Even though we now have a viable explanation for the appearance of cosmic acceleration, a bad alternative has become entrenched. Careers have been forged based on the cosmological model that includes dark energy. And in a potential embarrassment for the ages, the 2011 Nobel Prize in Physics was awarded for what the prize committee called “the discovery of the accelerating expansion of the universe.” Dark energy is too big to fail!

Beyond academia, the public never should have been asked to have faith in a mysterious, undetectable force that’s out there but at the same time all around us. Invoking such ideas makes it impossible to maintain the clear distinctions between religion and science, at least in the public eye. But even today, the History Channel tells us “we now know” that most of the universe is made of mysterious dark energy. NASA does the same, on a public-education page called “Universe 101.” Houston, we have a problem.

Perhaps there are physicists who secretly hope the dark-energy concept will gradually die out. But in popular science, which is usually well behind the curve, the myth of dark energy will undoubtedly linger for years on the websites and the TV programs. We deserve better.

Monday, January 14, 2013

Why The Speed Of Light Is Constant

When many people first learn that the speed of light is constant — that it’s the same everywhere in the universe, and is measured the same by all observers — it all seems kind of arbitrary. It’s strange enough that the speed of light is a “cosmic speed limit,” beyond which no object can go. But, how can the speed of anything possibly be a constant value? When I’m in an airplane and I walk forward up the aisle, I am going faster than the plane itself, something that could be verified by an observer on the ground. My total speed, relative to the ground, will be the speed of the plane plus my walking speed. (Subtract the speed if I’m walking in the other direction, toward the tail.) Yet, if I measure the speed of a light beam coming from a laser aboard a plane or a spaceship, I’ll always get the same result of 186,000 miles per second, whether the beam is facing forward, backward, or sideways, or even if it’s beamed from the spaceship in any direction and measured on the ground. Why?
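The resolution, as Einstein showed, is that velocities don’t simply add; collinear velocities combine as (u + v) / (1 + uv/c²). At everyday speeds this is indistinguishable from plain addition, but composing any speed with c gives back c. A quick Python sketch (the airplane and walking speeds are just illustrative numbers):

```python
C = 299792458.0   # speed of light in a vacuum, m/s

def add_velocities(u, v, c=C):
    """Relativistic composition of two collinear velocities."""
    return (u + v) / (1 + u * v / c**2)

plane, walk = 250.0, 1.5   # airliner and walking speeds, m/s

# Everyday speeds: the correction is about one part in 10^14
print(add_velocities(plane, walk))   # ~251.5 m/s, as intuition expects

# A light beam fired from the plane: composing with c returns c
print(add_velocities(plane, C))      # c again, to machine precision
```

The second result isn’t a rounding accident: algebraically, (u + c) / (1 + u/c) reduces to c for any u, which is exactly the “cosmic speed limit” behaving as advertised.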

It actually has to be this way. A world in which the speed of light varied might be impossible logically (see below), or at least could be so messed up and incoherent that complex structures such as galaxies and intelligent life might not be possible.

Consider a thought experiment: You set up a rapidly rotating beacon in deep space, with a radio antenna on it. (Radio waves are an invisible form of light.) The antenna is transmitting a radio signal consisting of a sequence of natural numbers: 1, 2, 3, and so on. Far away from the transmitter, you tune in your receiver and wait for the beacon to start spinning and transmitting the sequence. What kind of signal will you receive?

That depends on whether the speed of light is constant or not. Let’s imagine that it isn’t. In that case, the speed of the radio signal in your direction would change according to the velocity of the transmitting antenna, as with the example of walking up the aisle of an airplane: As the beacon spins, sometimes the antenna would be coming toward you — which would “throw” the signal faster in your direction — and sometimes it would be moving away, which would subtract some miles per hour from the signal’s speed. So, parts of the signal would be traveling toward you faster than other parts. Naturally, if the speed of different portions of the signal is different, the faster portions will reach you sooner than the slower portions. This effect would worsen the farther away you are from the beacon, and the faster the beacon is spinning. You might end up receiving a steady sequence of numbers like this:

1 - 2 - 3 - 7 - 8 - 9 - 4 - 5 - 6 - 10 - 11 - 12 - 16 - 17 - 18 - 13 - 14 - 15 ...

Here, the 7 - 8 - 9 segment is arriving sooner than the 4 - 5 - 6 segment, even though it was actually transmitted later. If this were a television signal rather than a sequence of numbers, with the antenna on a large rotating disc, the program would be really messed up. A political candidate might be seen giving a concession speech, followed by a speech where he seems confident he’ll win. I’m not sure that causal impossibilities would result from incoherent information flying around the universe,* but surely the formation of large structures such as galaxies, responding gravitationally to far-away moving bodies, would be affected (if gravity would even exist in such a universe).
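The scrambling is easy to simulate. The toy Python model below (the distance, spin rate, and antenna speed are made-up values chosen to make the effect obvious) assumes an “emission theory” in which the signal inherits the antenna’s radial speed, and then checks the order in which the numbers arrive:

```python
import math

c = 3.0e5            # speed of light, km/s
v = 30.0             # tangential speed of the spinning antenna, km/s
D = 1.0e10           # beacon-to-receiver distance, km (~9 light-hours)
omega = 2 * math.pi  # beacon spins once per second

events = []
for n in range(1, 13):
    t_emit = 0.1 * n                          # one number every 0.1 s
    v_radial = v * math.cos(omega * t_emit)   # component toward the receiver
    signal_speed = c + v_radial               # emission theory: speeds add
    events.append((t_emit + D / signal_speed, n))

received = [n for _, n in sorted(events)]     # order of arrival
print(received)                               # out of transmission order
```

Even though the antenna moves at only one ten-thousandth of c, the head start compounds over the nine-light-hour trip into seconds of difference, more than enough to shuffle numbers sent a tenth of a second apart.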

Now let’s consider what happens in a world where the speed of light is constant. In this case, all portions of the signal are transmitted at the same speed relative to you, even as the beacon rotates, so no portion reaches you faster than any other. The numbers arrive in the correct sequence, just as they were transmitted. However, the only way this can possibly work (as Einstein showed) is if measurements of time and distance change for various observers. For a transmitting antenna on a rotating beacon, this produces a relativistic Doppler effect — a slowing down and speeding up of the signal, almost like a vinyl record being played back off-center, something like this:

1 ---- 2 --- 3 -- 4 - 5 - 6 -- 7 ---- 8 --- 9 -- 10 - 11 - 12 -- 13 ---- 14 --- 15 ...

If the TV antenna on the disc were transmitting American Idol, you’d see the show from beginning to end without interruption — no unexpected spoiler — but the slowing down and speeding up might make it sound as if the singer and the band can’t stay on key to save their lives (which in reality happens sometimes).
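This case can be put into a toy Python model too (again with made-up numbers for the distance and disc size). Here the antenna’s position changes as the disc turns, but the signal speed never does; at these speeds relativistic time dilation is negligible, so the wobble that appears is just the varying light-travel time:

```python
import math

c = 3.0e5            # speed of light, km/s
R = 10.0             # radius of the rotating disc, km
D = 1.0e10           # distance to the receiver, km
omega = 2 * math.pi  # one rotation per second

arrivals = []
for n in range(1, 13):
    t_emit = 0.1 * n
    # The antenna's distance changes as the disc turns; its speed never
    # enters, because the signal always travels at c.
    distance = D - R * math.cos(omega * t_emit)
    arrivals.append(t_emit + distance / c)

in_order = arrivals == sorted(arrivals)
gaps = [b - a for a, b in zip(arrivals, arrivals[1:])]
print(in_order)              # True: no scrambling
print(min(gaps), max(gaps))  # unequal gaps: the Doppler wobble
```

The numbers always arrive in the order they were sent; only the spacing between them breathes in and out, like the off-center vinyl record.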

As it turned out, the constant speed of light was confirmed by the Dutch astronomer Willem de Sitter in 1913, through observations of double star systems, where two stars orbit each other closely. He realized that if the speed of light varied as the stars advanced toward us or receded away, the orbits would appear erratic. The system might appear blurry or scrambled and incoherent, and from great distances the laws of motion would appear not to work at all. However, such was not the case in any system that de Sitter observed, and this was used as evidence to support Einstein’s special theory of relativity and the constancy of the speed of light. All observations made to date support the same conclusions.
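De Sitter’s argument can be sketched numerically as well. This toy Python model (the orbital parameters are invented for illustration; de Sitter’s actual analysis of spectroscopic binaries was far more careful) lets light inherit the star’s radial velocity, as an emission theory would demand, and then checks whether the images even arrive in the order they were emitted:

```python
import math

c = 3.0e5      # speed of light, km/s
v = 100.0      # orbital speed of one star, km/s
P = 8.64e5     # orbital period: 10 days, in seconds
D = 1.0e15     # distance to Earth, km (roughly 100 light-years)
omega = 2 * math.pi / P

arrivals = []
for k in range(200):                         # sample two full orbits
    t = k * P / 100.0
    v_radial = v * math.cos(omega * t)       # toward/away from Earth
    arrivals.append(t + D / (c + v_radial))  # light inherits source speed

# If later light can overtake earlier light, arrival times go
# non-monotonic: the star would be seen at several points of its
# orbit at once -- exactly the ghosting de Sitter never observed.
monotonic = all(a < b for a, b in zip(arrivals, arrivals[1:]))
print(monotonic)   # False
```

Over a hundred light-years, a 100 km/s variation in light speed would smear a ten-day orbit into overlapping, multiply-imaged chaos; real binaries look nothing like that.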

Personally, I think that the constant speed of light is a hint that the universe is fundamentally informational (the “it from bit” hypothesis proposed by the great physicist John Wheeler). This idea says that matter/energy and spacetime emerge from a deeper layer of existence that’s based purely on information. In our universe, information in the form of light always passes from point A to point B in a coherent, sequential fashion, for every observer — it’s space and time (and even mass) that all change accordingly to suit it. There might be a lesson there.

* I haven’t been able to come up with a paradox of cause and effect, resulting just from some information traveling faster than other information. In the days when some news came by rail and some by telegraph, you might have heard that Lincoln’s assassin had been caught before you heard that Abe had been shot — but nobody went back in time to kill John Wilkes Booth. Still, I suspect there is a thought experiment that would show it couldn’t work. If you have any ideas, please comment.