**[1/31/13 update:** Many thanks to David Wiltshire for clarifying some points in the comments, and for addressing critics of his ideas. -EC]

According to today’s mainstream science, about 71% of the cosmos is made of something that scientists have never seen, detected, or even indirectly observed. It’s “dark energy,” first described in the late 1990s as a repulsive or anti-gravitational property of empty space. By measuring supernovae in distant galaxies, astronomers inferred that not only is the universe getting larger, its expansion is getting faster with time. Physicists started speculating about what might be responsible for the acceleration, and the newly minted concept of dark energy quickly became the accepted explanation. In fact, today it’s built into the standard cosmological model of a universe that expands at an ever-accelerating rate.

Dark energy has also gone mainstream: As viewers of popular-science TV programs know well, in the very distant future, the repulsive effects of dark energy may become so dominant that they tear apart galaxies, stars, planets — eventually even the atoms that currently make up your body — in a highly dramatic “big rip” that marks the end of the universe.

From the outset, the concept of dark energy seemed to me like a bad idea. We had a set of observations we didn’t fully understand, and in response, we externalized the problem onto some unseen but supposedly real agent “out there” that must be to blame. It didn’t sound like the best approach. It seemed very possible that we misinterpreted the supernova measurements, so why the rush to invent a brand new form of energy (which would need to comprise three-quarters of the universe!) to explain this interpretation? Surely, no one can say that all other alternative explanations had been exhausted. Inventing dark energy was a bit like losing your homework assignment, and then promptly deciding that your family must have an invisible and undetectable dog who ate it.

It turns out that in 2007, David Wiltshire of New Zealand’s University of Canterbury explained how the apparently accelerating expansion of the universe could be an illusion due to our perspective in the Milky Way. According to Einstein’s general theory of relativity, a clock can run at different rates depending on the local strength of gravity; near a massive star, for example, a clock will run more slowly than an identical clock positioned far away. From the perspective of an observer close to the star, everything (including the nearby clock) will appear to be running normally, while the distant clock seems to run *fast*; to an observer near the clock in deep space, meanwhile, the local time will appear to pass normally, but from that perspective the gravitationally bound clock seems to run *slow*. Gravity’s effect on time is universally accepted. If your GPS receiver didn’t account for general relativity, your navigation directions would lose accuracy within minutes.
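As a back-of-envelope illustration (my own numbers, not from the article), here is the standard GPS calculation in a few lines of Python: the gravitational blueshift makes a satellite clock run fast by roughly 46 microseconds per day, while its orbital speed slows it by roughly 7, for a net drift near 38 microseconds per day. At light speed, 38 microseconds of timing error corresponds to kilometers of position error.

```python
# Rough estimate of relativistic clock drift for a GPS satellite.
# Constants are standard published values; the orbit radius is approximate.
G_M = 3.986004418e14      # Earth's gravitational parameter GM, m^3/s^2
c = 2.99792458e8          # speed of light, m/s
r_earth = 6.371e6         # mean Earth radius, m
r_orbit = 2.6571e7        # GPS orbital radius (~20,200 km altitude), m

# Gravitational (GR) effect: the higher clock runs FAST relative to the ground
gr_rate = G_M / c**2 * (1.0 / r_earth - 1.0 / r_orbit)

# Velocity (SR) time dilation: orbital motion makes the clock run SLOW
v_squared = G_M / r_orbit              # v^2 for a circular orbit
sr_rate = v_squared / (2.0 * c**2)

seconds_per_day = 86400.0
net_us_per_day = (gr_rate - sr_rate) * seconds_per_day * 1e6
print(f"net drift: {net_us_per_day:.1f} microseconds/day")  # ~38.5
```

This is a first-order sketch (circular orbit, weak-field approximation), but it reproduces the figure usually quoted for the GPS correction.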

We sit inside a gravitationally bound galaxy, where clocks run comparatively slowly; from our perspective, *all of the rest of the universe* appears to be expanding away, at an accelerating rate. Similarly, if you lived in a galaxy on the other side of a large void, the Milky Way would appear to be accelerating away from you. You would be living in a Hubble bubble there, too.

The first time I read about Wiltshire’s insight, my reaction was disbelief: “Surely they must have thought of that!” But apparently, they hadn’t. The equations that cosmologists use to study the universe’s large-scale structure are extremely difficult to work with, so some simplifying assumptions have been made in order to construct workable models. One of these is to describe space everywhere as a uniformly expanding fluid. This assumption has been part of the mathematics of cosmology (in the so-called Friedmann equations) since 1922, before we even knew there *were* other galaxies beyond our own. Given the brutally difficult equations, which could not otherwise be solved, the Friedmann solutions have been part of cosmology’s bedrock ever since. However, we now know that the large-scale structure of the universe is not at all uniform: there are walls of galaxy clusters separated by enormous voids, so if Einstein is correct, the universe is *not* a uniformly expanding fluid. Time flows, and therefore expansion happens, faster in some places than in others.

Wiltshire argues that a necessarily uneven expansion of the universe previously hadn’t been considered because (1) physicists are used to considering relativistic effects on time only in the extreme conditions of black holes and particle-accelerator experiments, and (2) they have assumed that such effects in intergalactic space would be extremely weak. They are indeed weak, but Wiltshire has done the math to show how the cumulative effects become huge over billions of years and across vast distances of space.

According to Google Scholar, Wiltshire’s original paper has been cited 123 times to date; it’s certainly not being ignored. There appear to be few or no papers that refute the concept of non-uniform expansion due to relativistic effects. In fact, most of the papers build upon the idea.

So why, in 2013, is “dark energy” still a thing? Beats me! In his excellent book *The Trouble With Physics*, Lee Smolin describes how modern physics can unwittingly construct opaque, monstrous theories that become intellectual dead-ends with no practical uses, but which nevertheless persist for sociological and financial reasons. First, a few innocent working assumptions are laid down, and then models are built on those assumptions, and then others build upon those models, and so on — until there’s a mini-industry of researchers working in the field and collecting funding, with the original assumptions rarely re-examined. This is what has happened with dark energy: Even though we now have a viable explanation for the appearance of cosmic acceleration, a bad alternative has become entrenched. Careers have been forged based on the cosmological model that includes dark energy. And in a potential embarrassment for the ages, the 2011 Nobel Prize in Physics was awarded for what the prize committee called “the discovery of the accelerating expansion of the universe.” Dark energy is too big to fail!

Beyond academia, the public never should have been asked to have faith in a mysterious, undetectable force that’s out there but at the same time all around us. Invoking such ideas makes it impossible to maintain the clear distinctions between religion and science, at least in the public eye. But even today, the History Channel tells us “we now know” that most of the universe is made of mysterious dark energy. NASA does the same, on a public-education page called “Universe 101.” Houston, we have a problem.

Perhaps there are physicists who secretly hope the dark-energy concept will gradually die out. But in popular science, which is usually well behind the curve, the myth of dark energy will undoubtedly linger for years on the websites and the TV programs. We deserve better.

Very interesting. I like the tie-in with Smolin's argument. One thing, though: you may want to edit out the comparison of the popularity of dark energy with the institution of slavery. It's kind of like comparing healthcare reform to Nazi death panels.

You're right, thanks. I didn't intend to make a direct comparison. I was referring to the way some people reacted against slavery by doing nothing, which was just another way of enabling it. But the analogy is unhelpful.

This is an interesting opinion article. But we do know that there is no such thing as "empty space" in the universe due to quantum level fluctuations. This is essential for the Standard Model of Particle Physics to work. David Wiltshire seems to be proposing an alternative interpretation of Einstein's cosmological constant. That's fine, but it is also an opinion piece without empirical evidence to back up his claims. In other words, he suggests swapping one religion for another.

Didn't Einstein consider the cosmological constant his biggest mistake, when Hubble expansion was discovered? I believe the cosmological constant was reintroduced only when the supposed acceleration was discovered; only then was it called dark energy. Wiltshire's idea would take it away again. I'm pretty sure particle physics worked before 1998; if particle physicists have to go back to the drawing board because they drank the dark energy Kool-Aid, that's their problem.

I think Einstein's "biggest mistake" refers to him fudging his equations to force a steady-state universe using a cosmological constant, rather than the cosmological constant itself.

But I do see your point... The evidence for an accelerating universe could indeed be interpreted as apparent acceleration as proposed by Wiltshire. At a first glance this seems like a true simplification... but to me there still seems to be a problem.

It turns a blind eye to what we have learned from quantum theory. It can be experimentally shown that even empty space contains energy (e.g. the Casimir effect). Small amounts of energy and momentum over enormous volumes of space have got to have an effect on the universe.

Einstein considered it his "Biggest Blunder" that he tried to make his equations fit with his bias that the universe should be static (not expanding/contracting over time) by introducing a universal constant into the equations (which was perfectly valid mathematically). When we discovered the universe's expansion was actually accelerating, nonzero CC was reintroduced into the Field Equations.

If you know any calculus, the CC is analogous to a constant of integration.

Ed, you really shouldn't have such strong opinions about topics you don't really understand all that well. You're coming off like a creationist trying to disprove evolution even though he/she clearly doesn't understand it. You appear to have thrown your hands up and yelled "well that settles it, checkmate Dark Energy." That's not how science works.

"Dark Energy" is just a synonym for "Cosmological Constant," which is a "tunable" constant (meaning we "tune" it to fit observation). The CC (Cosmological Constant) appears in the Einstein Field Equations, which tell us how curvature couples to the energy/momentum distribution in spacetime. There is no real physical reason to assume that the CC should be zero. General Relativity by itself does not (and cannot) tell us what the value of the CC is. In other words, it's more of an assumption to assume that Dark Energy *doesn't* exist than that it does, because you are ad hoc assuming one of GR's parameters to be an arbitrary value (zero).

Now Wiltshire's idea is certainly interesting, but it still doesn't eliminate the possibility of nonzero CC. Indeed you could consider a generalized Wiltshire solution in which the CC is nonzero, in which case both the relativistic effects of an inhomogeneous universe *AS WELL AS* Dark Energy are present.

It's also still unclear whether or not the universe is homogeneous at large scales, though data seems to suggest that it is. We know it's highly isotropic, and if you take this with the assumption that the Copernican Principle holds (which we know it does to a high degree of accuracy) then homogeneity follows suit.

Wiltshire's model isn't without other criticisms, namely *THAT IT APPEARS NOT TO FIT OBSERVATION*:

"These all paint a consistent picture in which voids are in severe tension with the data. In particular, void models predict a very low local Hubble rate, suffer from an 'old age problem', and predict much less local structure than is observed."

http://arxiv.org/abs/1007.3725

"It also rules out the best-fitting gigaparsec-scale void models, posited as an alternative to dark energy."

http://iopscience.iop.org/0004-637X/730/2/119/fulltext/

Now hold on, there's been some interesting points made in this article. Also your little dig on understanding rubs off like a preschooler calling a baby stupid. No one really understands this topic that well.

A big part of this essay is to poke holes in the "conclusion" that there must be dark energy when it has never been observed or measured. That is to say, dark energy is up there with string theory and religion if it can't provide a testable hypothesis. But the popular media and Popular Science (literally) are labelling it as though it's a fact. Even particle physicists weren't so arrogant when they were searching for the Higgs boson, hoping that there was another explanation... but I digress.

The cosmological constant is actually quite irrelevant to the whole idea that was put forth. It's like when Christians say that the Bible should be interpreted and cherry-picked: a physical formula with a "cosmological constant" can give any interpretation as to the end of the universe you desire. To justify it on the basis of mathematical principles is also wrong; it could just as easily be a formula error or, at the very least, an oversimplification of the real universe... which is pretty much what all maths is: a generalisation. The +C spitting out of the end of an integration is not proof of anything.

"Also your little dig on understanding rubs off like a preschooler calling a baby stupid. No one really understands this topic that well."

I hope you're joking. Ed clearly doesn't understand the topic he's trying to influence people's opinions on. His view is entirely ignorant and misleading.

The description for this article is: "We have a better explanation now, so it's time to end this faith-based myth." This is so wrong it's borderline rage-inducing. Not only is the conclusion absurd, but so is the premise: Wiltshire's model likely conflicts with known data.

No one knows *for a fact* what the right answer is, but some are definitely more adept (because of education, self-study, whatever) at addressing these issues. Putting in a few hours of layman-level research and then claiming an open question in physics has been resolved is beyond stupid.

"A big part of this essay is to poke holes at the 'conclusion' that there must be dark energy when it has never been observed or measured."

Nobody is claiming to know *for sure* that there is a vacuum energy (though it would be very strange if there weren't), so your point is a strawman. We *think* there is because we know general relativity is accurate at large scales, and this is the best model that follows naturally from GR.

As for independent evidence of Dark Energy, Sean Carroll says it best:

"One simple argument is “subtraction”: the cosmic microwave background measures the total amount of energy (including matter) in the universe. Local measures of galaxies and clusters measure the total amount of matter. The latter turns out to be about 27% of the former, leaving 73% or so in the form of some invisible stuff that is not matter: “dark energy.” That’s the right amount to explain the acceleration of the universe. Other lines of evidence come from baryon acoustic oscillations (ripples in large-scale structure whose size helps measure the expansion history of the universe) and the evolution of structure as the universe expands."

"Even particle physicists weren't so arrogant when they were exploring for the higgs boson, hoping that there was another explanation... but I digress."

You're making it seem like there's some Dark Energy religion in the science community. There isn't. This is science - if an idea is wrong then it's wrong. People get prizes for proving accepted science wrong/incomplete. If something is the consensus in the science community, it's likely that there's a good reason why. In this case it's because Dark Energy is currently the best model for large-scale cosmology. If you disagree, it's probably because you haven't done enough research (or are not equipped to understand it). A better model might come along to replace DE, but so far none has been successful.

"The cosmological constant is actually quite irrelevant to the whole idea that was put forth."

The argument was that Wiltshire's model provides a way to explain observational data with zero Cosmological Constant (read: no Dark Energy), therefore DE models are obsolete. I provided several arguments for why Ed was "jumping the gun" so to speak.

The Cosmological Constant *is* Dark Energy, so I'm not really sure how you could possibly claim they are disconnected.

"a physical formula with a "cosmological constant" can give any interpretation as to the end of the universe you desire."

Sure. But only one value fits observation. To use the calculus analogy: say we have the function f(x)=x. The integral of this function is F(x)=½x²+C, where C is some constant. You can choose C to be any value you want and it will describe an equally valid solution to the equation f(x)=dF/dx. But what if we know that F(1)=3/2 ? (Maybe this value was provided by experiment/observation, for example.) Then the only value of C which satisfies this condition is C=1, so our solution to the equation x=dF/dx is the function F(x)=½x²+1.
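To make that tuning step concrete, here is the same worked example in a few lines of Python (purely illustrative; exact rationals are used so the equality check is precise):

```python
from fractions import Fraction

def F(x, C):
    # Antiderivative of f(x) = x, up to a constant: F(x) = x**2/2 + C
    return Fraction(x) ** 2 / 2 + C

# The "observation": F(1) = 3/2. This pins down the otherwise-free constant.
observed = Fraction(3, 2)
C = observed - F(1, 0)        # C = 3/2 - 1/2 = 1
assert F(1, C) == observed
print(C)                      # 1
```

Any value of C solves f(x) = dF/dx equally well; only the boundary condition singles one out, which is the analogy's point about fitting the CC to observation.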

(PART 1)

"To justify it on the basis of mathematical principles is also wrong,"

Better tell that to every physicist. We build mathematical models to fit observation; that's what physics *is*! We know GR applies to certain scenarios very accurately, so it isn't really too far of a stretch to assume it also applies in areas that haven't been directly tested yet.

"it could just as easily be a formula error or at the very least an oversimplification of the real universe"

No. The Einstein Field Equations including nonzero Cosmological Constant are the simplest, most general metric theory of gravity you can write down. This is easily seen by looking at its Lagrangian:

L = k(R − 2Λ)√(−g)

Not to include the CC makes the theory less general.
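For reference (standard textbook material, in units with c = 1 and k = 1/(16πG)), this is the Einstein-Hilbert action with cosmological constant, and varying the total (gravity plus matter) action yields the field equations the commenter refers to:

```latex
S = \frac{1}{16\pi G}\int (R - 2\Lambda)\,\sqrt{-g}\,\mathrm{d}^4x
\qquad\Longrightarrow\qquad
R_{\mu\nu} - \tfrac{1}{2}R\,g_{\mu\nu} + \Lambda g_{\mu\nu} = 8\pi G\,T_{\mu\nu}
```

Setting Λ = 0 is an extra assumption, not a default; the Λ term is the most general addition consistent with the theory's symmetries.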

"which is pretty much what all maths is. A generalisation. The +c spitting out of the end of integration is not proof of anything."

You should've told that to Dirac when he predicted the existence of antiparticles with his silly old math. Or Maxwell when he predicted that light consisted of electromagnetic waves. Or Einstein when he predicted time dilation/length contraction/relativity of simultaneity, that light is affected by gravity, etc. Or (recently) Peter Higgs when he predicted the Higgs boson.

We build our theories to fit observation, and then we extrapolate them to predict new phenomena. The math guides us into the unknown. We *technically* could have predicted an accelerating universe since 1916, but everyone ad hoc assumed Λ=0.

To belittle the math because you don't like it is absurd. This is best left off with some Feynman:

"...and then there's the kind of thing which you don't understand. Meaning 'I don't believe it, it's crazy, it's the kind of thing I won't accept.'

I hope you'll come along with me and you'll have to accept it because it's the way nature works. If you want to know the way nature works, we looked at it, carefully, that's the way it works.

You don't like it? Go somewhere else!

To another universe! Where the rules are simpler, philosophically more pleasing, more psychologically easy. I can't help it! OK! If I'm going to tell you honestly what the world looks like to human beings who have struggled as hard as they can to understand it, I can only tell you what it looks like.

And I cannot make it any simpler, I'm not going to do this, I'm not going to simplify it, and I'm not going to fake it. I'm not going to tell you it's something like a ball bearing inside a spring, it isn't.

So I'm going to tell you what it really is like, and if you don't like it, that's too bad."

http://www.youtube.com/watch?v=iMDTcMD6pOw

(PART 2/2)

I am not terribly bothered by the insults on my opinion piece. What does bother me is that tomorrow's scientists are willing to elevate a 15-year-old theory to sacred-cow status, suggest that said theory is on a level with biological evolution, and demand extraordinary theoretical and empirical evidence to challenge said theory. I am bothered that detracting ideas such as Wiltshire's are approached in forums such as this practically with a debunking attitude. Regardless of my or anyone else's opinion on dark energy, this is exactly what would happen if a fundamentally incorrect theory became too big to fail. The problem would eventually self-correct -- good science always beats bad science -- but not before a lot of talented people wasted a lot of their time doing dead-end work, and not before the public is sold a false bill of goods. That's my real concern here.

Forgive my continued ignorance, which must be endlessly tiresome, but minds superior to mine are not pleased that the vacuum energy density predicted by QFT is over 100 orders of magnitude greater than the tiny constant used in cosmology. What's up with that? This is nowhere close to settled science, so when pop-sci outlets suggest that it is, that's a far greater disservice than anything I have written.

"I am not particularly bothered by being insulted for having written an opinion piece."

I apologize if my posts came off as overtly rude. I was certainly being harsh, but I believe for good reason. It's dangerous and irresponsible to make matter-of-fact claims like "we have a better theory now" and that DE is a "faith-based myth," especially for someone with a following as large as yours.

"What does bother me is that tomorrow's scientists are willing to elevate a 15-year-old theory to sacred-cow status, compare said theory to biological evolution, and demand extraordinary theoretical and empirical evidence to challenge said theory."

But nobody *is* doing this! Perhaps science media (which should never be trusted, by the way) is portraying things in a misleading or irresponsible way; I don't know. Questioning the consensus and trying to find alternatives is a healthy and common practice, especially in developing fields like Cosmology. Your premise is that science has dogmatically accepted a certain idea and will not consider alternatives. The fact that papers like Wiltshire's exist (and are cited, as you mentioned, many times) is evidence that the exact opposite is true.

We have good reason to think DE exists. Even a quick glance at the wikipedia article should convince you of this:

http://en.wikipedia.org/wiki/Dark_energy#Evidence

"The total amount of matter in the universe (including baryons and dark matter), as measured by the CMB, accounts for only about 30% of the critical density. This implies the existence of an additional form of energy to account for the remaining 70%.[10]"

"The theory of large scale structure, which governs the formation of structures in the universe (stars, quasars, galaxies and galaxy clusters), also suggests that the density of matter in the universe is only 30% of the critical density."

"The data confirmed cosmic acceleration up to half of the age of the universe (7 billion years), and constrain its inhomogeneity to 1 part in 10.[15] This provides a confirmation to cosmic acceleration independent of supernovas."

Perhaps most significantly:

"This so-called late-time Integrated Sachs-Wolfe effect (ISW) is a direct signal of dark energy in a flat universe.[16]"

"I am bothered that detracting ideas such as Wiltshire's are approached in forums such as this practically with a debunking attitude."

It essentially *has* been "debunked," as you call it. If a theory doesn't fit the data then it is wrong. There's no saving a theory once it has been shown to conflict with observation. I'm sorry that you don't like it, but that's the way science goes. People sometimes spend their entire careers developing theories that don't pan out. Nature isn't sympathetic to our efforts - she is what she is.

(PART 1/2)

" Regardless of my or anyone else's opinion on dark energy, this is exactly what would happen if a fundamentally incorrect theory became too big to fail."

I personally have no stake in the matter, so I believe I'm approaching this from a relatively unbiased vantage point. I'm not doing research on DE or anything of the sort, so I wouldn't be particularly bothered if some other idea overshadowed DE.

Dark Energy just "looks" right to me. The "missing" 70% of the universe's energy is just right to explain the universe's acceleration in a way that follows quite naturally from General Relativity. Combined with supplementary independent evidence that seems to confirm DE's existence, it's a tough theory to beat.

"The problem would eventually self-correct -- good science always beats bad science -- but not before a lot of talented people wasted a lot of their time doing dead-end work, and not before the public is sold a false bill of goods. That's my real concern here."

You're approaching this from the perspective that DE is wrong, and that it's only a matter of time before we figure out why. All evidence supports the idea that DE exists, so it isn't likely that we'll find contradictory evidence any time soon. But if we did, so be it. We can only learn about nature by exploring all possibilities, and right now DE seems to be the most likely possibility.

"Forgive my continued ignorance, which must be endlessly tiresome, but minds superior to mine are not pleased that the vacuum energy density predicted by QFT is over 100 orders of magnitude greater than the tiny constant used in cosmology. So what's up with that?"

I have no idea. I've yet to study QFT; the extent of my knowledge is currently limited to General Relativity and non-relativistic Quantum Mechanics. Sean Carroll gave an interesting talk about this (which I believe is still on youtube somewhere). It's still an open question.

(2/2)

If there had been a "CMB catastrophe" where 70% of the universe went missing, and *then* we discovered the supernova evidence, that would be compelling indeed. As it stands, the references in the article are all post-2000. An actual prediction of cosmic acceleration would have been more valuable than any number of well-fitted, post-supernova retrodictions.

So what's this about Wiltshire's solution not matching the observed data? The paper you linked doesn't cite Wiltshire at all as far as I can tell. He never claimed that "we occupy a privileged position near the centre of a large, nonlinear, and nearly spherical void" -- quite the opposite actually.

http://www2.phys.canterbury.ac.nz/~dlw24/universe/

"If there had been a "CMB catastrophe" where 70% of the universe went missing, and *then* we discovered the supernova evidence, that would be compelling indeed. As it stands, the references in the article are all post-2000. An actual prediction of cosmic acceleration would have been more valuable than any number of well-fitted, post-supernova retrodictions."

I'm not sure I understand what you're saying. You don't accept the evidence because you don't like when it was discovered? That's rather strange.

"So what's this about Wiltshire's solution not matching the observed data? The paper you linked doesn't cite Wiltshire at all as far as I can tell. He never claimed that "we occupy a privileged position near the centre of a large, nonlinear, and nearly spherical void" -- quite the opposite actually."

Wiltshire's idea is not a new one. Kolb et al produced similar papers a few years earlier which received a lot of attention (i.e. http://arxiv.org/abs/astro-ph/0506534). Wiltshire essentially did what Kolb did with an exact solution to the EFE's rather than approximations.

The papers I cited show that void models (Wiltshire's is a void model) are in tension with known data. Wiltshire actually commented on the second of the two:

"The value of the dressed Hubble constant is also an observable quantity of considerable interest. A recent determination of H0 by Riess et al. [55] poses a challenge for the timescape model. However, it is a feature of the timescape model that a 17–22% variance in the apparent Hubble flow will exist on local scales below the scale of statistical homogeneity, and this may potentially complicate calibration of the cosmic distance ladder. Further quantification of the variance in the apparent Hubble flow in relationship to local cosmic structures would provide an interesting possibility for tests of the timescape cosmology for which there are no counterparts in the standard cosmology."

I believe you also misunderstood Wiltshire's paper. The apparent acceleration is explained by supposing "Earth were at or near the center of a very low-density region of interstellar space (a relative void)... This situation would provide an alternative to dark energy in explaining the apparent accelerating universe.[3]"

http://en.wikipedia.org/wiki/Hubble_Bubble_(astronomy)

My article is based on Wiltshire's webpages linked above, which are based on this paper:

http://arxiv.org/abs/0809.1183

Neither have anything to do with Earth being at the center of a void. I guess I must be either crazy or stupid.

I think you're just misunderstanding what "center of a void" means. It means that we lie in a privileged section of universe (our galaxy) surrounded by voids (the space between galaxies). Basically, refer to your image above (our galaxy surrounded by void).

DeleteThen you still haven't explained why the model is disproved by observation. The big arrows in my picture refer to other matter accelerating away, and voids beyond that. No privileged position is given for our location. If all of the voids are expanding faster than all of the bound regions, then we would appear to be in a Hubble bubble, would we not?

His model produces a prediction for Hubble flow that's in tension with observation, among other problems. As I mentioned above, Wiltshire himself has acknowledged this.

It looks like we're done now. You didn't show how Wiltshire's model requires Earth to be in a privileged position. You also never explained why the appearance of accelerated expansion *wouldn't* be a necessary observation from Earth if general relativity is correct. Instead you went from declaring the model falsified beyond consideration, to identifying one "tension" between the model and a recent calculation -- while shrugging off 107 orders of magnitude of disagreement between dark energy and a much better-supported theory, quantum field theory! The reasoning? Because we have some calculations from the CMB, which in turn are based on the same FLRW solution of homogeneous expansion that led to this mess in the first place. And because dark energy "just looks right to me." Unwittingly, your comments exemplify what I mean by too big to fail. This thread is closed.

Thanks Eddie. This is the first I've heard of this interesting idea. Looking at his site, David seems quite credible to me... but as someone who is not a working cosmologist I am not really qualified to judge!

It's safe to say that dark energy is not yet proven beyond reasonable doubt, which is why alternative models are actively discussed in the community. However, focusing on just this one paper and then claiming that it completely solves the issue is a bit too simplistic, a bit like homeopathy proponents who latch onto the single clinical study that somehow could be interpreted as lending credence to their claims. So now we have an alternative explanation. Good. What predictions does it make that distinguish it from the dark energy hypothesis? Can we test those?

IMHO it is perfectly reasonable to postulate: "Suppose the extra expansion is caused by some extra energy. What properties would it need to have to do so?" and then go from there to build testable hypotheses.

Note that quantum field theory not only (seemingly?) disagrees with dark energy. It also disagrees with general relativity. This is to be expected, since one is developed to describe microscopic length scales while the other is responsible for astronomical length- and mass scales. One of them, QFT or GR, will have to be further modified to get them to agree. The problem is that to date we do NOT have a satisfying theory of quantum gravity, with or without dark energy.

Proposed alternatives to dark energy for explaining the real or apparent acceleration of the universe's expansion are definitely not ignored by the scientific community.

Fair enough; thanks for the comment. Is it true that QFT disagrees with GR? Or just that gravity has not yet been successfully quantized? In the mid 1800s electricity and magnetism hadn't been unified, but the theories didn't necessarily disagree (as far as I know). QFT, on the other hand, makes an energy-density prediction that greatly disagrees with the notion of a small cosmological constant, which is what dark energy amounts to.

Sorry, I'm not sufficiently knowledgeable on that topic, but yes, if I understood my professor back then correctly, GR and QFT don't really work together. I'm not really surprised, though: GR describes gigantic masses and astronomical length scales, whereas quantum field theory describes the smallest of all length scales. The problem is that to make sense of field theories, one has to introduce certain energy cut-offs to prevent certain integrals from diverging. Depending on where you set this cut-off, you get wildly disagreeing predictions...

I'd expect that mistakes happen when extrapolating either QFT or GR into the realm of the other, just as classical mechanics fails at high velocities. When attempting to unify them, at least one of them will have to be modified. In that case, I'd place my bet that quantum mechanics will survive relatively unscathed.

But for really thorough answers I can recommend the site physics.stackexchange.com, where a lot of really knowledgeable people post regularly, including top professors in cosmology.

Dear Eddie Current,

Thanks for your email bringing my attention to this discussion, and for your clearly written piece. While I do not have a lot of time to discuss every point, I am happy to make some comments. I will do this under the following headings. I have to break it up, as the software does not allow me more than 4096 characters.

1. Distinction of timescape from non-Copernican "void" models

2. New physics in the timescape cosmology

3. Observational tests, anomalies and tensions

4. Sociology of science in challenging a standard model

1. DISTINCTION OF STATISTICAL AVERAGING FROM NON-COPERNICAN VOID MODELS

Your correspondent Elfmotat has confused my model with non-Copernican large void models criticized by Moss, Zibin and Scott, Phys. Rev. D83 (2011) 103515 = arXiv:1007.3725. The timescape model is not such a model, is not subject to the criticisms made in this paper, and is not ruled out by any other observations.

I am certainly not the first person to suggest that inhomogeneities rather than dark energy may be at the root of what appears as apparent cosmic acceleration (though the exact mechanism by which this is achieved is very different from anyone else's - see next section). Elfmotat mentions Rocky Kolb, who with collaborators generated publicity and controversy in 2005 with an argument based on perturbation theory about a homogeneous cosmology. But in fact people like Marie-Noelle Celerier and Kenji Tomita had already suggested inhomogeneities as an alternative to dark energy, using non-Copernican void models, in 2000.

The large void models are the most well studied in this field, because there is a simple exact solution for the Einstein equations with any spherically symmetric pressureless dust source, which goes back to Lemaitre and Tolman in 1933. The Lemaitre-Tolman (or Lemaitre-Tolman-Bondi LTB) models are of course unlikely philosophically, as well as observationally, since they require us to be near the centre of a universe with a very peculiar spherically inhomogeneous density profile - generally with us near the centre of a large void (though not necessarily, see arXiv:0906.0905) - thereby violating the Copernican principle. Furthermore, the void (or other spherical inhomogeneity) has to be much larger than the typical structures we observe.

What we actually observe is not a single large void but a complex cosmic web of structures, with particular characteristic scales to the inhomogeneity which are much smaller than the toy-model large LTB voids that Moss, Zibin and Scott discredit. Surveys show that about 40% of the volume of the late epoch universe is in voids of 30/h megaparsecs (where H0 = 100h km/s/Mpc is the Hubble constant). A similar fraction is in smaller voids, while clusters of galaxies are contained in filaments that thread the voids and walls that surround them in a cosmic web. The universe is homogeneous in some statistical sense when one averages on scales larger than about 100/h Mpc. These are observed facts which almost everybody agrees on.
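[As an illustrative aside from the editor, not part of David's comment: the h-scaled distances above can be converted to physical megaparsecs once a value of h is assumed. The fiducial h = 0.7, i.e. H0 = 70 km/s/Mpc, is my assumption; the comment leaves h unspecified.]

```python
# Convert h-scaled distances to physical megaparsecs for a fiducial h.
# Assumption: h = 0.7 (H0 = 70 km/s/Mpc); the text leaves h unspecified.
h = 0.7

void_scale_mpc = 30 / h          # typical void scale, ~43 Mpc
homogeneity_scale_mpc = 100 / h  # statistical homogeneity scale, ~143 Mpc

print(f"Void scale:        ~{void_scale_mpc:.0f} Mpc")
print(f"Homogeneity scale: ~{homogeneity_scale_mpc:.0f} Mpc")
```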

The actual inhomogeneity is far too complex to be solved in Einstein's equations, even on a computer. In the standard model one assumes that the universe evolves as if it were a completely homogeneous uniform fluid, with Newtonian gravity perturbations added on top. Since the 30/h Mpc voids are in the "nonlinear regime" (smaller than the statistical homogeneity scale) as far as perturbation theory is concerned, the only way to understand them in the standard model is to run computer simulations with Newtonian gravity + dark matter + dark energy + the uniformly expanding background. Full general relativity is not used; that problem is in the too-hard basket. It's not just a question of computing power - there are fundamental ambiguities about splitting space and time in general relativity which impact on the numerical problem. People have recently learned how to do the two-body problem (e.g. 2 black holes) in numerical relativity; the cosmological problem is far more complex.

What I do is fundamentally different, in that I begin from an approach pioneered by Thomas Buchert, in which one considers the statistical properties of a truly inhomogeneous matter distribution with regions of varying density and varying expansion rates in the fully nonlinear regime. So I do not solve Einstein's field equations directly, but rather Buchert's equations, which describe the average evolution of a statistical ensemble of voids and wall regions. Rather than postulating a hypothetical single large void to have a simple exact solution of Einstein's equations, I am dealing with an approach which accounts for the actual observed inhomogeneity in a statistical sense within general relativity.

2. NEW PHYSICS IN THE TIMESCAPE COSMOLOGY

Going from a statistical description of cosmology to one which refers to our own observations requires additional input. It's like having the ideal gas law on one hand, and then having to use that law to explain physics from the point of view of a molecule of gas. So I certainly add new physics; the crucial physics relates to the relative calibration of rulers and clocks of observers in regions of different density which have decelerated by different amounts.

General relativity is actually not a complete theory. On account of the equivalence principle, gravitational energy is not localizable, and a conserved energy cannot be defined in general. The best one can do is "quasilocal energy"; but the nature of quasilocal energy expressions is something that Einstein struggled with, and many mathematical relativists since, with no general consensus as there are no unique conservation laws. Since the universe is so remarkably simple despite the complex cosmic web of different densities, I think it likely that there are nonetheless simple principles to be found which shed light on this fundamental, but unsolved, part of general relativity. I proceed on that basis.

I do not have time/space to describe my from-first-principles reasoning here. I wrote an FQXi essay a while ago which describes the ideas qualitatively, also available as arXiv:0912.4563. In particular, I treat the relative deceleration of regions of different density - which impacts on the relative kinetic energy of the expansion of space - as a physical degree of freedom in the calibration of clocks. So while the idea that "clocks run slower in denser regions" is familiar in solar system physics or near black holes, what I am dealing with is not the familiar relative position of observers in a static potential but something intrinsically different. It's a new effect.

Since I am proposing a new effect, it's not a case of this being something obvious in general relativity that we just forgot to account for. If that were the case, this would be much more readily accepted by theoretical physicists. Rather, I am extending the basic physical ideas of relativity to a new regime - the complex many-body problem of cosmological averages - where we know there are unsolved problems.

The gravitational energy gradient / clock effect is essential for my solution to the problem of dark energy; when you put the numbers in it would not work otherwise. That is the reason why after toying with different names I chose to call it the "timescape cosmology" to highlight the essential new physics and distinguish my work from other approaches to inhomogeneity, such as large void models.

3. OBSERVATIONAL TESTS, ANOMALIES AND TENSIONS

"Anonymous" very correctly says that alternatives such as mine are taken seriously, and that what matters are observational tests. I have detailed several observational tests in Phys. Rev. D80 (2009) 123512 = arXiv:0909.0749. Some of these involve things which are already tensions for the standard model.

"Elfmotat" mentions that I have pointed out a tension between my dressed Hubble constant and the current value of Riess et al.; I can live with something in the range up to 68 km/s/Mpc (or down to 58 km/s/Mpc), but 72 km/s/Mpc would be too high. I should point out, however, that there are also many observations which provide tensions, or indeed outright anomalies, for the standard Lambda CDM cosmology.

In cosmology it is sometimes said that any model which fits all of the data at any one time is almost certainly wrong, because some of the data is going to change. For example, the standard model has the primordial lithium abundance anomaly, which has existed since the first release of WMAP data in 2003 and which has not gone away. This one is actually very interesting for me, since when I put the numbers in, it is very likely that the timescape model can deal with the lithium abundance anomaly. (This happens because the ratio of baryons to dark matter is naturally increased for a given baryon-to-photon ratio.) For the standard model the lithium abundance anomaly is statistically much more significant than the Hubble constant tension in my case - which is the difference between a "tension" and an "anomaly".

The standard model is accepted despite some outright anomalies, and various other tensions, simply because it fits so many independent pieces of data. Of course, the standard model has been developed over decades with thousands of people working on it, and so it can be subject to many more tests than any other cosmological model. To claim to do better than the standard model on the primordial lithium abundance anomaly, for example, I have to be able to fit all of the Doppler peaks in the CMB anisotropy spectrum to more tightly constrain the baryon-to-photon ratio.

Thus far with the CMB I have only fit the overall angular scale of the Doppler peaks (which itself is a major test). To do the rest is hard because one cannot simply use existing numerical codes written for the standard model, developed over a decade by many people. One has to revisit the whole problem from first principles. That sort of thing takes time. Similarly, techniques used to analyse galaxy clustering statistics are very much based on the standard model; so again to do the tests rigorously requires revisiting the whole observational methodology.

The interesting thing about the timescape cosmology is that quantities such as the luminosity distance are so close to those of the standard model that the differences are at the level of systematic errors. Supernova distances are the test we have studied most, as they are the simplest. Even in this simple case, the fact that supernovae are not purely standard candles but standardizable candles means that the systematics have to be sorted out before one can say whether Lambda CDM or timescape is better (see arXiv:1009.5855). There are two main methods of data reduction, SALT/SALT-II and MLCS2k2: if you use the first method you would say Lambda CDM fits better, and if you use the other you would say timescape fits better.

It boils down to empirical questions such as "is the reddening by dust in other galaxies similar to that of the Milky Way?". I find that it should be in order for timescape to come out better, and indeed independent measurements in other galaxies confirm that the Milky Way reddening parameter R_v = 3.1 is within a standard deviation of the mean R_v = 2.8. Yet some supernova fits to the standard cosmology give R_v = 1.7, which is way off. It is these sorts of nitty-gritty issues that crop up again and again when you try to do any observational cosmology. Astrophysics is basically a dirty empirical subject where you cannot control the conditions of the lab. To distinguish two competing models on many independent tests therefore requires years of work. Some of the most definitive tests, such as the redshift-time drift test discussed in arXiv:0909.0749, actually take decades to do.

The tension in the value of the dressed Hubble constant is actually interesting, because in the timescape model one expects a natural variance of up to 17-22% in the nearby Hubble flow below the scale of statistical homogeneity. This may also impact on the assumptions made in normalizing the distance ladder, which is crucial to determining H0. So there are important systematic astrophysical issues to be resolved. The most interesting tests are in fact nearby, below the scale of statistical homogeneity, where the universe is most inhomogeneous. We have begun looking at this.

Indeed in arXiv:1201.5371 we do a model-independent test, with a finding that is very much at odds with the assumptions in the standard model. The spherically averaged Hubble flow appears to be significantly more uniform in the rest frame of the local group, rather than the assumed rest frame of the CMB. The only way to understand this as far as I can see is that there appears to be a significant non-kinematic component to the CMB dipole due to foreground density gradients from the differential expansion of space; indeed we find a dipole whose strength is correlated with structures at the same scales that generate the spherical (monopole) variation in the Local Group frame. Such a finding supports the timescape model but does not prove it. On the other hand, some astronomers are beginning to collect data for far more ambitious tests of the timescape model: see arXiv:1211.1926.

In short, observational tests are being undertaken. It just takes time.

4. SOCIOLOGY OF SCIENCE IN CHALLENGING A STANDARD MODEL

"Anonymous" is very correct to point out that alternatives to dark energy are seriously considered in the community, and that your portrayal - based largely on the popular account - is simplistic. However, the issue of acceptance of new ideas by the scientific community is a complex one, so the possibility of something being "too big to fail" (sociological inertia) deserves further discussion. What is "too big to fail" here is not dark energy itself, but the Friedmann-Lemaitre models, which have been around since the 1920s and upon which a huge superstructure has been built, absorbing many decades of many people's careers. That is a lot to challenge, and not something you do lightly.

The timescape model is testable, and it actually fits the current data so well that the differences between timescape and Lambda CDM will require years of careful analysis to resolve. But the pace of doing this work is not only limited by the time it takes to do observations. There is also the fact that I and my students are the only ones working on developing the theoretical ideas and analytic tools to enable the timescape cosmology to be better tested. My work is certainly cited, but just because people find it interesting, not because they are working on it. The attitude of my colleagues is most often: "that's an interesting idea, we'll wait and see". Of course there are those who simply do not believe me as well.

Most scientists generally only read in depth those papers which are going to help them write their next paper. Our careers are built on producing papers to pass peer review. To do something totally new requires a big investment of time, and most people want to be convinced that something is going to work before they make that investment.

In theoretical physics, researchers will quickly take to a new topic if they can adapt their existing tools and produce papers quickly. The reason that large voids and other LTB models are the most well-studied inhomogeneous models, with many, many papers written despite being extremely unlikely as physical models of the whole universe, is that they are exact solutions of Einstein's field equations to which existing techniques can be applied. It requires no hard conceptual thinking to produce new papers; just an ability to solve differential equations and a familiarity with the tools of general relativity.

Anything which involves truly new physics is hard to sell, since there are always a lot of ideas on the market. In my case I am going to the fundamentals of hard problems that troubled Einstein and many mathematical relativists since. I am thinking conceptually and using observations as a guide to new physical principles. In reality, the "shut up and calculate" school still dominates. A lot of theoretical physicists shy away from conceptual thought; and like to solve hard mathematical problems within the confines of an accepted theoretical framework. I have given an invited lecture at a big conference and had a famous relativist come up to me afterward with the comment "congratulations, this is the true spirit of relativity" or on another occasion "this is the way that physics should be done". But very few people actually dare to do physics this way; it's too much of a gamble.

I am taking the gamble because I am convinced that the mystery of dark energy (and perhaps even dark matter too) does demand really new physics which was not in the toolbox of tricks that we had invented by the end of the 20th century. And as a relativist I know Einstein never finished the job. I am abandoning the idea that spacetime is a single geometry with a prescribed matter field. Instead I think cosmological spacetime describes a statistical ensemble of geometries, within which the notions of gravitational energy and entropy have to be better defined. It involves new symmetry principles (which I have tried to encapsulate with a "cosmological equivalence principle"). Such symmetry is no doubt related to the initial conditions of the universe and all those thorny questions about quantum gravity. But rather than trying to tackle those problems head on with purely mathematical ingenuity, I think we have to be guided by observations - which often means sorting out very prosaic astrophysical issues to get to the bottom of things.

Physicists who talk about "modified gravity" have generally proceeded from a point of view in which all modifications come about from changing the action, while keeping the geometry simple. Indeed a lot of present-day thinking in cosmology is really still Newtonian (actions and forces), with a simple Euclidean space lurking in the background. (From my point of view, distances on cosmological scales - the 30/h or 100/h Mpc - are a convention adapted to one particular geometry. In a statistical description, when spatial curvature varies greatly and there is no one fixed metric, there are different equivalent metric descriptions. It is all about relative calibration of rulers and clocks.) So while the program I am pursuing is not "modified gravity" in the way it has traditionally been done, i.e., it does not change general relativity on the scale of bound systems on which it is tested, I would be happy to call it "modified geometry".

This is new and uncharted territory. But if "dark energy" is the fundamental mystery that everyone says it is then any real lasting solution is going to be new uncharted territory at some point. You cannot have it both ways. It is entirely reasonable of my colleagues to "wait and see", since I might be wrong. But I am of the view that being a theoretical physicist is not about necessarily being right or wrong, but rather about having the courage to go right back to the conceptual foundations when the observations demand it, and to rigorously ask questions that have never been asked before.

Very interesting discussion. I am not a researcher in physics, but based on my previous knowledge I still reached the conclusion that dark energy is hocus-pocus. Yesterday I posted a thread on a physics forum and some people got upset:

http://www.physicsforums.com/showthread.php?t=669099

They didn't seem too upset -- I've seen a lot worse! I love the physicsforums as a resource. Whenever I have a question about established mainstream physics, and I want the established explanation for something or I want to learn the mainstream perspective, I go to the physicsforums. But it's a terrible place to challenge conventional wisdom or to float an original hypothesis. As eager as the regulars are to help someone with an honest question, they're just as eager to jump on someone who seems to have an agenda or an anti-science bent. Humility, I've learned, serves you well there. Thanks for the comment.

I don't understand the attacks on Eddie or Wiltshire. I personally don't think anything is too big to fail in science, but there might be some ideas that take more time to let go of. I'm all for new discoveries and explanations, and wish anyone searching for truth good luck! Even if Wiltshire cannot confirm his idea, it is possible that something else might get discovered by looking at the universe from a different perspective. Thanks

I thank you for taking the time to respond, David. It appears I was too quick to judge your idea. Based on previous experience with void models, I assumed your idea (which, after skimming through a couple of your papers, looked vaguely similar at first) was going to run into similar problems.

I'll read your papers more closely. At the moment I'm still of the opinion that ΛCDM is right, but perhaps that will change.

Thanks again for the knowledgeable response.

I am not a physicist, but I do have real interest in this topic. I am wondering how David Wiltshire's timescape cosmology deals with the Integrated Sachs Wolfe Effect. I would add that the concept of "dark energy" has bothered me from the time I first heard about it, as it seems very messy and a violation of Occam's Razor. In contrast, a universe perfectly balanced between eternal expansion and eventual contraction seems elegant.

We will present a new approach to the effect known as "dark energy," based on an effect of the gravitational field.

In an accelerating rocket, the dimensions of space along the direction of motion are continuously reduced by Lorentz contraction.

Using the equivalence principle, we presume that the same thing would happen in a gravitational field.

This implies a "dark energy" effect. The calculations show that a contraction of 7% per billion years would explain our observation of galaxies separating at an accelerating rate.
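[Editor's aside, not part of the comment above: as a sanity check on the 7%-per-billion-years figure, a fractional contraction rate can be converted into an equivalent Hubble constant in km/s/Mpc for comparison; treating the contraction rate as an effective expansion rate is my assumption.]

```python
# Convert a fractional rate of 7% per billion years into an equivalent
# Hubble constant (assumption: the contraction rate plays the role of H0).
SEC_PER_GYR = 1e9 * 365.25 * 24 * 3600   # seconds in a billion years
KM_PER_MPC = 3.0857e19                   # kilometres in a megaparsec

rate_per_gyr = 0.07                      # 7% per billion years
rate_per_sec = rate_per_gyr / SEC_PER_GYR
H0_equiv = rate_per_sec * KM_PER_MPC     # km/s per Mpc

print(f"Equivalent H0 ~ {H0_equiv:.0f} km/s/Mpc")  # ~68 km/s/Mpc
```

A rate of 7% per billion years does indeed come out close to the measured Hubble constant, which is presumably why the commenter chose that number.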

Lorentz Contraction

If we suppose that a gravitational field contracts the space around it (including everything within it), we can explain the accelerated separation of galaxies through this contraction, without postulating "dark energy."

The contraction of space produced by gravity would cause a kind of optical illusion: it would seem, as described below, that the galaxies are receding rapidly.

The contraction of space would be equivalent to the relativistic effect that occurs in a spaceship moving at high speed: with respect to an observer in an inertial frame at rest relative to the ship, the ship and everything aboard it have their dimensions contracted along the direction of motion (Lorentz contraction).

This means that the "rulers" (measuring instruments) aboard the ship are shorter than those of the observer outside the moving ship.

The consequence is that, with these shortened rulers, an observer moving with the ship would measure things as larger than the observer outside the ship would.

An accelerated rocket and its continuous contraction

In the same way, if we think of a rocket accelerating to ever higher speeds, its length along the direction of motion, relative to an inertial observer, keeps shrinking, and the rulers aboard the ship decrease continuously as measured by that observer.

We can invoke the equivalence principle to argue that a gravitational field would have the same effect on rulers as acceleration does aboard the rocket, except that now the contraction applies throughout the gravitational field, not only along the direction of acceleration as in the rocket.

That is, the gravitational field would make all rulers within it continuously smaller with respect to an observer outside the field, and this would make observers inside the field see things outside it receding rapidly.

The “dark energy” through gravitational contraction:

Let's consider what happens when light emitted by a star in a distant galaxy arrives at our planet:

Our galaxy, like the distant galaxies, would be continuously contracting due to gravity, as described above.

A photon emitted by a star in this distant galaxy, after leaving that galaxy, would travel through a vast, nearly empty space with little gravitational influence, until it finally arrives at our galaxy and, lastly, at our planet.

During this long journey (sometimes billions of years), the photon would experience little gravitational effect, and its wavelength would be largely unchanged.

However, during this period our system (our rulers) would still be shrinking due to the gravitational field, so when the photon finally arrives here, we would measure its wavelength with a ruler shorter than the one we had at the moment the photon was emitted from its galaxy.

So our measurement would show the photon to be redshifted, because with a shortened ruler we would measure a wavelength longer than the one emitted. The traditional explanation is that this "shift toward the red" is due to the Doppler effect of the galaxies' recession speed!
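[Editor's aside, not part of the comment above: a minimal toy-model sketch of the apparent redshift this mechanism would produce. The mapping z = e^(r·t) − 1, where r is the contraction rate and t the photon's travel time, is my assumption about how cumulative ruler contraction would translate into a measured redshift.]

```python
import math

# Toy model (assumption): rulers shrink at a constant fractional rate r per
# Gyr, so a photon emitted t Gyr ago is measured with a ruler e^(r*t) times
# shorter, giving an apparent redshift z = e^(r*t) - 1.
def apparent_redshift(travel_time_gyr, rate_per_gyr=0.07):
    return math.exp(rate_per_gyr * travel_time_gyr) - 1.0

for t in (1, 5, 10):
    print(f"t = {t:2d} Gyr  ->  z ~ {apparent_redshift(t):.2f}")
```

Under this assumed mapping, a photon in transit for 10 billion years would appear redshifted by z of about 1, roughly the order of magnitude seen for distant supernovae; whether the detailed redshift-distance relation matches observation is exactly what would distinguish this idea from dark energy.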