Guest Post: On the Real Inhomogeneous Universe and the Weirdness of ‘Dark Energy’

A few weeks ago, I mentioned a paper by a colleague of mine, Mohamed Rameez, that generated some discussion. Since I wasn’t up for commenting on the paper’s scientific content, I thought it would be good to give Rameez a chance to explain it in his own words, in a guest post. Here’s what he has to say:

In an earlier post, 4gravitons had contemplated the question of ‘when to trust the contrarians’, in the context of our about-to-be-published paper in which we argue that, once the effects of the bulk flow in the local Universe are accounted for, there is no evidence for any isotropic cosmic acceleration, which would be required to claim some sort of ‘dark energy’.

In the following I would like to emphasize that this is a reasonable view, and not a contrarian one. To do so I will examine the bulk flow of the local Universe and the historical evolution of what appears to be somewhat dodgy supernova data. I will present a trivial solution (from data) to the claimed ‘Hubble tension’.  I will then discuss inhomogeneous cosmology, and the 2011 Nobel prize in Physics. I will proceed to make predictions that can be falsified with future data. I will conclude with some questions that should be frequently asked.

Disclaimer: The views expressed here are not necessarily shared by my collaborators.

The bulk flow of the local Universe:

The largest anisotropy in the Cosmic Microwave Background is the dipole, believed to be caused by our motion with respect to the ‘rest frame’ of the CMB with a velocity of ~369 km s^-1. Under this view, all matter in the local Universe appears to be flowing. At least out to ~300 Mpc, this flow continues to be directionally coherent, to within ~40 degrees of the CMB dipole, and the scale at which the average relative motion between matter and radiation converges to zero has so far not been found.

This is one of the most widely accepted results in modern cosmology, to the extent that SN1a data come pre ‘corrected’ for it.
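For concreteness, here is roughly what that ‘correction’ does, sketched in Python. The dipole velocity and apex coordinates below are approximate values I am assuming for illustration; real pipelines also adjust magnitudes and use flow models beyond the pure dipole:

```python
import math

# Approximate CMB dipole parameters (assumed illustrative values):
# the Sun moves at ~369 km/s toward roughly RA ~ 168 deg, Dec ~ -7 deg (J2000).
V_SUN = 369.0                     # km/s
C = 299792.458                    # km/s
APEX_RA, APEX_DEC = 168.0, -7.0   # degrees

def helio_to_cmb_redshift(z_hel, ra_deg, dec_deg):
    """First-order conversion of a heliocentric redshift to the CMB
    'rest frame': z_cmb ~ z_hel + (v_sun/c) * cos(theta) * (1 + z_hel),
    where theta is the angle between the source and the dipole apex."""
    ra, dec = math.radians(ra_deg), math.radians(dec_deg)
    ra0, dec0 = math.radians(APEX_RA), math.radians(APEX_DEC)
    # Angular separation via the spherical law of cosines.
    cos_theta = (math.sin(dec) * math.sin(dec0)
                 + math.cos(dec) * math.cos(dec0) * math.cos(ra - ra0))
    return z_hel + (V_SUN / C) * cos_theta * (1.0 + z_hel)

# A z = 0.02 supernova exactly at the apex gets shifted by ~0.0013,
# a ~6% change in its redshift -- far from negligible at low z.
print(helio_to_cmb_redshift(0.02, APEX_RA, APEX_DEC))
```

The point to notice is how large the correction is relative to the redshift itself for nearby supernovae.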

Such a flow has covariant consequences under general relativity and this is what we set out to test.

Supernova data, directions in the sky and dodginess:

Both Riess et al 1998 and Perlmutter et al 1999 used samples of supernovae down to redshifts of 0.01, in which almost all SNe at redshifts below 0.1 were in the direction of the flow.

Subsequently, in Astier et al 2006, Kowalski et al 2008, Amanullah et al 2010 and Suzuki et al 2011, it is reported that a process of outlier rejection was adopted in which data points >3$\sigma$ from the Hubble diagram were discarded. This was done using a highly questionable statistical method that involves adjusting an intrinsic dispersion term $\sigma_{\textrm{int}}$ by hand until a $\chi^2/\textrm{ndof}$ of 1 is obtained with respect to the assumed $\Lambda$CDM model. The number of outliers rejected is, however, far in excess of 0.3% – the 3$\sigma$ expectation.

As the sky coverage became less skewed, supernovae with redshifts less than ~0.023 were excluded for being outside the Hubble flow. While the Hubble diagram had so far been inferred from heliocentric redshifts and magnitudes, with the introduction of SDSS supernovae that happened to lie in the direction opposite to the flow, peculiar velocity ‘corrections’ were adopted in the JLA catalogue and supernovae down to extremely low redshifts were reintroduced. While the early claims of a cosmological constant were stated as ‘high redshift supernovae were found to be dimmer (15% in flux) than the low redshift supernovae (compared to what would be expected in a $\Lambda=0$ universe)’, it is worth noting that the peculiar velocity corrections change the redshifts and fluxes of low redshift supernovae by up to ~20%.
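To make the criticized procedure concrete, here is a minimal Python sketch of it on synthetic residuals (not real supernova data): tune $\sigma_{\textrm{int}}$ until $\chi^2/\textrm{ndof} = 1$, then clip at 3$\sigma$. Note the circularity: genuine outliers and model misfit both inflate $\sigma_{\textrm{int}}$, which in turn loosens the clipping threshold.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Hubble-diagram residuals: true scatter 0.15 mag plus a few
# genuine outliers (illustrative numbers, not real supernova data).
resid = rng.normal(0.0, 0.15, size=300)
resid[:6] += rng.choice([-1, 1], 6) * 0.8
err = np.full_like(resid, 0.10)           # quoted measurement errors

def tune_sigma_int(resid, err, tol=1e-4):
    """Bisect on sigma_int until chi^2 per degree of freedom is 1
    for the assumed model -- the hand-tuning criticized in the text."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        chi2_ndof = np.mean(resid**2 / (err**2 + mid**2))
        if chi2_ndof > 1.0:
            lo = mid                       # need more dispersion
        else:
            hi = mid
    return 0.5 * (lo + hi)

sigma_int = tune_sigma_int(resid, err)
total = np.sqrt(err**2 + sigma_int**2)
kept = np.abs(resid) < 3 * total           # then clip '>3 sigma' points
print(f"sigma_int = {sigma_int:.3f}, rejected {np.sum(~kept)} of {resid.size}")
```

The outliers get absorbed into $\sigma_{\textrm{int}}$ before the clipping step, which is exactly why the procedure is statistically questionable.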

When it was observed that even with this ‘corrected’ sample of 740 SNe, any evidence for isotropic acceleration using a principled Maximum Likelihood Estimator is less than 3$\sigma$, it was claimed that by adding 12 additional parameters (to the 10-parameter model) to allow for redshift and sample dependence of the light curve fitting parameters, the evidence was greater than 4$\sigma$.

As we discuss in Colin et al. 2019, these corrections also appear to be arbitrary, and betray an ignorance of the fundamentals of both basic statistical analysis and relativity. With the Pantheon compilation, heliocentric observables were no longer public and these peculiar velocity corrections initially extended far beyond the range of any known flow model of the Local Universe. When this bug was eventually fixed, both the heliocentric redshifts and magnitudes of the SDSS SNe that filled in the ‘redshift desert’ between low and high redshift SNe were found to be alarmingly discrepant. The authors have so far not offered any clarification of these discrepancies.

Thus it seems to me that the latest generation of ‘publicly available’ supernova data are not aiding either open science or progress in cosmology.

A trivial solution to the ‘Hubble tension’?

The apparent tension between the Hubble parameter as inferred from the Cosmic Microwave Background and low redshift tracers has been widely discussed, and recent studies suggest that redshift errors as low as 0.0001 can have a significant impact. Redshift discrepancies as big as 0.1 have been reported. The shifts reported between JLA and Pantheon appear to be sufficient to lower the Hubble parameter from ~73 km s^-1 Mpc^-1 to ~68 km s^-1 Mpc^-1.
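The arithmetic behind that last claim is simple. At low redshift $H_0 \approx cz/d_L$, so at fixed distances a systematic redshift shift propagates directly into the inferred Hubble parameter: $\Delta H_0 / H_0 \approx \Delta z / z$. A back-of-envelope sketch (illustrative numbers only):

```python
# At low redshift, H0 ~ c*z / d_L, so shifting z -> z + dz at fixed
# luminosity distance rescales the inferred Hubble parameter.
def h0_from_shift(h0, z, dz):
    """Hubble parameter re-inferred after a systematic redshift shift dz."""
    return h0 * (z + dz) / z

h0 = 73.0   # km/s/Mpc, the 'local' value
z = 0.03    # a typical low-redshift supernova

# Shift needed to bring 73 down to 68: dz = z*(68/73 - 1) ~ -0.002,
# small compared to the redshift discrepancies quoted above.
dz = z * (68.0 / 73.0 - 1.0)
print(dz, h0_from_shift(h0, z, dz))
```

So redshift shifts at the per-mille level, applied coherently, are already enough to account for the entire ‘tension’.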

On General Relativity, cosmology, metric expansion and inhomogeneities:

In the maximally symmetric Friedmann-Lemaitre-Robertson-Walker solution to general relativity, there is only one meaningful global notion of distance and it expands at the same rate everywhere. However, the late time Universe has structure on all scales, and one may only hope for statistical (not exact) homogeneity. The Universe is expected to be lumpy. A background FLRW metric is not expected to exist and quantities analogous to the Hubble and deceleration parameters will vary across the sky.  Peculiar velocities may be more precisely thought of as variations in the expansion rate of the Universe. At what rate does a real Universe with structure expand? The problems of defining a meaningful average notion of volume, its dynamical evolution, and connecting it to observations are all conceptually open.

On the 2011 Nobel Prize in Physics:

The Fitting Problem in cosmology was set out by George Ellis and William Stoeger in 1987. In the context of this work and the significant theoretical difficulties involved in inferring fundamental physics from the real Universe, any claims of having measured a cosmological constant from directionally skewed, sparse samples of intrinsically scattered observations should have been taken with a grain of salt. By honouring this claim with a Nobel Prize, the Swedish Academy may have induced runaway prestige bias in favour of some of the least principled analyses in science, strengthening the confirmation bias that seems prevalent in cosmology.

This has resulted in the generation of a large body of misleading literature, while normalizing the practice of ‘massaging’ scientific data. In her recent video about gravitational waves, Sabine Hossenfelder says “We should not hand out Nobel Prizes if we don’t know how the predictions were fitted to the data”. What about when the data were fitted (in 1998-1999) using a method that had been discredited in 1989, to a toy model that had been cautioned against in 1987, leading to a ‘discovery’ of profound significance for fundamental physics?

A prediction with future cosmological data:

With the advent of high statistics cosmological data in the future, such as from the Large Synoptic Survey Telescope, I predict that the Hubble and deceleration parameters inferred from supernovae in hemispheres towards and away from the CMB dipole will be found to be different in a statistically significant (>5$\sigma$) way. Depending upon the criterion for selection and blind analyses of data that can be agreed upon, I would be willing to bet a substantial amount of money on this prediction.
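A toy version of this test, with synthetic data and an injected dipole anisotropy (all numbers invented purely for illustration), shows what the analysis would look like:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic low-redshift sample with a built-in dipole in the
# expansion rate. Illustrative numbers only, not real data.
n = 2000
cos_theta = rng.uniform(-1, 1, n)    # cosine of angle to the dipole axis
d = rng.uniform(30, 300, n)          # distances in Mpc
H0_TRUE, DIPOLE = 70.0, 2.0          # km/s/Mpc, injected anisotropy
cz = (H0_TRUE + DIPOLE * cos_theta) * d + rng.normal(0, 200, n)

def fit_h0(mask):
    """Least-squares slope of cz vs d, i.e. the Hubble constant,
    fitted separately in each hemisphere."""
    return np.sum(cz[mask] * d[mask]) / np.sum(d[mask] ** 2)

toward, away = cos_theta > 0, cos_theta <= 0
print(fit_h0(toward), fit_h0(away))   # recovers ~71 vs ~69 here
```

With enough statistics, the hemispherical difference separates cleanly from the noise; that is the essence of the proposed falsifiable test.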

Concluding: on the amusing sociology of ‘Dark Energy’ and manufactured concordance:

Of the two authors of the well-known cosmology textbook ‘The Early Universe’, Edward Kolb writes these interesting papers questioning dark energy while Michael Turner is credited with coining the term ‘Dark Energy’.  Reasonable scientific perspectives have to be presented as ‘Dark Energy without dark energy’. Papers questioning the need to invoke such a mysterious content that makes up ‘68% of the Universe’ are quickly targeted by inane articles by non-experts or perhaps well-meant but still misleading YouTube videos. Much of this is nothing more than a spectacle.

In summary, while the theoretical debate about whether what has been observed as dark energy is the effect of inhomogeneities is ongoing, observers appear to have been actively using the most inhomogeneous feature of the local Universe, through opaque corrections to the data, to continue claiming that this ‘dark energy’ exists.

It is heartening to see that recent works lean toward a breaking of this manufactured concordance and speak of a crisis for cosmology.

Questions that should be frequently asked:

Q. Is there a Hubble frame in the late time Universe?

A. The Hubble frame is a property of the FLRW exact solution, and in the late time Universe in which galaxies and clusters have peculiar motions with respect to each other, an equivalent notion does not exist. While popular inference treats the frame in which the CMB dipole vanishes as the Hubble frame, the scale at which the bulk flow of the local Universe converges to that frame has never been found. We are tilted observers.

Q. I am about to perform blinded analyses on new cosmological data. Should I correct all my redshifts towards the CMB rest frame?

A. No. Correcting all your redshifts towards a frame that has never been found is a good way to end up with ‘dark energy’. It is worth noting that while the CMB dipole has been known since 1994, supernova data have been corrected towards the CMB rest frame only after 2010, for what appear to be independent reasons.

Q. Can I combine new data with existing Supernova data?

A. No. The current generation of publicly available supernova data suffer from the natural biases that are to be expected when data are compiled incrementally through a human mediated process. It would be better to start fresh with a new sample.

Q. Is ‘dark energy’ fundamental or new physics?

A. Given that general relativity is a 100+ year old theory and significant difficulties exist in describing the late time Universe with it, it is unnecessary to invoke new fundamental physics when confronting any apparent acceleration of the real Universe. All signs suggest that what has been ascribed to dark energy is the result of a community that is hell bent on repeating what Einstein supposedly called his greatest mistake.

Digging deeper:

The inquisitive reader may explore the resources on inhomogeneous cosmology, as well as the works of George Ellis, Thomas Buchert and David Wiltshire.

Still Traveling, and a Black Hole

I’m still at the conference in Natal this week, so I don’t have time for a long post. The big news this week was the Event Horizon Telescope’s close-up of the black hole at the center of galaxy M87. If you’re hungry for coverage of that, Matt Strassler has some of his trademark exceptionally clear posts on the topic, while Katie Mack has a nice twitter thread.

Cosmology, or Cosmic Horror?

Around Halloween, I have a tradition of posting about the “spooky” side of physics. This year, I’ll be comparing two no doubt often confused topics, Cosmic Horror and Cosmology.

Pro tip: if this guy shows up, it’s probably Cosmic Horror

Cosmic Horror | Cosmology
--- | ---
Started in the 1920’s with the work of Howard Phillips Lovecraft | Started in the 1920’s with the work of Alexander Friedmann
Unimaginably ancient universe | Precisely imagined ancient universe
In strange ages even death may die | Strange ages, what redshift is that?
An expedition to Antarctica uncovers ruins of a terrifying alien civilization | An expedition to Antarctica uncovers…actually, never mind, just dust
Alien beings may propagate in hidden dimensions | Gravitons may propagate in hidden dimensions
Cultists compete to be last to be eaten by the Elder Gods | Grad students compete to be last to realize there are no jobs
Oceanic “deep ones” breed with humans | Have you seen daycare costs in a university town? No way.
Variety of inventive and bizarre creatures, inspiring libraries worth of copycat works | Fritz Zwicky
Hollywood adaptations are increasingly popular, not very faithful to source material | Actually this is exactly the same
Can waste hours on an ultimately fruitless game of Arkham Horror | Can waste hours on an ultimately fruitless argument with Paul Steinhardt
No matter what we do, eventually Azathoth will kill us all | No matter what we do, eventually vacuum decay will kill us all

The Physics Isn’t New, We Are

Last week, I mentioned the announcement from the IceCube, Fermi-LAT, and MAGIC collaborations of high-energy neutrinos and gamma rays detected from the same source, the blazar TXS 0506+056. Blazars are sources of gamma rays, thought to be enormous spinning black holes that act like particle colliders vastly more powerful than the LHC. This one, near Orion’s elbow, is “aimed” roughly at Earth, allowing us to detect the light and particles it emits. On September 22, 2017, a neutrino with energy around 300 TeV was detected by IceCube (a kilometer-wide block of Antarctic ice stuffed with detectors), coming from the direction of TXS 0506+056. Soon after, the satellite Fermi-LAT and ground-based telescope MAGIC were able to confirm that the blazar TXS 0506+056 was flaring at the time. The IceCube team then looked back, and found more neutrinos coming from the same source in earlier years. There are still lingering questions (Why didn’t they see this kind of behavior from other, closer blazars?) but it’s still a nice development in the emerging field of “multi-messenger” astronomy.

It also got me thinking about a conversation I had a while back, before one of Perimeter’s Public Lectures. An elderly fellow was worried about the LHC. He wondered if putting all of that energy in the same place, again and again, might do something unprecedented: weaken the fabric of space and time, perhaps, until it breaks? He acknowledged this didn’t make physical sense, but what if we’re wrong about the physics? Do we really want to take that risk?

At the time, I made the same point that gets made to counter fears of the LHC creating a black hole: that the energy of the LHC is less than the energy of cosmic rays, particles from space that collide with our atmosphere on a regular basis. If there was any danger, it would have already happened. Now, knowing about blazars, I can make a similar point: there are “galactic colliders” with energies so much higher than any machine we can build that there’s no chance we could screw things up on that kind of scale: if we could, they already would have.
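The comparison is easy to quantify. For a cosmic-ray proton striking a proton at rest in the atmosphere, the centre-of-mass energy is roughly $\sqrt{s} \approx \sqrt{2 E m_p c^2}$, so even the LHC’s 13 TeV is matched by cosmic rays around $10^{17}$ eV, far below the highest energies ever observed:

```python
import math

M_P = 0.938e9     # proton rest energy in eV
LHC_S = 13.0e12   # LHC centre-of-mass energy in eV (13 TeV)

def sqrt_s_fixed_target(e_beam_ev):
    """Centre-of-mass energy for an ultra-relativistic proton hitting
    a proton at rest: sqrt(s) ~ sqrt(2 * E_beam * m_p c^2)."""
    return math.sqrt(2.0 * e_beam_ev * M_P)

# Cosmic-ray energy whose atmospheric collisions match the LHC:
e_match = LHC_S**2 / (2.0 * M_P)   # about 9e16 eV
# The most energetic cosmic rays seen (~3e20 eV) reach far beyond:
print(e_match, sqrt_s_fixed_target(3e20) / 1e12, "TeV")
```

So nature has been running LHC-scale (and far beyond LHC-scale) collisions on our atmosphere for billions of years.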

This connects to a broader point, about how to frame particle physics. Each time we build an experiment, we’re replicating something that’s happened before. Our technology simply isn’t powerful enough to do something truly unprecedented in the universe: we’re not even close! Instead, the point of an experiment is to reproduce something where we can see it. It’s not the physics itself, but our involvement in it, our understanding of it, that’s genuinely new.

The IceCube experiment itself is a great example of this: throughout Antarctica, neutrinos collide with ice. The only difference is that in IceCube’s ice, we can see them do it. More broadly, I have to wonder how much this is behind the “unreasonable effectiveness of mathematics”: if mathematics is just the most precise way humans have to communicate with each other, then of course it will be effective in physics, since the goal of physics is to communicate the nature of the world to humans!

There may well come a day when we’re really able to do something truly unprecedented, that has never been done before in the history of the universe. Until then, we’re playing catch-up, taking laws the universe has tested extensively and making them legible, getting humanity that much closer to understanding physics that, somewhere out there, already exists.

Bubbles of Nothing

I recently learned about a very cool concept, called a bubble of nothing.

Read about physics long enough, and you’ll hear all sorts of cosmic disaster scenarios. If the Higgs vacuum decays, and the Higgs field switches to a different value, then the masses of most fundamental particles would change. It would be the end of physics, and life, as we know it.

A bubble of nothing is even more extreme. In a bubble of nothing, space itself ceases to exist.

The idea was first explored by Witten in 1982. Witten started with a simple model, a world with our four familiar dimensions of space and time, plus one curled-up extra dimension. What he found was that this simple world is unstable: quantum mechanics (and, as was later found, thermodynamics) lets it “tunnel” to another world, one that contains a small “bubble”, a sphere in which nothing at all exists.

Except perhaps the Nowhere Man

A bubble of nothing might sound like a black hole, but it’s quite different. Throw a particle into a black hole and it will fall in, never to return. Throw it into a bubble of nothing, though, and something more interesting happens. As you get closer, the extra dimension of space gets smaller and smaller. Eventually, it stops, smoothly closing off. The particle you threw in will just bounce back, smoothly, off the outside of the bubble. Essentially, it reached the edge of the universe.

The bubble starts out small, comparable to the size of the curled-up dimension. But it doesn’t stay that way. In Witten’s setup, the bubble grows, faster and faster, until it’s moving at the speed of light, erasing the rest of the universe from existence.

You probably shouldn’t worry about this happening to us. As far as I’m aware, nobody has written down a realistic model that can transform into a bubble of nothing.

Still, it’s an evocative concept, and one I’m surprised isn’t used more often in science fiction. I could see writers using a bubble of nothing as a risk from an experimental FTL drive, or using a stable (or slowly growing) bubble as the relic of some catastrophic alien war. The idea of a bubble of literal nothing is haunting enough that it ought to be put to good use.

A LIGO in the Darkness

For the few of you who haven’t yet heard: LIGO has detected gravitational waves from a pair of colliding neutron stars, and that detection has been confirmed by observations of the light from those stars.

The LIGO collaboration also provides a handy fact sheet.

This is a big deal! On a basic level, it means that we now have confirmation from other instruments and sources that LIGO is really detecting gravitational waves.

The implications go quite a bit further than that, though. You wouldn’t think that just one observation could tell you very much, but this is an observation of an entirely new type, the first time an event has been seen in both gravitational waves and light.

That, it turns out, means that this one observation clears up a whole pile of mysteries in one blow. It shows that at least some gamma ray bursts are caused by colliding neutron stars, that neutron star collisions can give rise to the high-power “kilonovas” capable of forming heavy elements like gold…well, I’m not going to be able to do justice to the full implications in this post. Matt Strassler has a pair of quite detailed posts on the subject, and Quanta magazine’s article has a really great account of the effort that went into the detection, including coordinating the network of telescopes that made it possible.

I’ll focus here on a few aspects that stood out to me.

One fun part of the story behind this detection was how helpful “failed” observations were. VIRGO (the European gravitational wave experiment) was running alongside LIGO at the time, but VIRGO didn’t see the event (or saw it so faintly it couldn’t be sure it saw it). This was actually useful, because VIRGO has a blind spot, and VIRGO’s non-observation told them the event had to have happened in that blind spot. That narrowed things down considerably, and allowed telescopes to close in on the actual merger. IceCube, the neutrino observatory that is literally a cubic kilometer chunk of Antarctica filled with sensors, also failed to detect the event, and this was also useful: along with evidence from other telescopes, it suggests that the “jet” of particles emitted by the merged neutron stars is tilted away from us.

One thing brought up at LIGO’s announcement was that seeing gravitational waves and electromagnetic light at roughly the same time puts limits on any difference between the speed of light and the speed of gravity. At the time I wondered if this was just a throwaway line, but it turns out a variety of proposed modifications of gravity predict that gravitational waves will travel slower than light. This event rules out many of those models, and tightly constrains others.
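The bound follows from simple arithmetic: if gravity and light raced each other for over a hundred million years and arrived within about two seconds, their speeds can differ by at most a few parts in $10^{16}$. A rough sketch, using approximate numbers of the kind quoted at the announcement:

```python
# Back-of-envelope bound on the speed difference between gravity and
# light (approximate numbers): a ~1.7 s gamma-ray delay after a
# ~130 million light-year trip.
SECONDS_PER_YEAR = 3.156e7
travel_time = 1.3e8 * SECONDS_PER_YEAR   # light travel time in seconds
delay = 1.7                              # observed arrival delay in seconds

bound = delay / travel_time
print(f"|v_gw - c| / c < ~{bound:.1e}")  # of order 1e-16
```

A single event already pins down the speed of gravity to a precision few laboratory experiments could dream of.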

The announcement from LIGO was screened at NBI, but they didn’t show the full press release. Instead, they cut to a discussion for local news featuring NBI researchers from the various telescope collaborations that observed the event. Some of this discussion was in Danish, so it was only later that I heard about the possibility of using the simultaneous measurement of gravitational waves and light to measure the expansion of the universe. While this event by itself didn’t result in a very precise measurement, as more collisions are observed the statistics will get better, which will hopefully clear up a discrepancy between two previous measures of the expansion rate.

A few news sources made it sound like observing the light from the kilonova has let scientists see directly which heavy elements were produced by the event. That isn’t quite true, as stressed by some of the folks I talked to at NBI. What is true is that the light was consistent with patterns observed in past kilonovas, which are estimated to be powerful enough to produce these heavy elements. However, actually pointing out the lines corresponding to these elements in the spectrum of the event hasn’t been done yet, though it may be possible with further analysis.

A few posts back, I mentioned a group at NBI who had been critical of LIGO’s data analysis and raised doubts of whether they detected gravitational waves at all. There’s not much I can say about this until they’ve commented publicly, but do keep an eye on the arXiv in the next week or two. Despite the optimistic stance I take in the rest of this post, the impression I get from folks here is that things are far from fully resolved.

Congratulations to Rainer Weiss, Barry Barish, and Kip Thorne!

The Nobel Prize in Physics was announced this week, awarded to Rainer Weiss, Kip Thorne, and Barry Barish for their work on LIGO, the gravitational wave detector.

Many expected the Nobel to go to LIGO last year, but the Nobel committee waited. At the time, it was expected the prize would be awarded to Rainer Weiss, Kip Thorne, and Ronald Drever, the three founders of the LIGO project, but there were advocates for Barry Barish as well. Traditionally, the Nobel is awarded to at most three people, so the argument got fairly heated, with opponents arguing Barish was “just an administrator” and advocates pointing out that he was “just the administrator without whom the project would have been cancelled in the 90’s”.

All of this ended up being irrelevant when Drever died last March. The Nobel isn’t awarded posthumously, so the list of obvious candidates (or at least obvious candidates who worked on LIGO) was down to three, which simplified things considerably for the committee.

LIGO’s work is impressive and clearly Nobel-worthy, but I would be remiss if I didn’t mention that there is some controversy around it. In June, several of my current colleagues at the Niels Bohr Institute uploaded a paper arguing that if you subtract the gravitational wave signal that LIGO claims to have found then the remaining data, the “noise”, is still correlated between LIGO’s two detectors, which it shouldn’t be if it were actually just noise. LIGO hasn’t released an official response yet, but a LIGO postdoc responded with a guest post on Sean Carroll’s blog, and the team at NBI had responses of their own.
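For intuition, here is a sketch of that kind of residual check; this is my own toy illustration, not the NBI group’s actual pipeline. Subtract the template from each detector’s strain, then cross-correlate the residuals over a range of time lags: independent noise shows no preferred lag, while a common leftover component does.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic 'residuals' from two detectors: independent noise plus a
# weak shared component offset by 7 samples (standing in for an
# imperfectly subtracted signal). Illustrative numbers only.
n = 4096
common = rng.normal(0, 1, n)
res1 = 0.3 * common + rng.normal(0, 1, n)
res2 = 0.3 * np.roll(common, 7) + rng.normal(0, 1, n)

def xcorr(a, b, max_lag):
    """Normalized cross-correlation of two series at integer lags."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return {lag: np.mean(a * np.roll(b, -lag))
            for lag in range(-max_lag, max_lag + 1)}

corr = xcorr(res1, res2, 20)
best = max(corr, key=lambda k: abs(corr[k]))
print(best, corr[best])   # the correlation peaks at the injected lag
```

If the residuals were pure noise, the peak would be at the level of $1/\sqrt{N}$ and at no particular lag; a clear peak at a fixed lag is what raised the NBI group’s concern.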

I’d usually be fairly skeptical of this kind of argument: it’s easy for an outsider looking at the data from a big experiment like this to miss important technical details that make the collaboration’s analysis work. That said, having seen some conversations between these folks, I’m a bit more sympathetic. LIGO hadn’t been communicating very clearly initially, and it led to a lot of unnecessary confusion on both sides.

One thing that I don’t think has been emphasized enough is that there are two claims LIGO is making: that they detected gravitational waves, and that they detected gravitational waves from black holes of specific masses at a specific distance. The former claim could be supported by the existence of correlated events between the detectors, without many assumptions as to what the signals should look like. The team at NBI seem to have found a correlation of that sort, but I don’t know if they still think the argument in that paper holds given what they’ve said elsewhere.

The second claim, that the waves were from a collision of black holes with specific masses, requires more work. LIGO compares the signal to various models, or “templates”, of black hole events, trying to find one that matches well. This is what the group at NBI subtracts to get the noise contribution. There’s a lot of potential for error in this sort of template-matching. If two templates are quite similar, it may be that the experiment can’t tell the difference between them. At the same time, the individual template predictions have their own sources of uncertainty, coming from numerical simulations and “loops” in particle physics-style calculations. I haven’t yet found a clear explanation from LIGO of how they take these various sources of error into account. It could well be that even if they definitely saw gravitational waves, they don’t actually have clear evidence for the specific black hole masses they claim to have seen.

I’m sure we’ll hear more about this in the coming months, as both groups continue to talk through their disagreement. Hopefully we’ll get a clearer picture of what’s going on. In the meantime, though, Weiss, Barish, and Thorne have accomplished something impressive regardless, and should enjoy their Nobel.