The Problem of Quantum Gravity Is the Problem of High-Energy (Density) Quantum Gravity

I’ve said something like this before, but here’s another way to say it.

The problem of quantum gravity is one of the most famous problems in physics. You’ve probably heard someone say that quantum mechanics and general relativity are fundamentally incompatible. Most likely, this was narrated over pictures of a foaming, fluctuating grid of space-time. Based on that, you might think that all we have to do to solve this problem is to measure some quantum property of gravity. Maybe we could make a superposition of two different gravitational fields, see what happens, and solve the problem that way.

I mean, we could do that; some people are trying to. But it won’t solve the problem. That’s because the problem of quantum gravity isn’t just the problem of quantum gravity. It’s the problem of high-energy quantum gravity.

Merging quantum mechanics and general relativity is actually pretty easy. General relativity is a big conceptual leap, certainly, a theory in which gravity is really just the shape of space-time. At the same time, though, it’s also a field theory, the same general type of theory as electromagnetism. It’s a weirder field theory than electromagnetism, to be sure, one with deeper implications. But if we want to describe low energies, and weak gravitational fields, then we can treat it just like any other field theory. We know how to write down some pretty reasonable-looking equations, we know how to do some basic calculations with them. This part is just not that scary.

The scary part happens later. The theory we get from these reasonable-looking equations continues to look reasonable for a while. It gives formulas for the probability of things happening: things like gravitational waves bouncing off each other, as they travel through space. The problem comes when those waves have very high energy, and the nice reasonable probability formula now says that the probability is greater than one.

For those of you who haven’t taken a math class in a while, probabilities greater than one don’t make sense. A probability of one is a certainty, something guaranteed to happen. A probability greater than one isn’t more certain than certain, it’s just nonsense.

So we know something needs to change, we know we need a new theory. But we only know we need that theory when the energy is very high: when it reaches the Planck energy. Below that energy, there might still be a different theory, but there might not: it’s not a “problem” yet.
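
(For the more mathematically inclined, here’s a rough sketch of where that statement comes from, with all numerical factors and angular details dropped. The strength of the gravitational interaction between two colliding waves or particles with center-of-mass energy E is controlled by the dimensionless combination

$$\frac{G E^2}{\hbar c^5} = \left(\frac{E}{E_{\text{Planck}}}\right)^2, \qquad E_{\text{Planck}} = \sqrt{\frac{\hbar c^5}{G}}.$$

The probabilities coming out of those reasonable-looking equations grow with this combination, so once E approaches the Planck energy the leading-order formulas start predicting probabilities bigger than one.)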

Now, a few of you understand this part, but still have a misunderstanding. The Planck energy seems high for particle physics, but it isn’t high in an absolute sense: it’s about the energy in a tank of gasoline. Does that mean that all we have to do to measure quantum gravity is to make a quantum state out of your car?
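
(If you want to check that comparison, here’s a back-of-the-envelope version, assuming a roughly 57-litre tank and a typical figure of about 34 MJ per litre for gasoline, so treat the numbers as ballpark only:

$$E_{\text{Planck}} = \sqrt{\frac{\hbar c^5}{G}} \approx 2 \times 10^{9}\ \text{J}, \qquad E_{\text{tank}} \approx 57\ \text{L} \times 34\ \text{MJ/L} \approx 1.9 \times 10^{9}\ \text{J}.$$

So a full tank really is in the same ballpark as the Planck energy.)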

Again, no. That’s because the problem of quantum gravity isn’t just the problem of high-energy quantum gravity either.

Energy seems objective, but it’s not. It’s subjective, or more specifically, relative. Due to special relativity, observers moving at different speeds observe different energies. Because of that, high energy alone can’t be the requirement: it isn’t something either general relativity or quantum field theory can “care about” by itself.

Instead, the real thing that matters is something that’s invariant under special relativity. This is hard to define in general terms, but it’s best to think of it as a requirement not for energy, but for energy density.

(For the experts: I’m justifying this phrasing in part because of how you can interpret the quantity appearing in energy conditions as the energy density measured by an observer. This still isn’t the correct way to put it, but I can’t think of a better way that would be understandable to a non-technical reader. If you have one, let me know!)

Why do we need quantum gravity to fully understand black holes? Not just because they have a lot of mass, but because they have a lot of mass concentrated in a small area, a high energy density. Ditto for the Big Bang, when the whole universe had a very large energy density. Particle colliders are useful not just because they give particles high energy, but because they give particles high energy and put them close together, creating a situation with very high energy density.
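
(For those who want a rough number to attach to “very high”, and keeping in mind the caveats above about this being an imperfect way to phrase the requirement: the benchmark is roughly the Planck energy packed into a Planck-sized volume,

$$\rho_{\text{Planck}} \sim \frac{c^7}{\hbar G^2} \approx 5 \times 10^{113}\ \text{J/m}^3,$$

which is enormously far beyond any energy density we can currently produce or directly observe.)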

Once you understand this, you can use it to think about whether some experiment or observation will help with the problem of quantum gravity. Does the experiment involve very high energy density, much higher than anything we can do in a particle collider right now? Is that telescope looking at something created in conditions of very high energy density, or just something nearby?

It’s not impossible for an experiment that doesn’t meet these conditions to find something. Whatever the correct quantum gravity theory is, it might be different from our current theories in a more dramatic way, one that’s easier to measure. But the only guarantee, the only situation where we know we need a new theory, is for very high energy density.

21 thoughts on “The Problem of Quantum Gravity Is the Problem of High-Energy (Density) Quantum Gravity”

  1. Andrew

    Interesting post! There are also of course low-energy problems in quasi-CFT gauge theories like QCD in which it appears that probabilities surpass unity. However, in that case, it is more natural to reinterpret the probability of a gluon emission, say, as an expectation value of the number of gluons emitted. Is there something like that for quantum gravity, though in the UV?

    From the perspective of your post, does QED also have a UV problem? QED has a Landau pole in the UV and so technically is trivial.

    Regarding the issue of “energy density”, the lowest energy density at which any observer would see a problem with quantum gravity would be in the proper frame. So that is the minimal energy density at which quantum gravity must be resolved somehow.

    1. 4gravitons Post author

      There isn’t something like that picture of QCD for gravity, no. It’s a big part of why UV problems are taken as more worrying than IR problems. It would be cool if there was some UV/IR correspondence that let the same solution work in both directions, but nobody has found anything like that.

      QED has a different kind of UV problem (the Landau pole doesn’t actually violate unitarity IIRC, no probabilities greater than one), but I think most people would say it has a UV problem, yeah. Though I have seen some weird recent papers arguing that this isn’t actually a problem.

      “Proper frame energy density” is a great way to clarify the concept, yeah. It maps pretty decently to the QFT specification of the problem, “center of mass energy”.

  2. Andrei

    “You’ve probably heard someone say that quantum mechanics and general relativity are fundamentally incompatible.”

    Indeed.

    “Based on that, you might think that all we have to do to solve this problem is to measure some quantum property of gravity. Maybe we could make a superposition of two different gravitational fields, see what happens, and solve the problem that way.”

    I think the relevant experiment is already done and we know what the problem is. The experiment is the EPR-Bohm one. This experiment tells us that a fundamental probabilistic / indeterministic theory must be non-local. This is the problem.

    “The problem comes when those waves have very high energy, and the nice reasonable probability formula now says that the probability is greater than one.”

    There is nothing surprising here. GR is local; a probabilistic theory must be non-local. This is why those probabilities don’t make sense in the context of GR.

    The solution is straightforward. Want a probabilistic theory? Fine, but use a spacetime that can accommodate non-locality, like Newton’s absolute space and time. Replacing SR should be easy, since the Lorentz ether theory gives similar predictions. It seems that it is possible to replace GR as well.

    Ilja Schmelzer published this:

    Generalization Of Lorentz-Poincare Ether Theory To Quantum Gravity

    (Preprint: 9706055.pdf)

    You can also take the other route: find a deterministic interpretation of QM and then combine it with GR. This interpretation will not have probabilities, obviously, so they wouldn’t get larger than one.

    1. 4gravitons Post author

      The type of nonlocality of QFT you’re concerned with has nothing to do with the unitarity violation/probabilities greater than one in quantum gravity. The kind of nonlocality people invoke to solve the problem of quantum gravity is the other one, the one focused purely on observables. Bohmian quantum gravity (to the extent that Bohmian QFT works at all) doesn’t help you get a theory that works to high energies.

      Regarding your last comment, I do wonder if there’s some pathology visible from a “classical” perspective, where instead of writing down probabilities you write down some “classical observable” derived from those probabilities. Can you see unitarity violation in the KMOC framework?

      1. TwoBs

        About a pathology visible from the classical perspective, one can prove that negative time delay in the eikonal scattering (hence an observable time advance that survives in the classical limit) requires breaking either microcausality or positivity of probabilities (technically, positivity of the imaginary part of the scattering phase) of the underlying quantum theory.

        I can’t resist adding, however, that I really don’t understand the success of KMOC; it looks sort of trivial or like common knowledge to particle physicists, and I couldn’t learn anything out of it. So I am probably missing something about it, or the audience was mostly classical general relativists.

        1. 4gravitons Post author

          Thanks! Yeah, I hadn’t thought to connect the positivity/time advance story here, but of course that’s a useful way to think about it. Though it sounds like the implication goes the wrong way, right? You have “time advance” -> “unitarity/locality violation” but not “unitarity/locality violation”->”time advances”, which is what would be needed to argue that naive QG inevitably leads to pathologies visible from the classical perspective. But maybe I’m misunderstanding you here. Does eikonal scattering in classical GR have observable time advances?

          WRT KMOC, I think the point was less to do something revolutionary and more to establish a formalism Schelling point for the amplitudes-GW community. Even if it’s the kind of thing everybody in particle physics could in principle do, there’s enough finicky details that it makes more sense for one group to do it right and others to use that language. (Incidentally, I can see your email address from the WordPress back-end, which makes me suspect you’ve had the chance to talk to some of the authors about this. 😉 )

          1. TwoBs

            The implication goes the way you say. I just meant to say that this suggests where to look for the inconsistencies of a UV theory that may be visible in the IR, and in the case at hand, in a classical observable in the IR.

            Probably this was too tangential to the goal of your post, however; sorry about that. To almost everyone, naive QG is just an EFT, basically by dimensional analysis, since the 4-point amplitudes scale as GE^2. Therefore it obviously stops making sense at high energy. (As a further aside, the bounds of the time-delay type, or the positivity bounds, are much stronger, or more detailed, arguments than perturbative unitarity, say, because the amplitudes never have to get large in order to see a pathology in the underlying theory; the bounds can be used to exclude theories at scales where they are weakly coupled.)

            “Does eikonal scattering in classical GR have observable time advances?”

            No, it doesn’t. However, thinking of classical GR as just the classical limit of a quantum EFT (as one should), one expects higher-derivative terms such as e.g. Rie^3 to be there too, and they can be relevant in the appropriate regime (say, when scattering black holes of size much larger than the Planck length, while still smaller than the Compton wavelength of some particle that, when integrated out, generates the (+++) 3-point function).
            This GR + higher-derivative terms does generate an observable time advance unless new physics kicks in at the scale hinted at by those higher-D operators, as nicely explained in the CEMZ paper. Of course this is too far from your original intent in this post; sorry about that again, I will refrain from further comments.

            Never mind the KMOC; it’s a nice paper indeed, very clear for sure. I was just a bit surprised by the reaction of the community, which I found a bit disproportionate. So I found myself wondering if I had perhaps missed something, or rather if this was one of those examples that can tell us about social behavior in our community.

            1. 4gravitons Post author

              Yeah indeed, the point I was making in this post is precisely that the four-point amplitude grows with the square of the energy, and that this guarantees that it isn’t going to keep making sense, and thus one expects new physics (leading to higher-dimension operators eventually, etc.)

              I was mostly just curious if there were some way to phrase the issue that doesn’t invoke amplitudes or probabilities, but rather observables with a clear classical limit. But there’s no guarantee such a thing exists I suppose.

              1. TwoBs

                Yeah, it’s tricky, which makes it an interesting question.

                One could naively think that saying M~G E^2, plus the fact that there is a long-range force (so that M actually contains a t-channel pole, schematically M~G E^2/(1-cos\theta)), would imply that the classical scattering phase and the scattering angle theta would grow with energy and thus stop making sense at large energy or small impact parameter, i.e. theta~R_s/b~1.
                But this would be nonsense, because the amplitude gets large precisely because of the classical black-hole formation that kicks in, not because something bad is happening to GR (I am assuming a string length much smaller than R_s). Actually it is the other way around: it’s non-linear GR at its best that is happening.

  3. Andrei

    4gravitons,

    I think that there is only one type of nonlocality, the one associated with collapse. Since almost all measurements involve such a collapse, and since observables are measurement results, it follows that this nonlocality would appear in any context. That means that in order to get meaningful results you need a framework that can accommodate such non-local behavior. So, I would say that the fact that quantum gravity does not work is not as surprising as the fact that the standard model works.

    I guess that a possible explanation for the success of the standard model is that SR can be interpreted as a Lorentz ether theory (as far as I know the two theories are equivalent). So, I guess, the correct way to present the SM is not as a relativistic theory, but as a non-local theory on a Newtonian background. The absolute reference frame would be the frame in which the ether is at rest.

    As far as I know the trick does not work in GR, since no equivalent of GR on a Newtonian background is known. So, a non-local QM does not work and cannot work on such a background. I am curious here: doesn’t string theory get the job done? It shouldn’t.

    I do not understand the question about KMOC. If you can’t calculate the probabilities in the first place how can you derive a classical observable from them?

    1. 4gravitons Post author

      The two types of nonlocality involve not just different proposed explanations, but different behavior, so your idea doesn’t work: one of the types of nonlocality doesn’t bring about the other. (You’re making it sound like you don’t quite get the difference between the concepts; I thought that was something you understood, based on past interactions.)

      The KMOC thing was mostly a note for myself, and was in the context of “naive quantum gravity”, not a putative classical explanation of quantum phenomena.

      1. Andrei

        Let me restate my argument in a clearer way so that you can point out exactly where my confusion comes from.

        1. The argument based on the EPR-Bohm setup proves that a measurement in a fundamentally probabilistic QM requires a non-local collapse.
        2. A non-local collapse requires an absolute frame of reference.
        3. GR, in its current formulation, does not allow for an absolute frame of reference.

        C1. From 1, 2 and 3 it follows that the concept of measurement in GR is incoherent.

        4. Any observable is associated with a measurement.

        C2. From C1 and 4 it follows that the concept of an observable in GR is incoherent.

        If you agree with C2 it means that all concerns about creating black-holes and such are irrelevant. You need to first make sure that measurements are well defined (by either changing GR to accommodate an absolute frame, or changing QM by adding deterministic hidden variables). Once this is done it remains to be seen if the black-hole problem persists or not.

        1. 4gravitons Post author

          Nobody here has mentioned a black-hole problem as of yet. But assuming that you meant the problem I discussed in the post/the problems TwoBs mentions, neither of which involve black holes, then the problem with your phrasing is this one:

          “GR, in the current formulation does not allow for an absolute frame of reference.”

          This is true, but modifications of GR can break this property. People have studied such modifications. They don’t automatically solve the problem of quantum gravity: just adding an absolute frame of reference is insufficient. (We could in principle argue about whether it’s necessary, but that would boil down to your usual arguments about QM, and our discussions about those don’t tend to be productive.)

  4. ohwilleke

    Would the problem be solvable if there were a physical law that established a maximum mass-energy density (perhaps in a manner analogous to the maximum velocity being the speed of light), with the cap sufficiently low that we just happened not to have discovered it yet because the world of our experience is so distant from the limit?

    1. 4gravitons Post author

      That ends up violating special relativity. That doesn’t mean it’s impossible; there are people looking into this under the name Doubly Special Relativity. But it’s difficult to do this in a way that’s both mathematically consistent and consistent with existing experiments, which together makes violating special relativity pretty hard.

      (One way to think about this: the maximum speed in special relativity comes from a space-time symmetry, Lorentz invariance. But you can prove that our list of space-time symmetries is in some sense complete: supersymmetry was a surprising loophole, but kind of the exception that proved the rule, since it involved symmetries between different types of particles, and further attempts to generalize the notion have involved things like different numbers of particles. So you can’t do the doubly-special relativity thing with a symmetry, which means it’s hard to consistently express what you’re doing.)

        1. 4gravitons Post author

          I don’t know a ton about it. I vaguely remember hearing it had some pathology (I want to say causality violation, which might make sense if it can have fully classical traversable wormholes, as that review suggests), but that may only be true of a specific version. Regardless, it’s not quite the “extra limit of nature” thing. Born-Infeld theories prevent space-time from reaching high energy densities, so they smooth out singularities, but I think you could still build a giant collider to access the higher energies. (You can see in the review how they discuss Born-Infeld as an Effective Field Theory, which means a theory that only works up to a certain scale and must be replaced above that scale.)

