Author Archives: 4gravitons

All About the Collab

Sometimes, some scientists work alone. But mostly, scientists collaborate. We team up, getting more done together than we could alone.

Over the years, I’ve realized that theoretical physicists like me collaborate in a bit of a weird way, compared to other scientists. Most scientists do experiments, and those experiments require labs. Each lab typically has one principal investigator, or “PI”, who hires most of the other people in that lab. For any given project, scientists from the lab will be organized into particular roles. Some will be involved in the planning, some not. Some will do particular tests, gather data, manage lab animals, or do statistics. The whole experiment is at least roughly planned out from the beginning, and everyone has their own responsibility, to the extent that journals will sometimes ask scientists to list everyone’s roles when they publish papers. In this system, it’s rare for scientists from two different labs to collaborate. Usually it happens for a reason: a lab needs a statistician for a particularly subtle calculation, or one lab must process a sample so another lab can analyze it.

In contrast, theoretical physicists don’t have labs. Our collaborators sometimes come from the same university, but often they’re from a different one, frequently even in a different country. The way we collaborate is less like other scientists, and more like artists.

Sometimes, theoretical physicists have collaborations with dedicated roles and a detailed plan. This can happen when there is a specific calculation that needs to be done, that really needs to be done right. Some of the calculations that go into making predictions at the LHC are done in this way. I haven’t been in a collaboration like that (though in retrospect one collaborator may have had something like that in mind).

Instead, most of the collaborations I’ve been in have been more informal. They tend to start with a conversation. We chat by the coffee machine, or after a talk, anywhere there’s a blackboard nearby. It starts with “I’ve noticed something odd”, or “here’s something I don’t understand”. Then, we jam. We go back and forth, doing our thing and building on each other. Sometimes this happens in person, a barrage of questions and doubts until we hammer out something solid. Sometimes we go back to our offices, to calculate and look up references. Coming back the next day, we compare results: what did you manage to show? Did you get what I did? If not, why?

I make this sound spontaneous, but it isn’t completely. That starting conversation can be totally unplanned, but usually one of the scientists involved is trying to make it happen. There’s a different way you talk when you’re trying to start a collaboration, compared to when you just want to talk. If you’re looking for a collaboration, you go into more detail. If the other person is on the same wavelength, you start using “we” instead of “I”, or you start suggesting plans of action: “you could do X, while I do Y”. If you just want someone’s opinion, or just want to show off, then your conversation is less detailed, and less personal.

This is easiest to do with our co-workers, but we do it with people from other universities too. Sometimes this happens at conferences, more often during short visits for seminars. I’ve been on almost every end of this. As a visitor, I’ve arrived to find my hosts with a project in mind. As a host, I’ve invited a visitor with the goal of getting them involved in a collaboration, and I’ve received a visitor who came with their own collaboration idea.

After an initial flurry of work, we’ll have a rough idea of whether the project is viable. If it is, things get a bit more organized, and we sort out what needs to be done and a rough idea of who will do it. While the early stages really benefit from being done in person, this part is easier to do remotely. The calculations get longer but the concepts are clear, so each of us can work by ourselves, emailing when we make progress. If we get confused again, we can always schedule a Zoom to sort things out.

Once things are close (but often not quite done), it’s time to start writing the paper. In the past, I used Dropbox for this: my collaborators shared a folder with a draft, and we’d pass “control” back and forth as we wrote and edited. Now, I’m more likely to use something built for the purpose. Some collaborations use Git, a tool programmers use to collaborate on code: it lets you roll back edits you don’t like, and merge edits from two people to make sure they’re consistent. For other collaborations I use Overleaf, an online interface for the document-writing language LaTeX that lets multiple people edit in real-time. Either way, this part is also more or less organized, with a lot of “can you write this section?” that can shift around depending on how busy people end up being.

Finally, everything comes together. The edits stabilize, everyone agrees that the paper is good (or at least, that any dissatisfaction they have is too minor to be worth arguing over). We send it to a few trusted friends, then a few days later up on the arXiv it goes.

Then, the cycle begins again. If the ideas are still clear enough, the same collaboration might keep going, planning follow-up work and follow-up papers. We meet new people, or meet up with old ones, and establish new collaborations as we go. Our fortunes ebb and flow based on the conversations we have, the merits of our ideas and the strengths of our jams. Sometimes there’s more, sometimes less, but it keeps bubbling up if you let it.

LHC Black Holes for the Terminally Un-Reassured

Could the LHC have killed us all?

No, no it could not.

But…

I’ve had this conversation a few times over the years. Usually, the people I’m talking to are worried about black holes. They’ve heard that the Large Hadron Collider speeds up particles to amazingly high energies before colliding them together. They worry that these colliding particles could form a black hole, which would fall into the center of the Earth and busily gobble up the whole planet.

This pretty clearly hasn’t happened. But also, physicists were pretty confident that it couldn’t happen. That isn’t to say they thought it was impossible to make a black hole with the LHC. Some physicists actually hoped to make a black hole: it would have been evidence for extra dimensions, curled-up dimensions much larger than the tiny ones required by string theory. They figured out the kind of evidence they’d see if the LHC did indeed create a black hole, and we haven’t seen that evidence. But even before running the machine, they were confident that such a black hole wouldn’t gobble up the planet. Why?

The best argument is also the most unsatisfying. The LHC speeds up particles to high energies, but not unprecedentedly high energies. High-energy particles called cosmic rays enter the atmosphere every day, some at energies comparable to, or even far beyond, those of the LHC. The LHC just puts the high-energy particles in front of a bunch of sophisticated equipment so we can measure everything about them. If the LHC could destroy the world, cosmic rays would have already done so.

That’s a very solid argument, but it doesn’t really explain why. Also, it may not hold for future colliders: we could eventually build a collider that reaches energies cosmic rays only rarely match. So I should give another argument.

The next argument is Hawking radiation. In Stephen Hawking’s most famous accomplishment, he argued that, because of quantum mechanics, black holes are not truly black. Instead, a black hole gives off a constant radiation of every type of particle mixed together, shrinking as it does so. The radiation is faintest for large black holes, but gets more and more intense the smaller the black hole is, until the smallest black holes explode into a shower of particles and disappear. This means that a black hole small enough for the LHC to produce would radiate away to nothing in almost an instant: not long enough to leave the machine, let alone fall to the center of the Earth.
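For a sense of just how fast, there is a standard rough estimate for a black hole’s Hawking evaporation time, t \approx 5120\pi G^2 M^3/\hbar c^4. Here’s a quick sketch in Python, using rounded constants and the 14 TeV worth of mass we’ll compute below (this check is mine, not part of the original argument):

```python
import math

# Rough Hawking evaporation time, t = 5120*pi*G^2*M^3/(hbar*c^4)
# (rounded constants; M is 14 TeV of energy converted to mass)
G = 6.67e-11           # Newton's constant, m^3 kg^-1 s^-2
c = 3e8                # speed of light, m/s
hbar = 1.055e-34       # reduced Planck constant, J s
M = 14e12 * 1.78e-36   # kg

t = 5120 * math.pi * G**2 * M**3 / (hbar * c**4)
print(f"{t:.1g} s")    # an absurdly tiny fraction of a second
```

That comes out around 10^{-84} seconds, far less time than light takes to cross a single proton.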

This is a good argument, but maybe you aren’t as sure as I am about Hawking radiation. As it turns out, we’ve never measured Hawking radiation; it’s just a theoretical expectation. Remember that the radiation gets fainter the larger the black hole is: for a black hole in space with the mass of a star, the radiation is so faint it would be almost impossible to detect even right next to the black hole. From here, with our telescopes, we have no chance of seeing it.

So suppose tiny black holes didn’t radiate, and suppose the LHC could indeed produce them. Wouldn’t that have been dangerous?

Here, we can do a calculation. I want you to appreciate how tiny these black holes would be.

From science fiction and cartoons, you might think of a black hole as a kind of vacuum cleaner, sucking up everything nearby. That’s not how black holes work, though. The “sucking” black holes do is due to gravity, no stronger than the gravity of any other object with the same mass at the same distance. The only difference comes when you get close to the event horizon, an invisible sphere surrounding the black hole. Pass that boundary, and the gravity is strong enough that you will never escape.

We know how to calculate the position of the event horizon of a black hole. It’s the Schwarzschild radius, and we can write it in terms of Newton’s constant G, the mass of the black hole M, and the speed of light c, as follows:

r_s = \frac{2GM}{c^2}

The Large Hadron Collider’s two beams each have an energy around seven tera-electron-volts, or TeV, so there are 14 TeV of energy in total in each collision. Imagine all of that energy being converted into mass, and that mass forming a black hole. That isn’t how it would actually happen: some of the energy would create other particles, and some would give the black hole a “kick”, some momentum in one direction or another. But we’re going to imagine a “worst-case” scenario, so let’s assume all the energy goes to form the black hole. Electron-volts are a weird physicist unit, but if we divide them by the speed of light squared (as we should if we’re using E=mc^2 to create a mass), then Wikipedia tells us that each electron-volt will give us 1.78\times 10^{-36} kilograms. “Tera” is the SI prefix for 10^{12}. Thus our tiny black hole starts with a mass of

14\times 10^{12}\times 1.78\times 10^{-36} = 2.49\times 10^{-23} \textrm{kg}

Plugging in Newton’s constant (6.67\times 10^{-11} meters cubed per kilogram per second squared) and the speed of light (3\times 10^8 meters per second), we get a radius of,

\frac{2\times 6.67\times 10^{-11}\times 14\times 10^{12}\times 1.78\times 10^{-36}}{\left(3\times 10^8\right)^2} = 3.7\times 10^{-50} \textrm{m}

That, by the way, is amazingly tiny. The size of an atom is about 10^{-10} meters. If every atom was a tiny person, and each of that person’s atoms was itself a person, and so on for five levels down, then the atoms of the smallest person would be the same size as this event horizon.
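If you’d rather not push the exponents around by hand, the same numbers can be checked in a few lines of Python, using the same rounded constants as the text:

```python
G = 6.67e-11           # Newton's constant, m^3 kg^-1 s^-2
c = 3e8                # speed of light, m/s
kg_per_eV = 1.78e-36   # kilograms per electron-volt of energy

M0 = 14e12 * kg_per_eV     # 14 TeV converted to mass
r_s = 2 * G * M0 / c**2    # Schwarzschild radius

print(M0)    # ~2.5e-23 kg
print(r_s)   # ~3.7e-50 m
```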

Now, we let this little tiny black hole fall. Let’s imagine it falls directly towards the center of the Earth. The only force affecting it would be gravity (if it had an electrical charge, it would quickly attract a few electrons and become neutral). That means you can think of it as if it were falling through a tiny hole, with no friction, gobbling up anything unfortunate enough to fall within its event horizon.

For our first estimate, we’ll treat the black hole as if it stays the same size through its journey. Imagine the black hole travels through the entire earth, absorbing a cylinder of matter. Using the Earth’s average density of 5515 kilograms per cubic meter, and the Earth’s maximum radius of 6378 kilometers, our cylinder adds a mass of,

\pi \times \left(3.7\times 10^{-50}\right)^2 \times 2 \times 6378\times 10^3\times 5515 = 3\times 10^{-88} \textrm{kg}

That’s absurdly tiny. That’s much, much, much tinier than the mass we started out with. Absorbing an entire cylinder through the Earth makes barely any difference.
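The cylinder estimate is also a one-liner to check, using the event-horizon radius from above:

```python
import math

r_s = 3.7e-50      # event-horizon radius, m (from above)
R_earth = 6378e3   # Earth's maximum radius, m
rho = 5515         # Earth's average density, kg/m^3

# Mass of a cylinder of radius r_s through the Earth's full diameter
swept = math.pi * r_s**2 * (2 * R_earth) * rho
print(swept)   # ~3e-88 kg
```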

You might object, though, that the black hole is gaining mass as it goes. So really we ought to use a differential equation. If the black hole travels a distance r, absorbing mass as it goes at average Earth density \rho, then we find,

\frac{dM}{dr}=\pi\rho\left(\frac{2GM(r)}{c^2}\right)^2

Solving this, we get

M(r)=\frac{M_0}{1- M_0 \pi\rho\left(\frac{2G}{c^2}\right)^2 r }

Where M_0 is the mass we start out with.

Plug in the distance through the Earth for r, and we find that the mass gained is…still about 3\times 10^{-88} \textrm{kg}! The more careful calculation barely changes the answer, which makes sense: the correction from the black hole’s growth is a very, very small difference!
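Amusingly, the correction is so small that ordinary double-precision arithmetic can’t even see it: evaluating M(r) naively just returns M_0. A short sketch of this, expanding the formula to first order to recover the mass gained:

```python
import math

G, c = 6.67e-11, 3e8
rho = 5515              # Earth's average density, kg/m^3
M0 = 14e12 * 1.78e-36   # starting mass, kg
r = 2 * 6378e3          # distance: once through the Earth, m

# The term subtracted in the denominator of M(r) = M0 / (1 - k*r)
k = M0 * math.pi * rho * (2 * G / c**2)**2
print(k * r)                  # ~1e-65: far below double-precision resolution

# Naive evaluation: 1 - k*r rounds to exactly 1.0, so no change shows up
print(M0 / (1 - k * r) - M0)  # 0.0

# First-order expansion: mass gained ~ M0*k*r, the cylinder estimate again
print(M0 * k * r)             # ~3e-88 kg
```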

But you might still object. A black hole falling through the Earth wouldn’t just go straight through. It would pass through, then fall back in. In fact, it would oscillate, from one side to the other, like a pendulum. This is actually a common problem to give physics students: drop an object through a hole in the Earth, neglect air resistance, and what does it do? It turns out that the time the object takes to travel through the Earth is independent of its mass: about 42 minutes one way, or roughly 84.5 minutes for a full oscillation there and back.
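That transit time is a classic result for a uniform-density Earth: the motion is simple harmonic, and the period doesn’t depend on the falling object at all. With the density we’re already using:

```python
import math

G = 6.67e-11   # Newton's constant, m^3 kg^-1 s^-2
rho = 5515     # Earth's average density, kg/m^3

# Inside a uniform sphere, gravity pulls like a spring:
# omega^2 = (4/3) * pi * G * rho, independent of the object's mass
period = 2 * math.pi / math.sqrt(4 * math.pi * G * rho / 3)
print(period / 60)       # ~84.4 minutes for a full oscillation
print(period / 60 / 2)   # ~42.2 minutes one way through the Earth
```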

So let’s ask a question: how long would it take for a black hole, oscillating like this, to double its mass?

We want to solve,

2=\frac{1}{1- M_0 \pi\rho\left(\frac{2G}{c^2}\right)^2 r }

so we need the black hole to travel a total distance of

r=\frac{1}{2M_0 \pi\rho\left(\frac{2G}{c^2}\right)^2} = 5.3\times 10^{71} \textrm{m}

That’s a huge distance! The Earth’s diameter, remember, is 2\times 6378 kilometers, and each trip through takes about 42 minutes. So traveling that far would take

\frac{5.3\times 10^{71}}{2\times 6378\times 10^3}\times 42.2/60/24/365 = 3\times 10^{60} \textrm{y}

Ten to the sixty years. Our universe is only about ten to the ten years old. In another five times ten to the nine years, the Sun will enter its red giant phase, and swallow the Earth. There simply isn’t enough time for this tiny tiny black hole to gobble up the world, before everything is already gobbled up by something else. Even in the most pessimistic way to walk through the calculation, it’s just not dangerous.
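The whole doubling-time estimate fits in one short script: divide the total distance by the Earth’s diameter to count trips, at roughly 42 minutes per one-way trip.

```python
import math

G, c = 6.67e-11, 3e8
rho = 5515              # Earth's average density, kg/m^3
M0 = 14e12 * 1.78e-36   # starting mass, kg

k = M0 * math.pi * rho * (2 * G / c**2)**2
r_double = 1 / (2 * k)                 # distance to double the mass, m
trips = r_double / (2 * 6378e3)        # one-way trips through the Earth
years = trips * 42.2 / 60 / 24 / 365   # 42.2 minutes per trip
print(years)                           # ~3e60 years
```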

I hope that, if you were worried about black holes at the LHC, you’re not worried any more. But more than that, I hope you’ve learned three lessons. First, that even the highest-energy particle physics involves tiny energies compared to day-to-day experience. Second, that gravitational effects are tiny in the context of particle physics. And third, that with Wikipedia access, you too can answer questions like this. If you’re worried, you can make an estimate, and check!

Cabinet of Curiosities: The Train-Ladder

I’ve got a new paper out this week, with Andrew McLeod, Roger Morales, Matthias Wilhelm, and Chi Zhang. It’s yet another entry in this year’s “cabinet of curiosities”, quirky Feynman diagrams with interesting traits.

A while back, I talked about a set of Feynman diagrams I could compute with any number of “loops”, bypassing the approximations we usually need to use in particle physics. That wasn’t the first time someone did that. Back in the 90’s, some folks figured out how to do this for so-called “ladder” diagrams. These diagrams have two legs on one end for two particles coming in, two legs on the other end for two particles going out, and a ladder in between, like so:

There are infinitely many of these diagrams, but they’re all beautifully simple, variations on a theme that can be written down in a precise mathematical way.

Change things a little bit, though, and the situation gets wildly more intractable. Let the rungs of the ladder peek through the sides, and you get something looking more like the tracks for a train:

These traintrack integrals are much more complicated. Describing them requires the mathematics of Calabi-Yau manifolds, involving higher and higher dimensions as the tracks get longer. I don’t think there’s any hope of understanding these things for all loops, at least not any time soon.

What if we aimed somewhere in between? A ladder that just started to turn traintrack?

Add just a single pair of rungs, and it turns out that things remain relatively simple. We don’t need any complicated Calabi-Yau manifolds, just the simplest Calabi-Yau manifold, called an elliptic curve. It’s actually the same curve for every version of the diagram. And the situation is simple enough that, with some extra cleverness, it looks like we’ve found a trick to calculate these diagrams to any number of loops we’d like.

(Another group figured out the curve, but not the calculation trick. They’ve solved different problems, though, studying all sorts of different traintrack diagrams. They sorted out some confusion I used to have about one of those diagrams, showing it actually behaves precisely the way we expected it to. All in all, it’s been a fun example of the way different scientists sometimes home in on the same discovery.)

These developments are exciting, because Feynman diagrams involving elliptic curves are still tough to deal with; we still have whole conferences about them. These new elliptic diagrams give us a long list of test cases, things we can experiment with at any number of loops. With time, we might truly understand them as well as we understand the ladder diagrams!

The Problem of Quantum Gravity Is the Problem of High-Energy (Density) Quantum Gravity

I’ve said something like this before, but here’s another way to say it.

The problem of quantum gravity is one of the most famous problems in physics. You’ve probably heard someone say that quantum mechanics and general relativity are fundamentally incompatible. Most likely, this was narrated over pictures of a foaming, fluctuating grid of space-time. Based on that, you might think that all we have to do to solve this problem is to measure some quantum property of gravity. Maybe we could make a superposition of two different gravitational fields, see what happens, and solve the problem that way.

I mean, we could do that, some people are trying to. But it won’t solve the problem. That’s because the problem of quantum gravity isn’t just the problem of quantum gravity. It’s the problem of high-energy quantum gravity.

Merging quantum mechanics and general relativity is actually pretty easy. General relativity is a big conceptual leap, certainly, a theory in which gravity is really just the shape of space-time. At the same time, though, it’s also a field theory, the same general type of theory as electromagnetism. It’s a weirder field theory than electromagnetism, to be sure, one with deeper implications. But if we want to describe low energies, and weak gravitational fields, then we can treat it just like any other field theory. We know how to write down some pretty reasonable-looking equations, we know how to do some basic calculations with them. This part is just not that scary.

The scary part happens later. The theory we get from these reasonable-looking equations continues to look reasonable for a while. It gives formulas for the probability of things happening: things like gravitational waves bouncing off each other, as they travel through space. The problem comes when those waves have very high energy, and the nice reasonable probability formula now says that the probability is greater than one.

For those of you who haven’t taken a math class in a while, probabilities greater than one don’t make sense. A probability of one is a certainty, something guaranteed to happen. A probability greater than one isn’t more certain than certain, it’s just nonsense.

So we know something needs to change, we know we need a new theory. But we only know we need that theory when the energy is very high: at the Planck energy. Below that, a new theory might still kick in, but it doesn’t have to: there’s no “problem” yet.

Now, a few of you understand this part, but still have a misunderstanding. The Planck energy seems high for particle physics, but it isn’t high in an absolute sense: it’s about the energy in a tank of gasoline. Does that mean that all we have to do to measure quantum gravity is to make a quantum state out of your car?

Again, no. That’s because the problem of quantum gravity isn’t just the problem of high-energy quantum gravity either.
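Incidentally, the tank-of-gasoline comparison is easy to check. The Planck energy is \sqrt{\hbar c^5/G}, and roughly 34 megajoules per liter is a standard figure for gasoline (that figure is my addition, not from the text above):

```python
G = 6.67e-11       # Newton's constant, m^3 kg^-1 s^-2
c = 3e8            # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J s

# Planck energy, E = sqrt(hbar * c^5 / G)
E_planck = (hbar * c**5 / G) ** 0.5
print(E_planck)          # ~2e9 joules
print(E_planck / 34e6)   # ~60 liters of gasoline: about a tank
```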

Energy seems objective, but it’s not. It’s subjective, or more specifically, relative. Due to special relativity, observers moving at different speeds observe different energies. Because of that, high energy alone can’t be the requirement: it isn’t something either general relativity or quantum field theory can “care about” by itself.

Instead, the real thing that matters is something that’s invariant under special relativity. This is hard to define in general terms, but it’s best to think of it as a requirement not on energy, but on energy density.

(For the experts: I’m justifying this phrasing in part because of how you can interpret the quantity appearing in energy conditions as the energy density measured by an observer. This still isn’t the correct way to put it, but I can’t think of a better way that would be understandable to a non-technical reader. If you have one, let me know!)

Why do we need quantum gravity to fully understand black holes? Not just because they have a lot of mass, but because they have a lot of mass concentrated in a small area, a high energy density. Ditto for the Big Bang, when the whole universe had a very large energy density. Particle colliders are useful not just because they give particles high energy, but because they give particles high energy and put them close together, creating a situation with very high energy density.

Once you understand this, you can use it to think about whether some experiment or observation will help with the problem of quantum gravity. Does the experiment involve very high energy density, much higher than anything we can do in a particle collider right now? Is that telescope looking at something created in conditions of very high energy density, or just something nearby?

It’s not impossible for an experiment that doesn’t meet these conditions to find something. Whatever the correct quantum gravity theory is, it might be different from our current theories in a more dramatic way, one that’s easier to measure. But the only guarantee, the only situation where we know we need a new theory, is for very high energy density.

The Many Varieties of Journal Club

Across disciplines, one tradition seems to unite all academics: the journal club. In a journal club, we gather together to discuss papers in academic journals. Typically, one person reads the paper in depth in advance, and comes prepared with a short presentation, then everyone else asks questions. Everywhere I’ve worked has either had, or aspired to have, a journal club, and every academic I’ve talked to recognizes the concept.

Beyond that universal skeleton, though, are a lot of variable details. Each place seems to interpret journal clubs just a bit differently. Sometimes a lot differently.

For example, who participates in journal clubs? In some places, journal clubs are a student thing, organized by PhD or Master’s students to get more experience with their new field. Some even have journal clubs as formal courses, for credit and everything. In other places, journal clubs are for everyone, from students up through the older professors.

What kind of papers? Some read old classic papers, knowing that without an excuse we’d never take the time to read them and would miss valuable insights. Some instead focus on the latest results, as a way to keep up with progress in the field.

Some variation is less intentional. Academics are busy, so it can be hard to find a volunteer to prepare a presentation on a paper every week. This leads journal clubs to cut corners, once again in a variety of ways. A journal club focused on the latest papers can sometimes only find volunteers interested in presenting their own work (which we usually already have a presentation prepared for). Sometimes this goes a step further, and the journal club becomes a kind of weekly seminar: a venue for younger visitors to talk about their work that’s less formal than a normal talk. Sometimes the corner cut isn’t topic but preparation: people still discuss new papers, but instead of preparing a presentation they just come and discuss on the fly. This gets dangerous, because after a certain point people may stop reading the papers altogether, each hoping that someone else will have read them and can explain!

Journal clubs are tricky. Academics are curious, but we’re also busy and lazy. We know it would be good for us to discuss, to keep up with new papers or read the old classics… but actually getting organized, that’s another matter!

What Might Lie Beyond, and Why

As the new year approaches, people think about the future. Me, I’m thinking about the future of fundamental physics, about what might lie beyond the Standard Model. Physicists search for many different things, with many different motivations. Some are clear missing pieces, places where the Standard Model fails and we know we’ll need to modify it. Others are based on experience, with no guarantees but an expectation that, whatever we find, it will be surprising. Finally, some are cool possibilities, ideas that would explain something or fill in a missing piece but aren’t strictly necessary.

The Almost-Sure Things

Science isn’t math, so nothing here is really a sure thing. We might yet discover a flaw in important principles like quantum mechanics and special relativity, and it might be that an experimental result we trust turns out to be flawed. But if we choose to trust those principles, and our best experiments, then these are places we know the Standard Model is incomplete:

  • Neutrino Masses: The original Standard Model’s neutrinos were massless. Eventually, physicists discovered this was wrong: neutrinos oscillate, switching between different types in a way they only could if they had different masses. This result is familiar enough that some think of it as already part of the Standard Model, not really beyond. But the masses of neutrinos involve unsolved mysteries: we don’t know what those masses are, and what’s more, there are different ways neutrinos could have mass, and we don’t yet know which is present in nature. Neutrino masses also imply the existence of an undiscovered “sterile” neutrino, a particle that doesn’t interact with the strong, weak, or electromagnetic forces.
  • Dark Matter Phenomena (and possibly Dark Energy Phenomena): Astronomers first suggested dark matter when they observed galaxies moving at speeds inconsistent with the mass of their stars. Now, they have observed evidence for it in a wide variety of situations, evidence which seems decisively incompatible with ordinary gravity and ordinary matter. Some solve this by introducing dark matter, others by modifying gravity, but this is more of a technical difference than it sounds: in order to modify gravity, one must introduce new quantum fields, much the same way one does when introducing dark matter. The only debate is how “matter-like” those fields need to be, but either approach goes beyond the Standard Model.
  • Quantum Gravity: It isn’t as hard to unite quantum mechanics and gravity as you might think. Physicists have known for decades how to write down a naive theory of quantum gravity, one that follows the same steps one might use to derive the quantum theory of electricity and magnetism. The problem is, this theory is incomplete. It works at low energies, but as the energy increases it loses the ability to make predictions, eventually giving nonsensical answers like probabilities greater than one. We have candidate solutions to this problem, like string theory, but we might not know for a long time which solution is right.
  • Landau Poles: Here’s a more obscure one. In particle physics we can zoom in and out in our theories, using similar theories at different scales. What changes are the coupling constants, numbers that determine the strength of the different forces. You can think of this in a loosely reductionist way, with the theories at smaller scales determining the constants for theories at larger scales. This gives workable theories most of the time, but it fails for at least one part of the Standard Model. In electricity and magnetism, the coupling constant increases as you zoom in. Eventually, it becomes infinite, and what’s more, does so at a finite energy scale. It’s still not clear how we should think about this, but luckily we won’t have to any time soon: this energy scale is vastly, vastly higher than even the scale of quantum gravity.
  • Some Surprises Guarantee Others: The Standard Model is special in a way that gravity isn’t. Even if you dial up the energy, a Standard Model calculation will always “make sense”: you never get probabilities greater than one. This isn’t true for potential deviations from the Standard Model. If the Higgs boson turns out to interact differently than we expect, it wouldn’t just be a violation of the Standard Model on its own: it would guarantee mathematically that, at some higher energy, we’d have to find something new. That was precisely the kind of argument the LHC used to find the Higgs boson: without the Higgs, something new was guaranteed to happen within the energy range of the LHC to prevent impossible probability numbers.

The Argument from (Theoretical) Experience

Everything in this middle category rests on a particular sort of argument. It’s short of a guarantee, but stronger than a dream or a hunch. While the previous category was based on calculations in theories we already know how to write down, this category relies on our guesses about theories we don’t yet know how to write.

Suppose we had a deeper theory, one that could use fewer parameters to explain the many parameters of the Standard Model. For example, it might explain the Higgs mass, letting us predict it rather than just measuring it like we do now. We don’t have a theory like that yet, but what we do have are many toy model theories, theories that don’t describe the real world but do, in this case, have fewer parameters. We can observe how these theories work, and what kinds of discoveries scientists living in worlds described by them would make. By looking at this process, we can get a rough idea of what to expect, which things in our own world would be “explained” in other ways in these theories.

  • The Hierarchy Problem: This is also called the naturalness problem. Suppose we had a theory that explained the mass of the Higgs, one where it wasn’t just a free parameter. We don’t have such a theory for the real Higgs, but we do have many toy models with similar behavior, ones with a boson with its mass determined by something else. In these models, though, the mass of the boson is always close to the energy scale of other new particles, particles which have a role in determining its mass, or at least in postponing that determination. This was the core reason why people expected the LHC to find something besides the Higgs. Without such new particles, the large hierarchy between the mass of the Higgs and the mass of new particles becomes a mystery, one where it gets harder and harder to find a toy model with similar behavior that still predicts something like the Higgs mass.
  • The Strong CP Problem: The weak nuclear force does what must seem like a very weird thing, by violating parity symmetry: the laws that govern it are not the same when you flip the world in a mirror. This is also true when you flip all the charges as well, a combination called CP (charge plus parity). But while it may seem strange that the weak force violates this symmetry, physicists find it stranger that the strong force seems to obey it. Much like in the hierarchy problem, it is very hard to construct a toy model that both predicts a strong force that maintains CP (or almost maintains it) and doesn’t have new particles. The new particle in question, called the axion, is something some people also think may explain dark matter.
  • Matter-Antimatter Asymmetry: We don’t know the theory of quantum gravity, and even the candidate theories we have struggle to describe conditions close to the Big Bang. But while we can’t prove it, many physicists expect the quantum gravity conditions near the Big Bang to produce roughly equal amounts of matter and antimatter. Instead, matter dominates: we live in a world made almost entirely of matter, with no evidence of large antimatter regions even far out in space. This lingering mystery could be explained if some new physics were biased towards matter instead of antimatter.
  • Various Problems in Cosmology: Many open questions in cosmology fall in this category. The small value of the cosmological constant is mysterious for the same reasons the small value of the Higgs mass is, but at a much larger, harder-to-fix scale. The flatness and uniformity of the early universe surprise many cosmologists, and have led them to propose new physics. The surprise is not because such flatness and uniformity are mathematically impossible, but because they are not the behavior cosmologists would expect out of a theory of quantum gravity.
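The hierarchy problem in the first bullet can be made a bit more concrete. In a generic toy model, quantum corrections tie the mass of a Higgs-like boson to the heaviest scale $\Lambda$ it couples to. Schematically (a standard textbook estimate, not a formula unique to any of these models):

$$ m_H^2 \;\approx\; m_0^2 \,+\, \frac{c\,\Lambda^2}{16\pi^2}, $$

where $c$ is an order-one combination of couplings (for the real Higgs, dominated by the top quark's coupling). Keeping $m_H \approx 125$ GeV while $\Lambda$ sits far above a TeV then requires $m_0^2$ and the correction to cancel to many decimal places, which is exactly the fine-tuning that makes toy models without new particles near the Higgs mass so hard to build.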

The Cool Possibilities

Some ideas for physics beyond the standard model aren’t required, either from experience or cold hard mathematics. Instead, they’re cool, and would be convenient. These ideas would explain things that look strange, or make for a simpler deeper theory, but they aren’t the only way to do so.

  • Grand Unified Theories: Grand Unified Theories are not the same as a “theory of everything”: they unite only the three “particle physics forces”, the strong nuclear force, the weak nuclear force, and electromagnetism. Under such a theory, the different parameters that determine the strengths of those forces could be predicted from one shared parameter, with the forces only seeming different at low energies. These theories often unite the different matter particles too, but they also introduce new particles and new forces. Those forces would, among other things, make protons unstable, and so giant experiments have been built to detect a proton decaying into other particles. So far, none has been seen.
  • Low-Energy Supersymmetry: String theory requires supersymmetry, a relationship where matter and force particles share many properties. That supersymmetry has to be “broken”, which means that while the matter and force particles have the same charges, they can have wildly different masses, so that the partner particles are all still undiscovered. Those masses may be extremely high, all the way up at the scale of quantum gravity, but they could also be low enough to test at the LHC. Physicists hoped to detect such particles, as they could have been a good solution to the hierarchy problem. Now that the LHC hasn’t found these supersymmetric particles, it is much harder to solve the problem this way, though some people are still working on it.
  • Large Extra Dimensions: String theory also involves extra dimensions, beyond our usual three of space and one of time. Those dimensions are by default very small, but some proposals have them substantially bigger, big enough that we could have seen evidence for them at the LHC. These proposals could explain why gravity is so much weaker than the other forces. Much like the previous members of this category, though, no evidence for this has yet been found.

I think these categories are helpful, but experts may quibble about some of my choices. I also haven’t mentioned every possible thing that could be found beyond the Standard Model. If you’ve heard of something and want to know which category I’d put it in, let me know in the comments!

When Your Research Is a Cool Toy

Merry Newtonmas, everyone!

In the US, PhD students start without an advisor. As they finish their courses, different research groups make their pitch, trying to get them to join. Some promise interesting puzzles and engaging mysteries, others talk about the importance of their work, how it can help society or understand the universe.

Thinking back to my PhD, there is one pitch I remember to this day. The pitch was from the computational astrophysics group, and the message was a simple one: “we blow up stars”.

Obviously, these guys didn’t literally blow up stars: they simulated supernovas. They weren’t trying to make some weird metaphysical argument, they didn’t believe their simulation was somehow the real thing. The point they were making, instead, was emotional: blowing up stars feels cool.

Scientists can be motivated by curiosity, fame, or altruism, and these are familiar things. But an equally important motivation is a sense of play. If your job is to build tiny cars for rats, some of your motivation has to be the sheer joy of building tiny cars for rats. If you simulate supernovas, then part of your motivation can be the same as my nephew hurling stuffed animals down the stairs: that joyful moment when you yell “kaboom!”

Probably, your motivation shouldn’t just be to play with a cool toy. You need some of those “serious” scientific motivations as well. But for those of you blessed with a job where you get to say “kaboom”, you have that extra powerful reason to get up in the morning. And for those of you just starting a scientific career, may you have some cool toys under your Newtonmas tree!

Simulated Wormhole Analogies

Last week, I talked about how Google’s recent quantum simulation of a toy model wormhole was covered in the press. What I didn’t say much about, was my own opinion of the result. Was the experiment important? Was it worth doing? Did it deserve the hype?

Here on this blog, I don’t like to get into those kinds of arguments. When I talk about public understanding of science, I share the same concerns as the journalists: we all want to prevent misunderstandings, and to spread a clearer picture. I can argue that some choices hurt the public understanding and some help it, and be reasonably confident that I’m saying something meaningful, something that would resonate with their stated values.

For the bigger questions, what goals science should have and what we should praise, I have much less of a foundation. We don’t all have a clear shared standard for which science is most important. There isn’t some premise I can posit, a fundamental principle I can use to ground a logical argument.

That doesn’t mean I don’t have an opinion, though. It doesn’t even mean I can’t persuade others of it. But it means the persuasion has to be a bit more loose. For example, I can use analogies.

So let’s say I’m looking at a result like this simulated wormhole. Researchers took advanced technology (Google’s quantum computer), and used it to model a simple system. They didn’t learn anything especially new about that system (since in this case, a normal computer can simulate it better). I get the impression they didn’t learn all that much about the advanced technology: the methods used, at this point, are pretty well-known, at least to Google. I also get the impression that it wasn’t absurdly expensive: I’ve seen other people do things of a similar scale with Google’s machine, and didn’t get the impression they had to pay through the nose for the privilege. Finally, the simple system simulated happens to be “cool”: it’s a toy model studied by quantum gravity researchers, a simple version of that sci-fi standard, the traversable wormhole.

What results are like that?

Occasionally, scientists build tiny things. If the tiny things are cute enough, or cool enough, they tend to get media attention. The most recent example I can remember was a tiny snowman, three microns tall. These tiny things tend to use very advanced technology, and it’s hard to imagine the scientists learn much from making them, but it’s also hard to imagine they cost all that much to make. They’re amusing, and they absolutely get press coverage, spreading wildly over the web. I don’t think they tend to get published in Nature unless they are a bit more advanced, but I wouldn’t be too surprised to hear of a case that did; scientific journals can be suckers for cute stories too. They don’t tend to get discussed in glowing terms linking them to historical breakthroughs.

That seems like a pretty close analogy. Taken seriously, it would suggest the wormhole simulation was probably worth doing, probably worth a press release and some media coverage, likely not worth publication in Nature, and definitely not worth being heralded as a major breakthrough.

Ok, but proponents of the experiment might argue I’m leaving something out here. This experiment isn’t just a cute simulation. It’s supposed to be a proof of principle, an early version of an experiment that will be an actually useful simulation.

As an analogy for that…did you know LIGO started taking data in 2002?

Most people first heard of the Laser Interferometer Gravitational-Wave Observatory in 2016, when they reported their first detection of gravitational waves. But that was actually “Advanced LIGO”. The original LIGO ran from 2002 to 2010, and didn’t detect anything. It just wasn’t sensitive enough. Instead, it was a prototype, an early version designed to test the basic concept.

Similarly, while this wormhole simulation didn’t teach us anything new, future ones might. If the quantum simulation was made larger, it might be possible to simulate more complicated toy models, ones that are too complicated to simulate on a normal computer. These aren’t feasible now, but may be feasible with somewhat bigger quantum computers: still much smaller than the computers that would be needed to break encryption, or even to do simulations that are useful for chemists and materials scientists. Proponents argue that some of these quantum toy models might teach them something interesting about the mathematics of quantum gravity.

Here, though, a number of things weaken the analogy.

LIGO’s first run taught them important things about the noise they would have to deal with, things that they used to build the advanced version. The wormhole simulation didn’t show anything novel about how to use a quantum computer: the type of thing they were doing was well-understood, even if it hadn’t been used to do that yet.

Detecting gravitational waves opened up a new type of astronomy, letting us observe things we could never have observed before. For these toy models, it isn’t obvious to me that the benefit is so unique. Future versions may be difficult to classically simulate, but it wouldn’t surprise me if theorists figured out how to understand them in other ways, or gained the same insight from other toy models and moved on to new questions. They’ll have a while to figure it out, because quantum computers aren’t getting bigger all that fast. I’m very much not an expert in this type of research, so maybe I’m wrong about this…but just comparing to similar research programs, I would be surprised if the quantum simulations end up crucial here.

Finally, even if the analogy held, I don’t think it proves very much. In particular, as far as I can tell, the original LIGO didn’t get much press. At the time, I remember meeting some members of the collaboration, and they clearly didn’t have the fame the project has now. Looking through Google News and the archives of The New York Times, I can’t find all that much about the experiment: a few articles discussing its progress and prospects, but no grand unveiling, no big press releases.

So ultimately, I think viewing the simulation as a proof of principle makes it, if anything, less worth the hype. A prototype like that is only really valuable when it’s testing new methods, and only in so far as the thing it’s a prototype for will be revolutionary. Recently, a prototype fusion device got a lot of press for getting more energy out of a plasma than they put into it (though still much less than it takes to run the machine). People already complained about that being overhyped, and the simulated wormhole is nowhere near that level of importance.

If anything, I think the wormhole-simulators would be on a firmer footing if they thought of their work like the tiny snowmen. It’s cute, a fun side benefit of advanced technology, and as such something worth chatting about and celebrating a bit. But it’s not the start of a new era.

Simulated Wormholes for My Real Friends, Real Wormholes for My Simulated Friends

Maybe you’ve recently seen a headline like this:

Actually, I’m more worried that you saw that headline before it was edited, when it looked like this:

If you’ve seen either headline, and haven’t read anything else about it, then please at least read this:

Physicists have not created an actual wormhole. They have simulated a wormhole on a quantum computer.

If you’re willing to read more, then read the rest of this post. There’s a more subtle story going on here, both about physics and about how we communicate it. And for the experts, hold on, because when I say the wormhole was a simulation I’m not making the same argument everyone else is.

[And for the mega-experts, there’s an edit later in the post where I soften that claim a bit.]

The headlines at the top of this post come from an article in Quanta Magazine. Quanta is a web-based magazine covering many fields of science. They’re read by the general public, but they aim for a higher standard than many science journalists, with stricter fact-checking and a goal of covering more challenging and obscure topics. Scientists in turn have tended to be quite happy with them: often, they cover things we feel are important but that the ordinary media isn’t able to cover. (I even wrote something for them recently.)

Last week, Quanta published an article about an experiment with Google’s Sycamore quantum computer. By arranging the quantum bits (qubits) in a particular way, they were able to observe behaviors one would expect out of a wormhole, a kind of tunnel linking different points in space and time. They published it with the second headline above, claiming that physicists had created a wormhole with a quantum computer and explaining how, using a theoretical picture called holography.

This pissed off a lot of physicists. After push-back, Quanta’s twitter account published this statement, and they added the word “Holographic” to the title.

Why were physicists pissed off?

It wasn’t because the Quanta article was wrong, per se. As far as I’m aware, all the technical claims they made are correct. Instead, it was about two things. One was the title, and the implication that physicists “really made a wormhole”. The other was the tone, the excited “breaking news” framing complete with a video comparing the experiment with the discovery of the Higgs boson. I’ll discuss each in turn:

The Title

Did physicists really create a wormhole, or did they simulate one? And why would that be at all confusing?

The story rests on a concept from the study of quantum gravity, called holography. Holography is the idea that in quantum gravity, certain gravitational systems like black holes are fully determined by what happens on a “boundary” of the system, like the event horizon of a black hole. It’s supposed to be a hologram in analogy to 3d images encoded in 2d surfaces, rather than like the hard-light constructions of science fiction.

The best-studied version of holography is something called AdS/CFT duality. AdS/CFT duality is a relationship between two different theories. One of them is a CFT, or “conformal field theory”, a type of particle physics theory with no gravity and no masses. (The first example of the duality used my favorite toy theory, N=4 super Yang-Mills.) The other one is a version of string theory in an AdS, or anti-de Sitter space, a version of space-time curved so that objects shrink as they move outward, approaching a boundary. (In the first example, this space-time had five dimensions curled up in a sphere and the rest in the anti-de Sitter shape.)

These two theories are conjectured to be “dual”. That means that, for anything that happens in one theory, you can give an alternate description using the other theory. We say the two theories “capture the same physics”, even though they appear very different: they have different numbers of dimensions of space, and only one has gravity in it.
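For the first example mentioned above, the duality dictionary can be stated compactly (these are the standard AdS/CFT relations, included here just for orientation):

$$ \mathcal{N}=4\ \text{super Yang-Mills with gauge group } SU(N) \;\longleftrightarrow\; \text{type IIB strings on } AdS_5 \times S^5, $$

$$ \lambda \;\equiv\; g_{\rm YM}^2 N \;=\; \frac{R^4}{\alpha'^2}, \qquad g_s \;\sim\; \frac{g_{\rm YM}^2}{4\pi}, $$

where $R$ is the common radius of the $AdS_5$ and $S^5$ factors and $\alpha'$ sets the string length. Strong coupling on one side maps to weakly curved geometry on the other, which is part of why each description can capture regimes the other finds hard.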

Many physicists would claim that if two theories are dual, then they are both “equally real”. Even if one description is more familiar to us, both descriptions are equally valid. Many philosophers are skeptical, but honestly I think the physicists are right about this one. Philosophers try to figure out which things are real or not real, to make a list of real things and explain everything else as made up of those in some way. I think that whole project is misguided, that it’s clarifying how we happen to talk rather than the nature of reality. In my mind, dualities are some of the clearest evidence that this project doesn’t make any sense: two descriptions can look very different, but in a quite meaningful sense be totally indistinguishable.

That’s the sense in which Quanta and Google and the string theorists they’re collaborating with claim that physicists have created a wormhole. They haven’t created a wormhole in our own space-time, one that, were it bigger and more stable, we could travel through. It isn’t progress towards some future where we actually travel the galaxy with wormholes. Rather, they created some quantum system, and that system’s dual description is a wormhole. That’s a crucial point to remember: even if they created a wormhole, it isn’t a wormhole for you.

If that were the end of the story, this post would still be full of warnings, but the title would be a bit different. It was going to be “Dual Wormholes for My Real Friends, Real Wormholes for My Dual Friends”. But there’s a list of caveats. Most of them arguably don’t matter, but the last was what got me to change the word “dual” to “simulated”.

  1. The real world is not described by N=4 super Yang-Mills theory. N=4 super Yang-Mills theory was never intended to describe the real world. And while the real world may well be described by string theory, those strings are not curled up around a five-dimensional sphere with the remaining dimensions in anti-de Sitter space. Nor can we create either theory in a lab.
  2. The Standard Model probably has a quantum gravity dual too, see this cute post by Matt Strassler. But they still wouldn’t have been able to use that to make a holographic wormhole in a lab.
  3. Instead, they used a version of AdS/CFT with fewer dimensions. It relates a weird form of gravity in one space and one time dimension (called JT gravity), to a weird quantum mechanics theory called SYK, with an infinite number of quantum particles or qubits. This duality is a bit more conjectural than the original one, but still reasonably well-established.
  4. Quantum computers don’t have an infinite number of qubits, so they had to use a version with a finite number: seven, to be specific. They trimmed the model down so that it would still show the wormhole-dual behavior they wanted. At this point, you might say that they’re definitely just simulating the SYK theory, using a small number of qubits to simulate the infinite number. But I think they could argue that this system, too, has a quantum gravity dual. The dual would have to be even weirder than JT gravity, and even more conjectural, but the signs of wormhole-like behavior they observed (mostly through simulations on an ordinary computer, which is still better at this kind of thing than a quantum computer) could be seen as evidence that this limited theory has its own gravity partner, with its own “real dual” wormhole.
  5. But those seven qubits don’t just have the interactions they were programmed to have, the ones with the dual. They are physical objects in the real world, so they interact with all of the forces of the real world. That includes, though very weakly, the force of gravity.

And that’s where I think things break, and you have to call the experiment a simulation. You can argue, if you really want to, that the seven-qubit SYK theory has its own gravity dual, with its own wormhole. There are people who expect duality to be broad enough to include things like that.

But you can’t argue that the seven-qubit SYK theory, plus gravity, has its own gravity dual. Theories that already have gravity are not supposed to have gravity duals. If you pushed hard enough on any of the string theorists on that team, I’m pretty sure they’d admit that.

That is what decisively makes the experiment a simulation. It approximately behaves like a system with a dual wormhole, because you can approximately ignore gravity. But if you’re making some kind of philosophical claim, that you “really made a wormhole”, then “approximately” doesn’t cut it: if you don’t exactly have a system with a dual, then you don’t “really” have a dual wormhole: you’ve just simulated one.

Edit: mitchellporter in the comments points out something I didn’t know: that there are in fact proposals for gravity theories with gravity duals. They are in some sense even more conjectural than the series of caveats above, but at minimum my claim above, that any of the string theorists on the team would agree that the system’s gravity means it can’t have a dual, is probably false.

I think at this point, I’d soften my objection to the following:

Describing the system of qubits in the experiment as a limited version of the SYK theory is in one way or another an approximation. It approximates them as not having any interactions beyond those they programmed, it approximates them as not affected by gravity, and because it’s a quantum mechanical description it even approximates the speed of light as infinite. Those approximations don’t guarantee that the system doesn’t have a gravity dual. But for it to have one, our reality, overall, would have to have a gravity dual. There would have to be a dual gravity interpretation of everything, not just the inside of Google’s quantum computer, and it would have to be exact, not just an approximation. Then the approximate SYK would be dual to an approximate wormhole, but that approximate wormhole would be an approximation of some “real” wormhole in the dual space-time.

That’s not impossible, as far as I can tell. But it piles conjecture upon conjecture upon conjecture, to the point that I don’t think anyone has explicitly committed to the whole tower of claims. If you want to believe that this experiment literally created a wormhole, you thus can, but keep in mind the largest asterisk known to mankind.

End edit.

If it weren’t for that caveat, then I would be happy to say that the physicists really created a wormhole. It would annoy some philosophers, but that’s a bonus.

But even if that were true, I wouldn’t say that in the title of the article.

The Title, Again

These days, people get news in two main ways.

Sometimes, people read full news articles. Reading that Quanta article is a good way to understand the background of the experiment, what was done and why people care about it. As I mentioned earlier, I don’t think anything said there was wrong, and they cover essentially all of the caveats you’d care about (except for that last one 😉 ).

Sometimes, though, people just see headlines. They get forwarded on social media, observed at a glance passed between friends. If you’re popular enough, then many more people will see your headline than will actually read the article. For many people, their whole understanding of certain scientific fields is formed by these glancing impressions.

Because of that, if you’re popular and news-y enough, you have to be especially careful with what you put in your headlines, especially when it implies a cool science fiction story. People will almost inevitably see them out of context, and it will impact their view of where science is headed. In this case, the headline may have given many people the impression that we’re actually making progress towards travel via wormholes.

Some of my readers might think this is ridiculous, that no-one would believe something like that. But as a kid, I did. I remember reading popular articles about wormholes, describing how you’d need energy moving in a circle, and other articles about optical physicists finding ways to bend light and make it stand still. Putting two and two together, I assumed these ideas would one day merge, allowing us to travel to distant galaxies faster than light.

If I had seen Quanta’s headline at that age, I would have taken it as confirmation. I would have believed we were well on the way to making wormholes, step by step. Even the New York Times headline, “the Smallest, Crummiest Wormhole You Can Imagine”, wouldn’t have fazed me.

(I’m not sure even the extra word “holographic” would have. People don’t know what “holographic” means in this context, and while some of them would assume it meant “fake”, others would think about the many works of science fiction, like Star Trek, where holograms can interact physically with human beings.)

Quanta has a high-brow audience, many of whom wouldn’t make this mistake. Nevertheless, I think Quanta is popular enough, and respectable enough, that they should have done better here.

At minimum, they could have used the word “simulated”. Even if they go on to argue in the article that the wormhole is real, and not just a simulation, the word in the title does no real harm. It would be a lie, but a beneficial “lie to children”, the basic stock-in-trade of science communication. I think they could have defended it to the string theorists they interviewed on those grounds.

The Tone

Honestly, I don’t think people would have been nearly so pissed off were it not for the tone of the article. There are a lot of physics bloggers who view themselves as serious-minded people, opposed to hype and publicity stunts. They view the research program aimed at simulating quantum gravity on a quantum computer as just an attempt to link a dying and un-rigorous research topic to an over-hyped and over-funded one, pompous storytelling aimed at promoting the careers of people who are already extremely successful.

These people tend to view Quanta favorably, because it covers serious-minded topics in a thorough way. And so many of them likely felt betrayed, seeing this Quanta article as a massive failure of that serious-mindedness, falling for or even endorsing the hypiest of hype.

To those people, I’d like to politely suggest you get over yourselves.

Quanta’s goal is to cover things accurately, to represent all the facts in a way people can understand. But “how exciting something is” is not a fact.

Excitement is subjective. Just because most of the things Quanta finds exciting you also find exciting, does not mean that Quanta will find the things you find unexciting unexciting. Quanta is not on “your side” in some war against your personal notion of unexciting science, and you should never have expected it to be.

In fact, Quanta tends to find things exciting, in general. They were more excited than I was about the amplituhedron, and I’m an amplitudeologist. Part of what makes them consistently excited about the serious-minded things you appreciate them for is that they listen to scientists and get excited about the things they’re excited about. That is going to include, inevitably, things those scientists are excited about for what you think are dumb groupthinky hype reasons.

I think the way Quanta titled the piece was unfortunate, and probably did real damage. I think the philosophical claim behind the title is wrong, though for subtle and weird enough reasons that I don’t really fault anybody for ignoring them. But I don’t think the tone they took was a failure of journalistic integrity or research or anything like that. It was a matter of taste. It’s not my taste, it’s probably not yours, but we shouldn’t have expected Quanta to share our tastes in absolutely everything. That’s just not how taste works.

This Week at Quanta Magazine

I’ve got an article in Quanta Magazine this week, about a program called FORM.

Quanta has come up a number of times on this blog; they’re a science news outlet set up by the Simons Foundation. Their goal is to enhance the public understanding of science and mathematics. They cover topics other outlets might find too challenging, and they cover the topics others cover with more depth. Most people I know who’ve worked with them have been impressed by their thoroughness: they take fact-checking to a level I haven’t seen with other science journalists. If you’re doing a certain kind of mathematical work, then you hope that Quanta decides to cover it.

A while back, as I was chatting with one of their journalists, I had a startling realization: if I want Quanta to cover something, I can send them a tip, and if they’re interested they’ll write about it. That realization resulted in the article I talked about here. Chatting with the journalist interviewing me for that article, though, I learned something if anything even more startling: if I want Quanta to cover something, and I want to write about it, I can pitch the article to Quanta, and if they’re interested they’ll pay me to write about it.

Around the same time, I happened to talk to a few people in my field, who had a problem they thought Quanta should cover. A piece of software called FORM was used in all the most serious collider physics calculations. Despite that, the software wasn’t being supported: its future was unclear. You can read the article to learn more.

One thing I didn’t mention in that article: I hadn’t used FORM before I started writing it. I don’t do those “most serious collider physics calculations”, so I’d never bothered to learn FORM. I mostly use Mathematica, a common choice among physicists who want something easy to learn, even if it’s not the strongest option for many things.

(By the way, it was surprisingly hard to find quotes about FORM that didn’t compare it specifically to Mathematica. In the end I think I included one, but believe me, there could have been a lot more.)

Now, I wonder if I should have been using FORM all along. Many times I’ve pushed to the limits of what Mathematica could comfortably handle, the limits of what my computer’s memory could hold, equations long enough that just expanding them out took complicated work-arounds. If I had learned FORM, maybe I would have breezed through those calculations, and pushed even further.
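To give a feel for the kind of expression swell involved, here’s a toy sketch in Python (not FORM, and vastly simpler than any real collider computation): expanding a polynomial by brute force, with every intermediate term held in memory at once. FORM’s trick, by contrast, is to stream terms through memory one batch at a time, so expressions can grow past what RAM can hold.

```python
from collections import defaultdict

# A toy in-memory polynomial representation: coefficients keyed by
# (power of x, power of y). Every term lives in RAM simultaneously,
# which is exactly what becomes the bottleneck for huge expressions.
def multiply(p, q):
    out = defaultdict(int)
    for (a1, b1), c1 in p.items():
        for (a2, b2), c2 in q.items():
            out[(a1 + a2, b1 + b2)] += c1 * c2
    return dict(out)

# Expand (x + y + 1)**20 by repeated multiplication.
poly = {(0, 0): 1}                           # the constant polynomial 1
factor = {(1, 0): 1, (0, 1): 1, (0, 0): 1}   # x + y + 1
for _ in range(20):
    poly = multiply(poly, factor)

print(len(poly))      # 231 distinct monomials
print(poly[(1, 0)])   # coefficient of x: 20
```

Here the result stays tiny, but in a real amplitude calculation the term count can reach billions, and an all-in-RAM approach like this one (or like a naive Mathematica `Expand`) simply runs out of memory.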

I’d love it if this article gets FORM more attention, and more support. But also, I’d love it if it gives a window on the nuts and bolts of hard-core particle physics: the things people have to do to turn those T-shirt equations into predictions for actual colliders. It’s a world in between physics and computer science and mathematics, a big part of the infrastructure of how we know what we know that, precisely because it’s infrastructure, often ends up falling through the cracks.

Edit: For researchers interested in learning more about FORM, the workshop I mentioned at the end of the article is now online, with registrations open.