Author Archives: 4gravitons

arXiv, Our Printing Press

[Photo: Johannes Gutenberg, inventor of the printing press, and possibly the only photogenic thing on the Mainz campus]

I’ve had a few occasions to dig into older papers recently, and I’ve noticed a trend: old papers are hard to read!

Ok, that might not be surprising. The older a paper is, the greater the chance it will use obsolete notation, or assume a context that has long passed by. Older papers have different assumptions about what matters, or what rigor requires, and their readers cared about different things. All this is to be expected: a slow, gradual approach to a modern style and understanding.

I’ve been noticing, though, that this slow, gradual approach doesn’t always hold. Specifically, it seems to speed up quite dramatically at one point: the introduction of arXiv, the website where we store all our papers.

Part of this could just be a coincidence. As it happens, the founding papers in my subfield, those that started Amplitudes with a capital “A”, were right around the time that arXiv first got going. It could be that all I’m noticing is the difference between Amplitudes and “pre-Amplitudes”, with the Amplitudes subfield sharing notation more than they did before they had a shared identity.

But I suspect that something else is going on. With arXiv, we don’t just share papers (that was done, piecemeal, before arXiv). We also share LaTeX.

LaTeX is a document formatting language, like a programming language for papers. It’s used pretty much universally in physics and math, and increasingly in other fields. As it turns out, when we post a paper to arXiv, we don’t just send a pdf: we include the raw LaTeX code as well.
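For readers who haven’t seen it, raw LaTeX looks something like this (the equation is the standard Parke–Taylor formula, chosen as an illustration; the exact macros vary from paper to paper):

```latex
% An amplitude, as it might appear in a paper's source file
\begin{equation}
  \mathcal{A}_4 = \frac{\langle 1 2 \rangle^4}
                       {\langle 1 2 \rangle \langle 2 3 \rangle
                        \langle 3 4 \rangle \langle 4 1 \rangle}
\end{equation}
```

Copy that source and you reproduce the equation exactly, spinor brackets and all; re-typeset it from the pdf, and every notational choice is back up for grabs.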

Before arXiv, if you wanted to include an equation from another paper, you’d format it yourself. You’d probably do it a little differently from the other paper, in accord with your own conventions, and just to make it easier on yourself. Over time, more and more differences would crop up, making older papers harder and harder to read.

With arXiv, you can still do all that. But you can also just copy.

Since arXiv makes the LaTeX code behind a paper public, it’s easy to lift the occasional equation. Even if you’re not lifting it directly, you can see how they coded it. Even if you don’t plan on copying, the default gets flipped around: instead of having to try to make your equation like the one in the previous paper and accidentally getting it wrong, every difference is intentional.

This reminds me, in a small-scale way, of the effect of the printing press on anatomy books.

Before the printing press, books on anatomy tended to be full of descriptions, but not illustrations. Illustrations weren’t reliable: there was no guarantee the monk who copied them would do so correctly, so nobody bothered. This made it hard to tell when an anatomist (fine, it was always Galen) was wrong: he could just be using an odd description. It was only after the printing press that books could have illustrations that were reliable across copies. Suddenly, it was possible to point out that a fellow anatomist had left something out: it would be missing from the illustration!

In a similar way, arXiv seems to have led to increasingly standard notation. We still aren’t totally consistent…but we do seem a lot more consistent than older papers, and I think arXiv is the reason why.

Thought Experiments, Minus the Thought

My second-favorite Newton fact is that, despite inventing calculus, he refused to use it for his most famous work of physics, the Principia. Instead, he used geometrical proofs, tweaked to smuggle in calculus without admitting it.

Essentially, these proofs were thought experiments. Newton would start with a standard geometry argument, one that would have been acceptable to mathematicians centuries earlier. Then, he’d imagine taking it further, pushing a line or angle to some infinite point. He’d argue that, if the proof worked for every finite choice, then it should work in the infinite limit as well.

These thought experiments let Newton argue on the basis of something that looked more rigorous than calculus. However, they also held science back. At the time, only a few people in the world could understand what Newton was doing. It was only later, when Newton’s laws were reformulated in calculus terms, that a wider group of researchers could start doing serious physics.

What changed? If Newton could describe his physics with geometrical thought experiments, why couldn’t everyone else?

The trouble with thought experiments is that they require careful setup, setup that has to be thought through for each new thought experiment. Calculus took Newton’s geometrical thought experiments, and took out the need for thought: the setup was automatically a part of calculus, and each new researcher could build on their predecessors without having to set everything up again.
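Calculus does the limiting step once, for everyone. The modern derivative is exactly Newton’s “works for every finite choice, so take the limit” argument, packaged as a definition:

```latex
f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}
```

Once that definition is in place, nobody has to re-justify the limit for each new problem: you just differentiate.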

This sort of thing happens a lot in science. An example from my field is the scattering matrix, or S-matrix.

The S-matrix, deep down, is a thought experiment. Take some particles, and put them infinitely far away from each other, off in the infinite past. Then, let them approach, close enough to collide. If they do, new particles can form, and these new particles will travel out again, infinitely far away in the infinite future. The S-matrix, then, is a metaphorical matrix that tells you, for each possible set of incoming particles, the probability of getting each possible set of outgoing particles.
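As a toy illustration (entirely my own sketch, not how real calculations are done: actual S-matrices are infinite-dimensional and complex-valued), you can model a two-channel “S-matrix” as a small unitary matrix and read off probabilities:

```python
import math

# Toy two-channel "S-matrix": a real rotation matrix standing in for
# the unitary matrix of amplitudes. S[f][i] is the amplitude for
# incoming state i to come out as outgoing state f.
theta = 0.3  # an arbitrary mixing angle, not a physical parameter

S = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

# Probabilities are squared magnitudes of amplitudes.
probs = [[a ** 2 for a in row] for row in S]

# Unitarity means each incoming state must come out as *something*:
# the probabilities in each column sum to 1.
col_sums = [probs[0][i] + probs[1][i] for i in range(2)]
```

The unitarity check at the end is the toy version of a real physical constraint: probabilities for all possible outcomes must add up to one.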

In a real collider, the particles don’t come from infinitely far away, and they don’t travel infinitely far before they’re stopped. But the distances are long enough, compared to the sizes relevant for particle physics, that the S-matrix is the right idea for the job.

Like calculus, the S-matrix is a thought experiment minus the thought. When we want to calculate the probability of particles scattering, we don’t need to set up the whole thought experiment all over again. Instead, we can start by calculating, and over time we’ve gotten very good at it.

In general, sub-fields in physics can be divided into those that have found their S-matrices, their thought experiments minus thought, and those that have not. When a topic has to rely on thought experiments, progress is much slower: people argue over the details of each setup, and it’s difficult to build something that can last. It’s only when a field turns the corner, removing the thought from its thought experiments, that people can start making real collaborative progress.

Still Traveling

I’m still traveling this week, so this will be a short post.

Last year, when I went to Amplitudes I left Europe right after. This felt like a bit of a waste: an expensive, transcontinental flight, and I was only there for a week?

So this year, I resolved to visit a few more places. I was at the Niels Bohr Institute in Copenhagen earlier this week.

[Photo: the Niels Bohr Institute, where the live LHC collisions, represented as lights shining on the face of the building, are rather spoiled by the lack of any actual darkness to see them by]

Now, I’m at Mainz, visiting Johannes Henn.

Oddly enough, I’ve got family connections to both places. My great-grandfather spent some time at the Niels Bohr Institute on his way out of Europe, and I have a relative who works at Mainz. So while the primary purpose of this trip was research, I’ve gotten to learn a little family history in the process.

Amplitudes 2016

I’m at Amplitudes this week, in Stockholm.

[Photo: the land of twilight at 11pm]

Last year, I wrote a post giving a tour of the field. If I had to write it again this year, most of the categories would be the same, but the achievements listed would advance: more loops and legs, more complicated theories, and more insight.

The ambitwistor string now goes to two loops, while my collaborators and I have pushed the polylogarithm program to five loops (dedicated post on that soon!). A decent number of techniques can now be applied to QCD, including a differential equation-based method that was used to find a four-loop, three-particle amplitude. Others tied together different approaches, found novel structures in string theory, or linked amplitudes techniques to physics from other disciplines. The talks have been going up on YouTube pretty quickly, thanks to diligent work by Nordita’s tech guy, so if you’re at all interested check it out!

The (but I’m Not a) Crackpot Style Guide

Ok, ok, I believe you. You’re not a crackpot. You’re just an outsider, one with a brilliant new idea that would overturn the accepted paradigms of physics, if only someone would just listen.

Here’s the problem: you’re not alone. There are plenty of actual crackpots. We get contacted by them fairly regularly. And most of the time, they’re frustrating and unpleasant to deal with.

If you want physicists to listen to you, you need to show us you’re not one of those people. Otherwise, most of us won’t bother.

I can’t give you a foolproof way to do that. But I can give some suggestions that will hopefully make the process a little less frustrating for everyone involved.

Don’t spam:

Nobody likes spam. Nobody reads spam. If you send a mass email to every physicist whose email address you can find, none of them will read it. If you repeatedly post the same thing in a comment thread, nobody will read it. If you want people to listen to you, you have to show that you care about what they have to say, and in order to do that you have to tailor your message. This leads into the next point,

Ask the right people:

Before you start reaching out, you should try to get an idea of who to talk to. Physics is quite specialized, so if you’re taking your ideas seriously you should try to contact people with a relevant specialization.

Now, I know what you’re thinking: your ideas are unique, no-one in physics is working on anything similar.

Here, it’s important to distinguish the problem you’re trying to solve from how you’re trying to solve it. Chances are, no-one else is working on your specific idea…but plenty of people are interested in the same problems.

Think quantum mechanics is built on shoddy assumptions? There are people who spend their lives trying to modify quantum mechanics. Have a beef against general relativity? There’s a whole sub-field of people who modify gravity.

These people are a valuable resource for you, because they know what doesn’t work. They’ve been trying to change the system, and they know just how hard it is to change, and just what evidence you need to be consistent with.

Contacting someone whose work just uses quantum mechanics or relativity won’t work. If you’re making elementary mistakes, we can put you on the right track…but if you think you’re making elementary mistakes, you should start out by asking for help on a forum or the like, not contacting a professional. If you think you’ve really got a viable replacement for an established idea, you need to contact people who work on overturning established ideas, since they’re most aware of the complicated webs of implications involved. Relatedly,

Take ownership of your work:

I don’t know how many times someone has “corrected” something in the comments, and several posts later admitted that the “correction” comes from their own theory. If you’re arguing from your own work, own it! If you don’t, people will assume you’re trying to argue from an established theory, and are just confused about how that theory works. This is a special case of a broader principle,

Epistemic humility:

I’m not saying you need to be humble in general, but if you want to talk productively you need to be epistemically humble. That means being clear about why you know what you know. Did you get it from a mathematical proof? A philosophical argument? Reading pop science pieces? Something you remember from high school? Being clear about your sources makes it easier for people to figure out where you’re coming from, and avoids putting your foot in your mouth if it turns out your source is incomplete.

Context is crucial:

If you’re commenting on a blog like this one, pay attention to context. Your comment needs to be relevant enough that people won’t parse it as spam.

If all a post does is mention something like string theory, crowing about how your theory is a better explanation for quantum gravity isn’t relevant. Ditto for if all it does is mention a scientific concept that you think is mistaken.

What if the post is promoting something that you’ve found to be incorrect, though? What if someone is wrong on the internet?

In that case, it’s important to keep in mind the above principles. A popularization piece will usually try to present the establishment view, and merits a different response than a scientific piece arguing something new. In both cases, own your own ideas and be specific about how you know what you know. Be clear on whether you’re talking about something that’s controversial, or something that’s broadly agreed on.

You can get an idea of what works and what doesn’t by looking at comments on this blog. When I post about dark matter, or cosmic inflation, there are people who object, and the best ones are straightforward about why. Rather than opening with “you’re wrong”, they point out which ideas are controversial. They’re specific about whose ideas they’re referencing, and are clear about what is pedagogy and what is science.

Those comments tend to get much better responses than the ones that begin with cryptic condemnations, follow with links, and make absolute statements without backing them up.

On the internet, it’s easy for misunderstandings to devolve into arguments. Want to avoid that? Be direct, be clear, be relevant.

Book Review: The Invention of Science

I don’t get a lot of time to read for pleasure these days. When I do, it’s usually fiction. But I’ve always had a weakness for stories from the dawn of science, and David Wootton’s The Invention of Science: A New History of the Scientific Revolution certainly fit the bill.


Wootton’s book is a rambling tour of the early history of science, from Brahe’s nova in 1572 to Newton’s Optics in 1704. Tying everything together is one clear, central argument: that the scientific revolution involved, not just a new understanding of the world, but the creation of new conceptual tools. In other words, the invention of science itself.

Wootton argues this, for the most part, by tracing changes in language. Several chapters have a common structure: Wootton identifies a word, like evidence or hypothesis, that has an important role in how we talk about science. He then tracks that word back to its antecedents, showing how early scientists borrowed and coined the words they needed to describe the new type of reasoning they had pioneered.

Some of the most compelling examples come early on. Wootton points out that the word “discover” only became common in European languages after Columbus’s discovery of the New World: first in Portuguese, then later in the rest of Europe. Before then, the closest term meant something more like “find out”, and was ambiguous: it could refer to finding something that was already known to others. Thus, early writers had to use wordy circumlocutions like “found out that which was not known before” to refer to genuine discovery.

The book covers the emergence of new social conventions in a similar way. For example, I was surprised to learn that the first recorded priority disputes were in the sixteenth century. Before then, discoveries weren’t even typically named for their discoverers: “the Pythagorean theorem”, oddly enough, is a name that wasn’t used until after the scientific revolution was underway. Beginning with explorers arguing over the discovery of the New World and anatomists negotiating priority for identifying the bones of the ear or the “discovery” of the clitoris, the competitive element of science began to come into its own.

Along the way, Wootton highlights episodes both familiar and obscure. You’ll find Bruno and Torricelli, yes, but also disputes over whether the seas are higher than the land or whether a weapon could cure wounds it caused via the power of magnetism. For anyone as fascinated by the emergence of science as I am, it’s a joyous wealth of detail.

If I had one complaint, it would be that for a lay reader far too much of Wootton’s book is taken up by disputes with other historians. His particular foes are relativists, though he spares some paragraphs to attack realists too. Overall, his dismissals of his opponents are so pat, and his descriptions of their views so self-evidently silly, that I can’t help but suspect that he’s not presenting them fairly. Even if he is, the discussion is rather inside baseball for a non-historian like me.

I read part of Newton’s Principia in college, and I was hoping for a more thorough discussion of Newton’s role. While he does show up, Wootton seems to view Newton as a bit of an enigma: someone who insisted on using the old language of geometric proofs while clearly mastering the new science of evidence and experiment. In this book, Newton is very much a capstone, not a focus.

Overall, The Invention of Science is a great way to learn about the twists and turns of the scientific revolution. If you set aside the inter-historian squabbling (or if you like that sort of thing) you’ll find a book brim full of anecdotes from the dawn of modern thought, and a compelling argument that what we do as scientists is neither an accident of culture nor obvious common-sense, but a hard-won invention whose rewards we are still reaping today.

Most of String Theory Is Not String Pheno

Last week, Sabine Hossenfelder wrote a post entitled “Why not string theory?” In it, she argued that string theory has a much more dominant position in physics than it ought to: that it’s crowding out alternative theories like Loop Quantum Gravity and hogging much more funding than it actually merits.

If you follow the string wars at all, you’ve heard these sorts of arguments before. There’s not really anything new here.

That said, there were a few sentences in Hossenfelder’s post that got my attention, and inspired me to write this post.

So far, string theory has scored in two areas. First, it has proved interesting for mathematicians. But I’m not one to easily get floored by pretty theorems – I care about math only to the extent that it’s useful to explain the world. Second, string theory has shown to be useful to push ahead with the lesser understood aspects of quantum field theories. This seems a fruitful avenue and is certainly something to continue. However, this has nothing to do with string theory as a theory of quantum gravity and a unification of the fundamental interactions.

(Bolding mine)

Here, Hossenfelder explicitly leaves out string theorists who work on “lesser understood aspects of quantum field theories” from her critique. They’re not the big, dominant program she’s worried about.

What Hossenfelder doesn’t seem to realize is that right now, it is precisely the “aspects of quantum field theories” crowd that is big and dominant. The communities of string theorists working on something else, and especially those making bold pronouncements about the nature of the real world, are much, much smaller.

Let’s define some terms:

Phenomenology (or pheno for short) is the part of theoretical physics that attempts to make predictions that can be tested in experiments. String pheno, then, covers attempts to use string theory to make predictions. In practice, though, it’s broader than that: while some people do attempt to predict the results of experiments, more work on figuring out how models constructed by other phenomenologists can make sense in string theory. This still attempts to test string theory in some sense: if a phenomenologist’s model turns out to be true but it can’t be replicated in string theory then string theory would be falsified. That said, it’s more indirect. In parallel to string phenomenology, there is also the related field of string cosmology, which has a similar relationship with cosmology.

If other string theorists aren’t trying to make predictions, what exactly are they doing? Well, a large number of them are studying quantum field theories. Quantum field theories are currently our most powerful theories of nature, but there are many aspects of them that we don’t yet understand. For a large proportion of string theorists, string theory is useful because it provides a new way to understand these theories in terms of different configurations of string theory, which often uncovers novel and unexpected properties. This is still physics, not mathematics: the goal, in the end, is to understand theories that govern the real world. But it doesn’t involve the same sort of direct statements about the world as string phenomenology or string cosmology: crucially, it doesn’t depend on whether string theory is true.

Last week, I said that before replying to Hossenfelder’s post I’d have to gather some numbers. I was hoping to find some statistics on how many people work on each of these fields, or on their funding. Unfortunately, nobody seems to collect statistics broken down by sub-field like this.

As a proxy, though, we can look at conferences. Strings is the premier conference in string theory. If something has high status in the string community, it will probably get a talk at Strings. So to investigate, I took a look at the talks given last year, at Strings 2015, and broke them down by sub-field.

[Pie chart: Strings 2015 talks, broken down by sub-field]

Here I’ve left out the historical overview talks, since they don’t say much about current research.

“QFT” is for talks about lesser understood aspects of quantum field theories. Amplitudes, my own sub-field, should be part of this: I’ve separated it out to show what a typical sub-field of the QFT block might look like.

“Formal Strings” refers to research into the fundamentals of how to do calculations in string theory: in principle, both the QFT folks and the string pheno folks find it useful.

“Holography” is a sub-topic of string theory in which string theory in some space is equivalent to a quantum field theory on the boundary of that space. Some people study this because they want to learn about quantum field theory from string theory, others because they want to learn about quantum gravity from quantum field theory. Since the field can’t be cleanly divided into quantum gravity and quantum field theory research, I’ve given it its own category.

While all string theory research is in principle about quantum gravity, the “Quantum Gravity” section refers to people focused on the sorts of topics that interest non-string quantum gravity theorists, like black hole entropy.

Finally, we have String Cosmology and String Phenomenology, which I’ve already defined.

Don’t take the exact numbers here too seriously: not every talk fit cleanly into a category, so there were some judgement calls on my part. Nonetheless, this should give you a decent idea of the makeup of the string theory community.

The biggest wedge in the diagram by far, taking up a majority of the talks, is QFT. Throw in Amplitudes (part of QFT) and Formal Strings (useful to both), and you’ve got two thirds of the conference. Even if you believe Hossenfelder’s tale of the failures of string theory, then, her critique only applies to a third of this diagram. And once you take into account that many of the Holography and Quantum Gravity people are interested in aspects of QFT as well, you’re looking at an even smaller group. Really, Hossenfelder’s criticism is aimed at two small slices on the chart: String Pheno, and String Cosmo.

Of course, string phenomenologists also have their own conference. It’s called String Pheno, and last year it had 130 participants. In contrast, Loops 2015, the conference for string theory’s most famous “rival”, had…190 participants. The fields are really pretty comparable.

Now, I have a lot more sympathy for the string phenomenologists and string cosmologists than I do for loop quantum gravity. If other string theorists felt the same way, then maybe that would cause the sort of sociological effect that Hossenfelder is worried about.

But in practice, I don’t think this happens. I’ve met string theorists who didn’t even know that people still did string phenomenology. The two communities are almost entirely disjoint: string phenomenologists and string cosmologists interact much more with other phenomenologists and cosmologists than they do with other string theorists.

You want to talk about sociology? Sociologically, people choose careers and fund research because they expect something to happen soon. People don’t want to be left high and dry by a dearth of experiments, and don’t feel comfortable working on something that may only be vindicated long after they’re dead. Most people choose the safe option, the one that, even if it’s still aimed at a distant goal, is also producing interesting results now (aspects of quantum field theories, for example).

The people that don’t? Tend to form small, tight-knit, passionate communities. They carve out a few havens of like-minded people, and they think big thoughts while the world around them seems to only care about their careers.

If you’re a loop quantum gravity theorist, or a quantum gravity phenomenologist like Hossenfelder, and you see some of your struggles in that paragraph, please realize that string phenomenology is like that too.

I feel like Hossenfelder imagines a world in which string theory is struck from its high place, and alternative theories of quantum gravity are of comparable size and power. But from where I’m sitting, it doesn’t look like it would work out that way. Instead, you’d have alternatives grow to the same size as similarly risky parts of string theory, like string phenomenology. And surprise, surprise: they’re already that size.

In certain corners of the internet, people like to argue about “punching up” and “punching down”. Hossenfelder seems to think she’s “punching up”, giving the big dominant group a taste of its own medicine. But by leaving out string theorists who study QFTs, she’s really “punching down”, or at least sideways, and calling out a sub-group that doesn’t have much more power than her own.

Quick Post

I’m traveling this week, so I don’t have time for a long post. I am rather annoyed with Sabine Hossenfelder’s recent post about string theory, but I don’t have time to write much about it now.

(Broadly speaking, she dismisses string theory’s success in investigating quantum field theories as irrelevant to string theory’s dominance, but as far as I’ve seen the only part of string theory that has any “institutional dominance” at all is the “investigating quantum field theories” part, while string theorists who spend their time making statements about the real world are roughly as “marginalized” as non-string quantum gravity theorists. But I ought to gather some numbers before I really commit to arguing this.)

Fun with Misunderstandings

Perimeter had its last Public Lecture of the season this week, with Mario Livio giving some highlights from his book Brilliant Blunders. The lecture should be accessible online, either here or on Perimeter’s YouTube page.

These lectures tend to attract a crowd of curious science-fans. To give them something to do while they’re waiting, a few local researchers walk around with T-shirts that say “Ask me, I’m a scientist!” Sometimes we get questions about the upcoming lecture, but more often people just ask us what they’re curious about.

Long-time readers will know that I find this one of the most fun parts of the job. In particular, there’s a unique challenge in figuring out just why someone asked a question. Often, there’s a hidden misunderstanding they haven’t recognized.

The fun thing about these misunderstandings is that they usually make sense, provided you’re working from the person in question’s sources. They heard a bit of this and a bit of that, and they come to the most reasonable conclusion they can given what’s available. For those of us who have heard a more complete story, this often leads to misunderstandings we would never have thought of, but that in retrospect are completely understandable.

One of the simpler ones I ran into was someone who was confused by people claiming that we were running out of water. How could there be a water shortage, he asked, if the Earth is basically a closed system? Where could the water go?

The answer is that when people are talking about a water shortage, they’re not talking about water itself running out. Rather, they’re talking about a lack of safe drinking water. Maybe the water is polluted, or stuck in the ocean, unusable without expensive desalination. This seems like the sort of thing that would be extremely obvious, but if you just hear people complaining that water is running out, without the right context, you might just not end up hearing that part of the story.

A more involved question had to do with time dilation in general relativity. The guy had heard that atomic clocks run faster if you’re higher up, and that this was because time itself runs faster in lower gravity.

Given that, he asked, what happens if someone travels to an area of low gravity and then comes back? If more time has passed for them, then they’d be in the future, so wouldn’t they be at the “wrong time” compared to other people? Would they even be able to interact with them?

This guy’s misunderstanding came from hearing what happens, but not why. While he got that time passes faster in lower gravity, he was still thinking of time as universal: there is some past, and some future, and if time passes faster for one person and slower for another that just means that one person is “skipping ahead” into the other person’s future.

What he was missing was the explanation that time dilation comes from space and time bending. Rather than “skipping ahead”, a person for whom time passes faster just experiences more time getting to the same place, because they’re traveling on a curved path through space-time.
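For the curious, the size of the clock effect comes from the standard weak-field formula, which isn’t in the discussion above but is behind the atomic-clock fact: a clock raised by a height h runs fast by a fraction of roughly gh/c². The numbers below are just illustrative:

```python
g = 9.81       # surface gravity, m/s^2
c = 2.998e8    # speed of light, m/s
h = 1.0e4      # height difference: 10 km, say

# Fractional rate at which the higher clock runs fast
# (weak-field approximation, valid near Earth's surface).
rate_shift = g * h / c**2
# A part in ~10^12: tiny, but well within reach of atomic clocks.
```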

As usual, this is easier to visualize in space than in time. I ended up drawing a picture like this:

[Sketch: persons A and B traveling the same angle around circles of different radii]

Imagine person A and person B live on a circle. If person B stays the same distance from the center while person A goes out further, they can both travel the same angle around the circle and end up in the same place, but A will have traveled further, even ignoring the trips up and down.
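The circle picture can be checked with a couple of lines of arithmetic (radii chosen arbitrarily for illustration):

```python
import math

# Person B circles at radius 1; person A sweeps the same quarter-turn
# at radius 2 (illustrative numbers).
angle = math.pi / 2          # both sweep the same angle
r_B, r_A = 1.0, 2.0

path_B = r_B * angle         # arc length = radius * angle
path_A = r_A * angle

# Same starting ray, same final ray, but A covers twice the distance.
ratio = path_A / path_B
```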

What’s completely intuitive in space ends up quite a bit harder to visualize in time. But if you at least know what you’re trying to think about, that there’s bending involved, then it’s easier to avoid this guy’s kind of misunderstanding. Run into the wrong account, though, and even if it’s perfectly correct (this guy had heard some of Hawking’s popularization work on the subject), you can come away with the wrong impression if it doesn’t emphasize the right aspects.

Misunderstandings are interesting because they reveal how people learn. They’re windows into different thought processes, into what happens when you only have partial evidence. And because of that, they’re one of the most fascinating parts of science popularization.

Mass Is Just Energy You Haven’t Met Yet

How can colliding two protons give rise to more massive particles? Why do vibrations of a string have mass? And how does the Higgs work anyway?

There is one central misunderstanding that makes each of these topics confusing. It’s something I’ve brought up before, but it really deserves its own post. It’s people not realizing that mass is just energy you haven’t met yet.

It’s quite intuitive to think of mass as some sort of “stuff” that things can be made out of. In our everyday experience, that’s how it works: combine this mass of flour and this mass of sugar, and get this mass of cake. Historically, it was the dominant view in physics for quite some time. However, once you get to particle physics it starts to break down.

It’s probably most obvious for protons. A proton has a mass of 938 MeV/c², or 1.6×10⁻²⁷ kg in less physicist-specific units. Protons are each made of three quarks, two up quarks and a down quark. Naively, you’d think that the quarks would have to be around 300 MeV/c². They’re not, though: up and down quarks both have masses less than 10 MeV/c². Those three quarks account for less than a fiftieth of a proton’s mass.
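The arithmetic here is easy to check. The quark masses below are illustrative values consistent with the “less than 10 MeV/c²” figure above (quark masses are scheme-dependent, so take them as rough):

```python
# Rough current-quark masses, MeV/c^2 (illustrative assumptions)
m_up, m_down = 2.2, 4.7
m_proton = 938.3                   # MeV/c^2

quark_total = 2 * m_up + m_down    # two ups + one down
fraction = quark_total / m_proton  # well under a fiftieth
```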

The “extra” mass is because a proton is not just three quarks. It’s three quarks interacting. The forces between those quarks, the strong nuclear force that binds them together, involves a heck of a lot of energy. And from a distance, that energy ends up looking like mass.

This isn’t unique to protons. In some sense, it’s just what mass is.

The quarks themselves get their mass from the Higgs field. Far enough away, this looks like the quarks having a mass. However, zoom in and it’s energy again, the energy of interaction between quarks and the Higgs. In string theory, mass comes from the energy of vibrating strings. And so on. Every time we run into something that looks like a fundamental mass, it ends up being just another energy of interaction.

If mass is just energy, what about gravity?

When you’re taught about gravity, the story is all about mass. Mass attracts mass. Mass bends space-time. What gets left out, until you actually learn the details of General Relativity, is that energy gravitates too.

Normally you don’t notice this, because mass contributes so much more to energy than anything else. That’s what E=mc² is really about: it’s a unit conversion formula. It tells you that if you want to know how much energy a given mass “really is”, you multiply it by the speed of light squared. And that’s a large enough number that most of the time, when you notice energy gravitating, it’s because that energy looks like a big chunk of mass. (It’s also why physicists like silly units like MeV/c² for mass: we can just multiply by c² and get an energy!)
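As a quick sketch of the unit conversion (the constants are standard; the example is mine):

```python
c = 2.998e8     # speed of light, m/s

# E = m c^2 as a unit conversion: how much energy is one kilogram?
m = 1.0                  # kg
E = m * c**2             # joules; about 9 * 10^16 J

# The physicist's trick in reverse: a mass quoted in MeV/c^2,
# multiplied by c^2, is just that many MeV of energy.
proton_mass = 938.0               # MeV/c^2
proton_rest_energy = proton_mass  # MeV, after the c^2 cancels
```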

It’s really tempting to think about mass as a substance, of mass as always conserved, of mass as fundamental. But in physics we often have to toss aside our everyday intuitions, and this is no exception. Mass really is just energy. It’s just energy that we’ve “zoomed out” enough not to notice.