Tag Archives: quantum field theory

String Theorists Who Don’t Touch Strings

This week I’ve been busy, attending a workshop here at Perimeter on Superstring Perturbation Theory.

Superstrings are the supersymmetric strings that string theorists use to describe fundamental particles, while perturbation theory is the trick, common in almost every area of physics, of solving a problem by a series of increasingly precise approximations.

Based on that description, you’d think that superstring perturbation theory would be a central topic in string theory research. You wouldn’t expect it to be the sort of thing only a few people at the top of the field dabble in. You definitely wouldn’t expect one of the speakers at the workshop to mention that this might be the first conference on superstring perturbation theory he’s been to since the 1980’s.

String perturbation theory is an important subject, but it’s not one many string theorists use. And the reason why is that, oddly enough, very few string theorists actually use strings.

Looking at arXiv as I’m writing this, I can see only one paper in the theoretical physics section that directly uses strings. Most of them use something else: either older concepts like black holes, quantum field theory, and supergravity, or newer ones like D-branes. If you talked to the people who wrote those papers, though, most of them would describe themselves as string theorists.

The reason for the disconnect is that string theory as a field is much more than just the study of strings. String theory is a ten-dimensional universe (or eleven with M theory), where different ways of twisting up some of the dimensions result in different apparent physics in the remaining ones. It’s got strings, but also higher-dimensional membranes (and in the eleven dimensions of M theory it only has membranes, not strings). It’s the recipe for a long list of exotic quantum field theories, and a list of possible relations between them. It’s a new way to look at geometry, to think about the intersection of the nature of space and the dynamics of what inhabits it.

If string theory were really just about strings, it likely wouldn’t have grown any bigger than its quantum gravity rivals, like Loop Quantum Gravity. String theory grew because it inspired research directions that went far afield, and far beyond its conceptual core.

That’s part of why most string theorists will be baffled if you insist that string theory needs proof, or that it’s not the right approach to quantum gravity. For most string theorists, it doesn’t matter whether we live in a stringy world, whether gravity might eventually be described by another model. For most string theorists, string theory is a tool, one that opened up fields of inquiry that don’t have much to do with predicting the output of the LHC or describing the early universe. Or, in many cases, actually using strings.

Want to Make Something New? Just Turn on the Lights.

Isn’t it weird that you can collide two protons, and get something else?

It wouldn’t be so weird if you collided two protons, and out popped a quark. After all, protons are made of quarks. But how, if you collide two protons together, do you get a tau, or the Higgs boson: things that not only aren’t “part of” protons, but are more massive than a proton by themselves?

It seems weird…but in a way, it’s not. When a particle releases another particle that wasn’t inside it to begin with, it’s actually not doing anything more special than an everyday light bulb.

Eureka!

How does a light bulb work?

You probably know the basics: when an electrical current enters the bulb, the electrons in the filament start to move. They heat the filament up, releasing light.

That probably seems perfectly ordinary. But ask yourself for a moment: where did the light come from?

Light is made up of photons, elementary particles in their own right. When you flip a light switch, where do the photons come from? Were they stored in the light bulb?

Silly question, right? You don’t need to “store” light in a light bulb: light bulbs transform one type of energy (electrical, or the movement of electrons) into another type of energy (light, or photons).

Here’s the thing, though: mass is just another type of energy.

I like to describe mass as “energy we haven’t met yet”. Einstein’s equation, E=mc^2, relates a particle’s mass to its “rest energy”, the energy it would have if it stopped moving around and sat still. Even when a particle seems to be sitting still from the outside, though, there’s still a lot going on. “Composite” particles like protons have powerful forces between their internal quarks, while particles like electrons interact with the Higgs field. These processes give the particle energy, even when it’s not moving, so from our perspective on the outside they’re giving the particle mass.
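As a quick sanity check of E=mc^2, here’s a sketch in Python using standard textbook values for the constants. It recovers the ~938 MeV rest energy that particle physicists quote for the proton:

```python
c = 299_792_458.0        # speed of light in m/s (exact, by definition)
m_proton = 1.6726e-27    # proton mass in kg (standard textbook value)

rest_energy_joules = m_proton * c**2             # E = m c^2
rest_energy_eV = rest_energy_joules / 1.602e-19  # 1 eV = 1.602e-19 J

print(rest_energy_eV / 1e6)  # roughly 938, the proton's rest energy in MeV
```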

What does that mean for the protons at the LHC?

The protons at the LHC have a lot of kinetic energy: they’re going 99.9999991% of the speed of light! When they collide, all that energy has to go somewhere. Just like in a light bulb, the fast-moving particles will release their energy in another form. And while some of that energy will add to the speed of the fragments, much of it will go into the mass and energy of new particles. Some of these particles will be photons, some will be tau leptons, or Higgs bosons…pretty much anything that the protons have enough energy to create.
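To see just how much energy that speed represents, you can plug the 99.9999991% figure into the relativistic energy formula E = γmc². A sketch (the proton rest energy is the standard value; everything else follows from the speed quoted above):

```python
import math

beta = 0.999999991                       # LHC proton speed as a fraction of c
gamma = 1.0 / math.sqrt(1.0 - beta**2)   # relativistic Lorentz factor

m_proton_MeV = 938.272                   # proton rest energy in MeV
total_energy_TeV = gamma * m_proton_MeV / 1e6

print(gamma)             # roughly 7500: the proton's energy boost factor
print(total_energy_TeV)  # roughly 7 TeV per proton, the LHC's design beam energy
```

Almost all of that 7 TeV is kinetic: the rest energy is less than a thousandth of a percent of the total, which is why collisions have so much energy to spend on creating new particles.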

So if you want to understand how to create new particles, you don’t need a deep understanding of the mysteries of quantum field theory. Just turn on the lights.

What Counts as a Fundamental Force?

I’m giving a presentation next Wednesday for Learning Unlimited, an organization that presents educational talks to seniors in Woodstock, Ontario. The talk introduces the fundamental forces and talks about Yang and Mills before moving on to introduce my work.

While practicing the talk today, someone from Perimeter’s outreach department pointed out a rather surprising missing element: I never mention gravity!

Most people know that there are four fundamental forces of nature. There’s Electromagnetism, there’s Gravity, there’s the Weak Nuclear Force, and there’s the Strong Nuclear Force.

Listed here by their most significant uses.

What ties these things together, though? What makes them all “fundamental forces”?

Mathematically, gravity is the odd one out here. Electromagnetism, the Weak Force, and the Strong Force all share a common description: they’re Yang-Mills forces. Gravity isn’t. While you can sort of think of it as a Yang-Mills force “squared”, it’s quite a bit more complicated than the Yang-Mills forces.

You might be objecting that the common trait of the fundamental forces is obvious: they’re forces! And indeed, you can write down a force law for gravity, and a force law for E&M, and umm…

[Mumble Mumble]

Ok, it’s not quite as bad as xkcd would have us believe. You can actually write down a force law for the weak force, if you really want to, and it’s at least sort of possible to talk about the force exerted by the strong interaction.

All that said, though, why are we thinking about this in terms of forces? Forces are a concept from classical mechanics. For a beginning physics student, they come up again and again, in free-body diagram after free-body diagram. But by the time a student learns quantum mechanics and quantum field theory, they’ve already learned other ways of framing things where forces aren’t mentioned at all. So while forces are familiar to people starting out, they don’t really map onto anything that most quantum field theorists work with, and it’s a bit odd to classify things that only really appear in quantum field theory (the Weak Nuclear Force, the Strong Nuclear Force) based on whether or not they’re forces.

Isn’t there some connection, though? After all, gravity, electromagnetism, the strong force, and the weak force may be different mathematically, but at least they all involve bosons.

Well, yes. And so does the Higgs.

The Higgs is usually left out of listings of the fundamental forces, because it’s not really a “force”. It doesn’t have a direction; instead, it works equally at every point in space. But if you include spin 2 gravity and spin 1 Yang-Mills forces, why not also include the spin 0 Higgs?

Well, if you’re doing that, why not include fermions as well? People often think of fermions as “matter” and bosons as “energy”, but in fact both have energy, and neither is made of it. Electrons and quarks are just as fundamental as photons and gluons and gravitons, just as central a part of how the universe works.

I’m still trying to decide whether my presentation about Yang-Mills forces should also include gravity. On the one hand, it would make everything more familiar. On the other…pretty much this entire post.

Explanations of Phenomena Are All Alike; Every Unexplained Phenomenon Is Unexplained in Its Own Way

Vladimir Kazakov began his talk at ICTP-SAIFR this week with a variant of Tolstoy’s famous opening to the novel Anna Karenina: “Happy families are all alike; every unhappy family is unhappy in its own way.” Kazakov flipped the order: “Un-solvable models are each un-solvable in their own way, while solvable models are all alike.”

In talking about solvable and un-solvable models, Kazakov was referring to a concept called integrability, the idea that in certain quantum field theories it’s possible to avoid the messy approximations of perturbation theory and instead jump straight to the answer. Kazakov was observing that these integrable systems seem to have a deep kinship: the same basic methods appear to work to understand all of them.

I’d like to generalize Kazakov’s point, and talk about a broader trend in physics.

Much has been made over the years of the “unreasonable effectiveness of mathematics in the natural sciences”, most notably in physicist Eugene Wigner’s famous essay, The Unreasonable Effectiveness of Mathematics in the Natural Sciences. There’s a feeling among some people that mathematics is much better at explaining physical phenomena than one would expect, that the world appears to be “made of math” and that it didn’t have to be.

On the surface, this is a reasonable claim. Certain mathematical ideas, group theory for example, seem to pop up again and again in physics, sometimes in wildly different contexts. The history of fundamental physics has tended to see steady progress over the years, from clunkier mathematical concepts to more and more elegant ones.

Some physicists tend to be dismissive of this. Lee Smolin in particular seems to be under the impression that mathematics is just particularly good at providing useful approximations. This perspective links to his definition of mathematics as “the study of systems of evoked relationships inspired by observations of nature,” a definition to which Peter Woit vehemently objects. Woit argues what I think any mathematician would when presented by a statement like Smolin’s: that mathematics is much more than just a useful tool for approximating observations, and that contrary to physicists’ vanity most of mathematics goes on without any explicit interest in observing the natural world.

While it’s generally rude for physicists to propose definitions for mathematics, I’m going to do so anyway. I think the following definition is one mathematicians would be more comfortable with, though it may be overly broad: Mathematics is the study of simple rules with complex consequences.

We live in a complex world. The breadth of the periodic table, the vast diversity of life, the tangled webs of galaxies across the sky, these are things that display both vast variety and a sense of order. They are, in a rather direct way, the complex consequences of rules that are at heart very very simple.

Part of the wonder of modern mathematics is how interconnected it has become. Many sub-fields, once distinct, have discovered over the years that they are really studying different aspects of the same phenomena. That’s why when you see a proof of a three-hundred-year-old mathematical conjecture, it uses terms that seem to have nothing to do with the original problem. It’s why Woit, in an essay on this topic, quotes Edward Frenkel’s description of a particular recent program as a blueprint for a “Grand Unified Theory of Mathematics”. Increasingly, complex patterns are being shown to be not only consequences of simple rules, but consequences of the same simple rules.

Mathematics itself is “unreasonably effective”. That’s why, when faced with a complex world, we shouldn’t be surprised when the same simple rules pop up again and again to explain it. That’s what explaining something is: breaking down something complex into the simple rules that give rise to it. And as mathematics progresses, it becomes more and more clear that a few closely related types of simple rules lie behind any complex phenomena. While each unexplained fact about the universe may seem unexplained in its own way, as things are explained bit by bit they show just how alike they really are.

The Real Problem with Fine-Tuning

You’ve probably heard it said that the universe is fine-tuned.

The Standard Model, our current best understanding of the rules that govern particle physics, is full of lots of fiddly adjustable parameters. The masses of fundamental particles and the strengths of the fundamental forces aren’t the sort of thing we can predict from first principles: we need to go out, do experiments, and find out what they are. And you’ve probably heard it argued that, if these fiddly parameters were even a little different from what they are, life as we know it could not exist.

That’s fine-tuning…or at least, that’s what many people mean when they talk about fine-tuning. It’s not exactly what physicists mean though. The thing is, almost nobody who studies particle physics thinks the parameters of the Standard Model are the full story. In fact, any theory with adjustable parameters probably isn’t the full story.

It all goes back to a point I made a while back: nature abhors a constant. The whole purpose of physics is to explain the natural world, and we have a long history of taking things that look arbitrary and linking them together, showing that reality has fewer parameters than we had thought. This is something physics is very good at. (To indulge in a little extremely amateurish philosophy, it seems to me that this is simply an inherent part of how we understand the world: if we encounter a parameter, we will eventually come up with an explanation for it.)

Moreover, at this point we have a rough idea of what this sort of explanation should look like. We have experience playing with theories that don’t have any adjustable parameters, or that only have a few: M theory is an example, but there are also more traditional quantum field theories that fill this role with no mention of string theory. From our exploration of these theories, we know that they can serve as the kind of explanation we need: in a world governed by one of these theories, people unaware of the full theory would observe what would look at first glance like a world with many fiddly adjustable parameters, parameters that would eventually turn out to be consequences of the broader theory.

So for a physicist, fine-tuning is not about those fiddly parameters themselves. Rather, it’s about the theory that predicts them. Because we have experience playing with these sorts of theories, we know roughly the sorts of worlds they create. What we know is that, while sometimes they give rise to worlds that appear fine-tuned, they tend to only do so in particular ways. Setups that give rise to fine-tuning have consequences: supersymmetry, for example, can give rise to an apparently fine-tuned universe but has to have “partner” particles that show up in powerful enough colliders. In general, a theory that gives rise to apparent fine-tuning will have some detectable consequences.

That’s where physicists start to get worried. So far, we haven’t seen any of these detectable consequences, and it’s getting to the point where we could have, had they been the sort many people expected.

Physicists are worried about fine-tuning, but not because it makes the universe “unlikely”. They’re worried because the more finely-tuned our universe appears, the harder it is to find an explanation for it in terms of the sorts of theories we’re used to working with, and the less likely it becomes that someone will discover a good explanation any time soon. We’re quite confident that there should be some explanation, hundreds of years of scientific progress strongly suggest that to be the case. But the nature of that explanation is becoming increasingly opaque.

Why You Should Be Skeptical about Faster-than-Light Neutrinos

While I do love science, I don’t always love IFL Science. They can be good at drumming up enthusiasm, but they can also be ridiculously gullible. Case in point: last week, IFL Science ran a piece on a recent paper purporting to give evidence for faster-than-light particles.

Faster than light! Sounds cool, right? Here’s why you should be skeptical:

If a science article looks dubious, you should check out the source. In this case, IFL Science links to an article on the preprint server arXiv.

arXiv is a freely accessible website where physicists and mathematicians post their articles. The site has multiple categories, corresponding to different fields. It’s got categories for essentially any type of physics you’d care to include, with the option to cross-list if you think people from multiple areas might find your work interesting.

So which category is this paper in? Particle physics? Astrophysics?

General Physics, actually.

General Physics is arXiv’s catch-all category. Some of it really is general, and can’t be put into any more specific place. But most of it, including this, falls into another category: things arXiv’s moderators think are fishy.

arXiv isn’t a journal. If you follow some basic criteria, it won’t reject your articles. Instead, dubious articles are put into General Physics, to signify that they don’t seem to belong with the other scholarship in the established categories. General Physics is a grab-bag of weird ideas and crackpot theories, a mix of fringe physicists and overenthusiastic amateurs. There probably are legitimate papers in there too…but for every paper in there, you can guarantee that some experienced researcher found it suspicious enough to send into exile.

Even if you don’t trust the moderators of arXiv, there are other reasons to be wary of faster-than-light particles.

According to Einstein’s theory of relativity, massless particles travel at the speed of light, while massive particles always travel slower. To travel faster than the speed of light, you need to have a very unusual situation: a particle whose mass is an imaginary number.
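You can see where the imaginary number comes from directly in relativity’s energy formula, E = mc²/√(1 − v²/c²). For v faster than c, the square root is imaginary, so the only way for the energy to stay real is for the mass to be imaginary too. A minimal sketch, with units chosen so c = 1 and a made-up energy:

```python
import cmath

def mass_for_energy(E, v, c=1.0):
    """Invert E = m c^2 / sqrt(1 - v^2/c^2) to solve for the mass m."""
    return E * cmath.sqrt(1 - (v / c) ** 2) / c**2

print(mass_for_energy(1.0, 0.5))  # slower than light: an ordinary real mass
print(mass_for_energy(1.0, 2.0))  # faster than light: a purely imaginary mass
```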

Particles like that are called tachyons, and they’re a staple of science fiction. While there was a time when they were a serious subject of physics speculation, nowadays the general view is that tachyons are a sign we’re making bad assumptions.

Assuming that someone is a Republic serial villain is a good example.

Why is that? It has to do with the nature of mass.

In quantum field theory, what we observe as particles arise as ripples in quantum fields, extending across space and time. The harder it is to make the field ripple, the higher the particle’s mass.

A tachyon has imaginary mass. This means that it isn’t hard to make the field ripple at all. In fact, exactly the opposite happens: it’s easier to ripple than to stay still! Any ripple, no matter how small, will keep growing until it’s not just a ripple, but a new default state for the field. Only when it becomes hard to change again will the changes stop. If it’s hard to change, though, then the particle has a normal, non-imaginary mass, and is no longer a tachyon!

Thus, the modern understanding is that if a theory has tachyons in it, it’s because we’re assuming that one of the quantum fields has the wrong default state. Switching to the correct default gets rid of the tachyons.
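The “wrong default state” picture can be illustrated with a toy potential. The numbers below are invented; only the signs matter for the argument. Around the wrong default (the top of the hill) the curvature is negative, a tachyon; around the true default it’s positive, an ordinary mass:

```python
import math

# Toy potential V(phi) = (1/2) m2 phi^2 + (1/4) lam phi^4 with m2 < 0.
m2, lam = -1.0, 0.5

def V(phi):
    return 0.5 * m2 * phi**2 + 0.25 * lam * phi**4

def Vpp(phi):
    # Second derivative of V: the curvature plays the role of mass-squared.
    return m2 + 3 * lam * phi**2

phi_true = math.sqrt(-m2 / lam)  # the correct default state for the field

print(Vpp(0.0))              # negative: a tachyon around the wrong default
print(Vpp(phi_true))         # positive: an ordinary mass around the true default
print(V(phi_true) < V(0.0))  # the true default really is lower in energy
```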

There are deeper problems with the idea proposed in this paper. Normally, the only types of fields that can have tachyons are scalars, fields that can be defined by a single number at each point, sort of like a temperature. The particles this article is describing aren’t scalars, though, they’re fermions, the type of particle that includes everyday matter like electrons. Those sorts of particles can’t be tachyons at all without breaking some fairly important laws of physics. (For a technical explanation of why this is, Lubos Motl’s reply to the post here is pretty good.)

Of course, this paper’s author knows all this. He’s well aware that he’s suggesting bending some fairly fundamental laws, and he seems to think there’s room for it. But that, really, is the issue here: there’s room for it. The paper isn’t, as IFL Science seems to believe, six pieces of evidence for faster-than-light particles. It’s six measurements that, if you twist them around and squint and pick exactly the right model, have room for faster-than-light particles. And that’s…probably not worth an article.

Why I Can’t Explain Ghosts: Or, a Review of a Popular Physics Piece

Since today is Halloween, I really wanted to write a post talking about the spookiest particles in physics, ghosts.

And their superpartners, ghost riders.

The problem is, in order to explain ghosts I’d have to explain something called gauge symmetry. And gauge symmetry is quite possibly the hardest topic in modern physics to explain to a general audience.

Deep down, gauge symmetry is the idea that irrelevant extra parts of how we represent things in physics should stay irrelevant. While that sounds obvious, it’s far from obvious how you can go from that to predicting new particles like the Higgs boson.

Explaining this is tough! Tough enough that I haven’t thought of a good way to do it yet.

Which is why I was fairly stoked when a fellow postdoc pointed out a recent popular physics article by Juan Maldacena, explaining gauge symmetry.

Juan Maldacena is a Big Deal. He’s the guy who figured out the AdS/CFT correspondence, showing that string theory (in a particular hyperbola-shaped space called AdS) and everybody’s favorite N=4 super Yang-Mills theory are secretly the same, a discovery which led to a Big Blue Dot on Paperscape. So naturally, I was excited to see what he had to say.

Big Blue Dot pictured here.


The core analogy he makes is with currencies in different countries. Just like gauge symmetry, currencies aren’t measuring anything “real”: they’re arbitrary conventions put in place because we don’t have a good way of just buying things based on pure “value”. However, also like gauge symmetry, they can have real-life consequences, as different currency exchange rates can lead to currency speculation, letting some people make money and others lose money. In Maldacena’s analogy the Higgs field works like a precious metal, making differences in exchange rates manifest as different prices of precious metals in different countries.

It’s a solid analogy, and one that is quite close to the real mathematics of the problem (as the paper’s Appendix goes into detail to show). However, I have some reservations, both about the paper as a whole and about the core analogy.

In general, Maldacena doesn’t do a very good job of writing something publicly accessible. There’s a lot of stilted, academic language, and a lot of use of “we” to do things other than lead the reader through a thought experiment. There’s also a sprinkling of terms that I don’t think the average person will understand; for example, I doubt the average college student knows flux as anything other than a zany card game.

Regarding the analogy itself, I think Maldacena has fallen into the common physicist trap of making an analogy that explains things really well…if you already know the math.

This is a problem I see pretty frequently. I keep picking on this article, and I apologize for doing so, but it’s got a great example of this when it describes supersymmetry as involving “a whole new class of number that can be thought of as the square roots of zero”. That’s a really great analogy…if you’re a student learning about the math behind supersymmetry. If you’re not, it doesn’t tell you anything about what supersymmetry does, or how it works, or why anyone might study it. It relates something unfamiliar to something unfamiliar.

I’m worried that Maldacena is doing that in this paper. His setup is mathematically rigorous, but doesn’t say much about the why of things: why do physicists use something like this economic model to understand these forces? How does this lead to what we observe around us in the real world? What’s actually going on, physically? What do particles have to do with dimensionless constants? (If you’re curious about that last one, I like to think I have a good explanation here.)

It’s not that Maldacena ignores these questions, he definitely puts effort into answering them. The problem is that his analogy itself doesn’t really address them. They’re the trickiest part, the part that people need help picturing and framing, the part that would benefit the most from a good analogy. Instead, the core imagery of the piece is wasted on details that don’t really do much for a non-expert.

Maybe I’m wrong about this, and I welcome comments from non-physicists. Do you feel like Maldacena’s account gives you a satisfying idea of what gauge symmetry is?

No, Hawking didn’t say that a particle collider could destroy the universe

So apparently Hawking says that the Higgs could destroy the universe.


I’ve covered this already, right? No need to say anything more?

Ok, fine, I’ll write a real blog post.

The Higgs is a scalar field: a number, sort of like temperature, that can vary across space and time. In the case of the Higgs this number determines the mass of almost every fundamental particle (the jury is still somewhat out on neutrinos). The Higgs doesn’t vary much at all; in fact, it takes an enormous (Large Hadron Collider-sized) amount of energy to get it to wobble even a little bit. That is because the Higgs is in a very very stable state.
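To make “determines the mass” concrete: in the Standard Model, a fermion’s mass is its coupling to the Higgs times the field’s background value, m = yv/√2. A sketch with standard textbook numbers (the couplings below are approximate, chosen to land near real particle masses):

```python
import math

v = 246.22  # the Higgs field's background value, in GeV (standard figure)

def fermion_mass(yukawa):
    """Standard Model relation m = y * v / sqrt(2) between a fermion's
    mass and its coupling y to the Higgs field."""
    return yukawa * v / math.sqrt(2)

print(fermion_mass(0.99))    # y near 1 gives ~172 GeV, close to the top quark
print(fermion_mass(2.9e-6))  # a tiny y gives ~0.0005 GeV, close to the electron
```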

Hawking was pointing out that, given our current model of the Higgs, there’s actually another possible state for the Higgs to be in, one that’s even more stable (because it takes less energy, essentially). In that state, the number the Higgs corresponds to is much larger, so everything would be much more massive, with potentially catastrophic results. (Matt Strassler goes into some detail about the assumptions behind this.)

Those who have been following my blog for a while may find these “stable states” familiar. They’re vacua, different possible ways to set up “empty” space. In that post, I may have given the impression that there’s no way to change from one stable state, one “vacuum”, to another. In the case of the Higgs, the state it’s in is so stable that vast amounts of energy (again, a Large Hadron Collider-worth) only serve to create a small, unstable fluctuation, the Higgs boson, which vanishes in a fraction of a second.

And that would be the full story, were it not for a curious phenomenon called quantum tunneling.

If you’ve heard someone else describe quantum tunneling, you’ve probably heard that quantum particles placed on one side of a wall have a very small chance of being found later on the other side of the wall, as if they had tunneled there.

Using their incredibly tiny shovels.

However, quantum tunneling applies to much more than just walls. In general, a particle in an otherwise stable state (whether stable because there are walls keeping it in place, or for other reasons) can tunnel into another state, provided that the new state is “more stable” (has lower energy).

The chance of doing this is small, and it gets smaller the more “stable” the particle’s initial state is. Still, if you apply that logic to the Higgs, you realize there’s a very very very small chance that one day the Higgs could just “tunnel” away from its current stable state, destroying the universe as we know it in the process.
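The Higgs calculation is far more involved, but the exponential suppression works the same way as the textbook estimate for tunneling through a square barrier, where the probability goes like exp(−2κL). A sketch for an electron (the barrier height and widths are made up, purely to show how fast the probability falls off):

```python
import math

hbar = 1.054571817e-34  # reduced Planck constant, in J*s
eV = 1.602176634e-19    # one electron-volt, in joules
m_e = 9.109e-31         # electron mass, in kg

def tunneling_probability(mass, barrier, energy, width):
    """Textbook WKB estimate exp(-2*kappa*L) for tunneling through
    a square barrier of the given height (joules) and width (meters)."""
    kappa = math.sqrt(2 * mass * (barrier - energy)) / hbar
    return math.exp(-2 * kappa * width)

# An electron with 0.5 eV of energy hitting a 1 eV barrier:
p_thin = tunneling_probability(m_e, 1.0 * eV, 0.5 * eV, 1e-10)  # 0.1 nm wide
p_thick = tunneling_probability(m_e, 1.0 * eV, 0.5 * eV, 1e-9)  # 1 nm wide

print(p_thin, p_thick)  # the "more stable" (wider-barrier) case is exponentially rarer
```

Making the barrier ten times wider doesn’t make tunneling ten times rarer, it makes it hundreds of times rarer. For a state as stable as the Higgs vacuum, that exponential is what buys us the absurdly long timescales below.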

If that happened, everything we know would vanish at the speed of light, and we wouldn’t see it coming.

While that may sound scary, it’s also absurdly unlikely, to the extent that it probably won’t happen until the universe is many times older than it is now. It’s not the sort of thing anybody should worry about, at least on a personal level.

Is Hawking fear-mongering, then, by pointing this out? Hardly. He’s just explaining science. Pointing out the possibility that the Higgs could spontaneously change and end the universe is a great way to emphasize the sheer scale of physics, and it’s pretty common for science communicators to mention it. I seem to recall a section about it in Particle Fever, and Sean Carroll even argues that it’s a good thing, due to killing off spooky Boltzmann Brains.

What do particle colliders have to do with all this? Well, apart from quantum tunneling, just inputting enough energy in the right way can cause a transition from one stable state to another. Here “enough energy” means about a million times that produced by the Large Hadron Collider. As Hawking jokes, you’d need a particle collider the size of the Earth to get this effect. I don’t know whether he actually ran the numbers, but if anything I’d guess that a Large Earth Collider would still be insufficient.

Either way, Hawking is just doing standard science popularization, which isn’t exactly newsworthy. Once again, “interpret something Hawking said in the most ridiculous way possible” seems to be the du jour replacement for good science writing.

What’s an Amplitude? Just about everything.

I am an Amplitudeologist. In other words, I study scattering amplitudes. I’ve explained bits and pieces of what scattering amplitudes are in other posts, but I ought to give a short definition here so everyone’s on the same page:

A scattering amplitude is the formula used to calculate the probability that some collection of particles will “scatter”, emerging as some (possibly different) collection of particles.

Note that I’m using some weasel words here. The scattering amplitude is not a probability itself, but “the formula used to calculate the probability”. For those familiar with the mathematics of waves, the scattering amplitude gives the amplitude of a “probability wave” that must be squared to get the probability. (Those familiar with waves might also ask: “If this is the amplitude, what about the period?” The truth is that because scattering amplitudes are calculated using complex numbers, what we call the “amplitude” also contains information about the wave’s “period”. It may seem like an inconsistent way to name things from the perspective of a beginning student, but it is actually consistent with the terminology in a large chunk of physics.)
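The “squared to get the probability” bookkeeping is easy to show with a complex number standing in for the amplitude (the value below is invented, purely to illustrate):

```python
import cmath

amplitude = 0.3 + 0.4j  # a made-up scattering amplitude

probability = abs(amplitude) ** 2  # "squaring": |0.3 + 0.4i|^2 = 0.09 + 0.16 = 0.25
phase = cmath.phase(amplitude)     # the extra information the complex number carries

print(probability, phase)
```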

In some of the simplest scattering amplitudes particles literally “scatter”, with two particles “colliding” and emerging traveling in different directions.

A scattering amplitude can also describe a more complicated situation, though. At particle colliders like the Large Hadron Collider, two particles (a pair of protons for the LHC) are accelerated fast enough that when they collide they release a whole slew of new particles. Since it still fits the “some particles go in, some particles go out” template, this is still described by a scattering amplitude.

It goes even further than that, though, because “some particles” could also just be “one particle”. If you’re dealing with something unstable (the particle equivalent of radioactive, essentially) then one particle can decay into two or more particles. There’s a whole slew of questions that require that sort of calculation. For example, if unstable particles were produced in the early universe, how many of them would be left around today? If dark matter is unstable (and some possible candidates are), when it decays it might release particles we could detect. In general, this sort of scattering amplitude is often of interest to astrophysicists when they happen to get involved in particle physics.

You can even use scattering amplitudes to describe situations that, at first glance, don’t sound like collisions of particles at all. If you want to find the effect of a magnetic field on an electron to high accuracy, the calculation also involves a scattering amplitude. A magnetic field can be thought of in terms of photons, particles of light, because light is a vibration in the electromagnetic field. This means that the effect of a magnetic field on an electron can be calculated by “scattering” an electron and a photon.

[Figure: Feynman diagram of an electron scattering off a photon]

If this looks familiar, check the handbook section.

In fact, doing the calculation in this way leads to what is possibly the most accurately predicted number in all of science.
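For the curious, the number in question is the electron’s anomalous magnetic moment. Even the first term in its perturbative expansion, computed by Schwinger in 1948 from exactly this kind of electron-photon “scattering”, gives a sense of the precision involved:

```latex
% Anomalous magnetic moment of the electron, a_e = (g - 2)/2,
% with \alpha the fine-structure constant (~ 1/137):
a_e = \frac{\alpha}{2\pi} + O(\alpha^2) \approx 0.00116
% Including higher orders in \alpha, theory and experiment agree
% to roughly ten significant figures.
```

Each higher order in α corresponds to scattering amplitudes with more and more photons involved, which is why these calculations get so difficult so quickly.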

Scattering amplitudes show up all over the place, from particle physics at the Large Hadron Collider to astrophysics to delicate experiments on electrons in magnetic fields. That said, there are plenty of things people calculate in theoretical physics that don’t use scattering amplitudes, either because they involve questions that are difficult to answer from the scattering amplitude point of view, or because they invoke different formulas altogether. Still, scattering amplitudes are central to the work of a large number of physicists, and they come remarkably close to covering just about everything.

Am I a String Theorist?

Perimeter, like most institutes of theoretical physics, divides its researchers into semi-informal groups. At Perimeter, these are:

  • Condensed Matter
  • Cosmology
  • Mathematical Physics
  • Particle Physics
  • Quantum Fields and Strings
  • Quantum Foundations
  • Quantum Gravity
  • Quantum Information
  • Strong Gravity

I’m in the Quantum Fields and Strings group, which many people seem to refer to simply as the String Theory group. So for the past week or so, I’ve been introducing myself as a String Theorist. As I briefly mention in my Who Am I? post, this isn’t completely accurate.

The theories that I study do derive from string theory. They were first framed by string theorists, and research into them is still deeply intertwined with string theory research. I’ve definitely had occasion to compare my results to those of string theorists, or to bring in calculations by string theorists to advance my work.

And if you’re the kind of person who views the world as a competition between string theory and its rivals (like Loop Quantum Gravity) then I suppose I’m on the string theory “side”. I’m optimistic, at least, that the reason string theory research is so much more common than any other approach to quantum gravity is simply that string theory provides many more interesting and viable projects for researchers.

On the other hand, though, there’s the basic fact that the theories I work with are not, themselves, string theories. They’re quantum field theories, the broader class that encompasses the modern synthesis of quantum mechanics and special relativity. The theories I work with are often reasonably close to the well-tested theories of the real world, close enough that the calculations are more “particle physics” than they are “string theory”.

Of course, all of that could change. One of the great things about string theory is the way it connects lots of different interesting quantum field theories together. There’s a “string”, the “GKP string”, involved in the work of Basso, Sever, and Vieira, work that I will probably get involved with here at Perimeter. The (2,0) theory is a quantum field theory, but it’s much closer to string theory than to particle physics, so if I get more involved with the (2,0) theory, would that make me a string theorist?

The fact is, these days string theory is so ubiquitous that the question “Am I a String Theorist?” doesn’t actually mean anything. String theory is there, lurking in the background, able to get involved at any time even if it’s not directly involved at present. Theoretical physicists don’t fall into neat categories.

I am a String Theorist. Also, I am not.