Tag Archives: particle physics

Things You Don’t Know about the Power of the Dark Side

Last Wednesday, Katherine Freese gave a Public Lecture at Perimeter on the topic of Dark Matter and Dark Energy. The talk should be on Perimeter’s YouTube page by the time this post is up.

Answering twitter questions during the talk made me realize that there’s a lot the average person finds confusing about Dark Matter and Dark Energy. Freese addressed much of this pretty well in her talk, but I felt like there was room for improvement. Rather than try to tackle it myself, I decided to interview an expert on the Dark Side of the universe.


Twitter doesn’t know the power of the dark side!

Lord Vader, some people have a hard time distinguishing Dark Matter and Dark Energy. What do you have to say to them?

Fools! Light side astronomers call “dark” that which they cannot observe and cannot understand. “Fear” and “anger” are different heights of emotion, but to the Jedi they are only the path to the Dark Side. Dark Energy and Dark Matter are much the same: both distinct, both essential to the universe, and both “dark” to the telescopes of the light.

Let’s start with Dark Matter. Is it really matter?

You ask an empty question. “Matter” has been defined in many ways. When we on the Dark Side refer to Dark Matter, we merely mean to state that it behaves much like the matter you know: it is drawn to and fro by gravity, sloshing about.

It is distinct from your ordinary matter in that two of the forces of nature, the strong nuclear force and electromagnetism, do not concern it. Ordinary matter is bound together in the nuclei of atoms by the strong force, or woven into atoms and molecules by electromagnetism. This makes it subject to all manner of messy collisions.

Dark Matter, in contrast, is pure, partaking neither of nuclear nor chemical reactions. It passes through each of us with no notice. Only the weak nuclear force and gravity affect it. The latter has brought it slowly into clumps and threads through the universe, each one a vast nest for groupings of stars. Truly, Dark Matter surrounds us, penetrates us, and binds the galaxy together.

Could Dark Matter be something we’re more familiar with, like neutrinos or black holes? What about a modification of gravity?

Many wondered as much, when the study of the Dark Side was young. They were wrong.

The matter you are accustomed to composes merely a twentieth of the universe, while Dark Matter is more than a quarter. There is simply not enough of these minor contributions, neutrinos and black holes, to account for the vast darkness that surrounds the galaxy, and with each astronomer’s investigation we grow more assured.

As for modifying gravity, do you seek to modify a fundamental Force?

If so, you should be wary. Forces, by their nature, are accompanied by particles, and gravity is no exception. Take care that your tinkering does not result in a new sort of particle. If it does, you may be unknowingly walking the path of the Dark Side, for your modification may be just another form of Dark Matter.

What sort of things could Dark Matter be? Can Dark Matter decay into ordinary matter? Could there be anti-Dark Matter?

As of yet, your scientists are still baffled by the nature of Dark Matter. Still, there are limits. Since only rare events could produce it from ordinary matter, the universe’s supply of Dark Matter must be ancient, dating back to the dawn of the cosmos. In that case, it must decay only slowly, if at all. Similarly, if Dark Matter had antimatter forms then its interactions must be so weak that it has not simply annihilated with its antimatter half across the universe. So while either is possible, it may be simpler for your theorists if Dark Matter did not decay, and was its own antimatter counterpart. On the other hand, if Dark Matter did undergo such reactions, your kind may one day be able to detect it.

Of course, as a master of the Dark Side I know the true nature of Dark Matter. However, I could only impart it to a loyal apprentice…

Yeah, I think I’ll pass on that. They say you can only get a job in academia when someone dies, but unlike the Sith they don’t mean it literally.

Let’s move on to Dark Energy. What can you tell us about it?

Dark “Energy”, like Dark Matter, is named for what people on your Earth cannot comprehend. Nothing, not even Dark Energy, is “made of energy”. Dark Energy is “energy” merely because it behaves unlike matter.

Matter, even Dark Matter, is drawn together by the force of gravity. Under its yoke, the universe would slow down in its expansion and eventually collapse into a crunch, like the throat of an incompetent officer.

However, the universe is not collapsing, but accelerating, galaxies torn away from each other by a force that must compose more than two thirds of the universe. It is rather like the Yuuzhan Vong, a mysterious force from outside the galaxy that scouts persistently under- or over-estimate.

Umm, I’m pretty sure the Yuuzhan Vong don’t exist anymore, since Disney got rid of the Expanded Universe.

That perfidious Mouse!

Well folks, Vader is now on a rampage of revenge in the Disney offices, so I guess we’ll have to end the interview. Tune in next week, and until then, may the Force be with you!

The Higgs Solution

My grandfather is a molecular biologist. Over the holidays I had many opportunities to chat with him, and our conversations often revolved around explaining some aspect of our respective fields. While talking to him, I came up with a chemistry-themed description of the Higgs field, and how it leads to electro-weak symmetry breaking. Very few of you are likely to be chemists, but I think you still might find the metaphor worthwhile.

Picture the Higgs as a mixture of ions, dissolved in water.

In this metaphor, the Higgs field is a sort of “Higgs solution”. Overall, this solution should be uniform: if you have more ions of a certain type in one place than another, over time they will dissolve until they reach a uniform mixture again. In this metaphor, the Higgs particle detected by the LHC is like a brief disturbance in the fluid: by stirring the solution at high energy, we’ve managed to briefly get more of one type of ion in one place than the average concentration.

What determines the average concentration, though?

Essentially, it’s arbitrary. If this were really a chemistry experiment, it would depend on the initial conditions: which ions we put into the mixture in the first place. In physics, quantum mechanics plays a role, randomly selecting one option out of the many possibilities.

 


Choose wisely

(Note that this metaphor doesn’t explain why there has to be a solution, why the water can’t just be “pure”. A setup that required this would probably be chemically complicated enough to confuse nearly everybody, so I’m leaving that feature out. Just trust that “no ions” isn’t one of our options.)

Up till now, the choice of mixture didn’t matter very much. But different ions interact with other chemicals in different ways, and this has some interesting implications.

Suppose we have a tube filled with our Higgs solution. We want to shoot some substance through the tube, and collect it on the other side. This other substance is going to represent a force.

If our force substance doesn’t react with the ions in our Higgs solution, it will just go through to the other side. If it does react, though, then it will be slowed down, and only some of it will get to the other side, possibly none at all.

You can think of the electro-weak force as a mixture of these sorts of substances. Normally, there is no way to tell the different substances apart. Just like the different Higgs solutions, different parts of the electro-weak force are arbitrary.

However, once we’ve chosen a Higgs solution, things change. Now, different parts of our electro-weak substance will behave differently. The parts that react with the ions in our Higgs solution will slow down, and won’t make it through the tube, while the parts that don’t interact will just flow on through.

We call the part that gets through the tube electromagnetism, and the part that doesn’t the weak nuclear force. Electromagnetism is long-range: its waves (light) can travel great distances. The weak nuclear force is short-range, and doesn’t have an effect outside the scale of atoms.

The important thing to take away from this is that the division between electromagnetism and the weak nuclear force is totally arbitrary. Taken by themselves, they’re equivalent parts of the same, electro-weak force. It’s only because some of them interact with the Higgs, while others don’t, that we distinguish those parts from each other. If the Higgs solution were a different mixture (if the Higgs field had different charges) then a different part of the electroweak force would be long-range, and a different part would be short-range.

We wouldn’t be able to tell the difference, though. We’d see a long-range force, and a short-range force, and a Higgs field. In the end, our world would be completely the same, just based on a different, arbitrary choice.
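For readers who want to see the “arbitrary split” in action, here’s a quick numerical toy. This is my own sketch, not part of the original metaphor: it’s the standard textbook mass matrix for the neutral electroweak fields B and W3, with rough values for the couplings g, g′ and the Higgs field’s “concentration” (its vacuum expectation value v). Diagonalize it, and one combination comes out exactly massless (the photon) while the other gets heavy (the Z):

```python
import numpy as np

# Toy mass matrix for the neutral electroweak fields (B, W3) once the
# Higgs "solution" has settled on a value. g, gp (g') and v are rough
# Standard Model numbers, used purely for illustration (v in GeV).
g, gp, v = 0.65, 0.36, 246.0

M2 = (v**2 / 4) * np.array([[gp**2, -g * gp],
                            [-g * gp, g**2]])

# Eigenvalues are the squared masses of the combinations that actually
# propagate: one is exactly zero (photon), one is heavy (Z boson).
masses2, combos = np.linalg.eigh(M2)
print(np.sqrt(np.abs(masses2)))  # ≈ [0, 91] GeV
```

Change the Higgs charges (the structure of the matrix) and a *different* combination ends up massless, which is exactly the arbitrariness described above.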

Is Everything Really Astonishingly Simple?

Neil Turok gave a talk last week, entitled The Astonishing Simplicity of Everything. In it, he argued that our current understanding of physics is really quite astonishingly simple, and that recent discoveries seem to be confirming this simplicity.

For the right sort of person, this can be a very uplifting message. The audience was spellbound. But a few of my friends were pretty thoroughly annoyed, so I thought I’d dedicate a post to explaining why.

Neil’s talk built up to showing this graphic, one of the masterpieces of Perimeter’s publications department:

Looked at in this way, the laws of physics look astonishingly simple. One equation, a few terms, each handily labeled with a famous name of some (occasionally a little hazy) relevance to the symbol in question.

In a sense, the world really is that simple. There are only a few kinds of laws that govern the universe, and the concepts behind them are really, deep down, very simple concepts. Neil adroitly explained some of the concepts behind quantum mechanics in his talk (here represented by the Schrodinger, Feynman, and Planck parts of the equation), and I have a certain fondness for the Maxwell-Yang-Mills part. The other parts represent different kinds of particles, and different ways they can interact.

While there are only a few different kinds of laws, though, that doesn’t mean the existing laws are simple. That nice, elegant equation hides 25 arbitrary parameters, hidden in the Maxwell-Yang-Mills, Dirac, Kobayashi-Masakawa, and Higgs parts. It also omits the cosmological constant, which fuels the expansion of the universe. And there are problems if you try to claim that the gravity part, for example, is complete.

When Neil mentions recent discoveries, he’s referring to the LHC not seeing new supersymmetric particles, to telescopes not seeing any unusual features in the cosmic microwave background. The theories that were being tested, supersymmetry and inflation, are in many ways more complicated than the Standard Model, adding new parameters without getting rid of old ones. But I think it’s a mistake to say that if these theories are ruled out, the world is astonishingly simple. These theories are attempts to explain unlikely features of the old parameters, or unlikely features of the universe we observe. Without them, we’ve still got those unlikely, awkward, complicated bits.

Of course, Neil doesn’t think the Standard Model is all there is either, and while he’s not a fan of inflation, he does have proposals he’s worked on that explain the same observations, proposals that are also beyond the current picture. More broadly, he’s not suggesting here that the universe is just what we’ve figured out so far and no more. Rather, he’s suggesting that new proposals ought to build on the astonishing simplicity of the universe, instead of adding complexity, that we need to go back to the conceptual drawing board rather than correcting the universe with more gears and wheels.

On the one hand, that’s Perimeter’s mission statement in a nutshell. Perimeter’s independent nature means that folks here can focus on deeper conceptual modifications to the laws of physics, rather than playing with the sorts of gears and wheels that people already know how to work with.

On the other hand, a lack of new evidence doesn’t do anyone any favors. It doesn’t show the way for supersymmetry, but it doesn’t point to any of the “deep conceptual” approaches either. And so for some people, Neil’s glee at the lack of new evidence feels less like admiration for the simplicity of the cosmos and more like that one guy in a group project who sits back chuckling while everyone else fails. You can perhaps understand why some people felt resentful.

Hooray for Neutrinos!

Congratulations to Takaaki Kajita and Arthur McDonald, winners of this year’s Nobel Prize in Physics, as well as to the Super-Kamiokande and SNO teams that made their work possible.

Congratulations!

Unlike last year’s Nobel, this is one I’ve been anticipating for quite some time. Kajita and McDonald discovered that neutrinos have mass, and that discovery remains our best hint that there is something out there beyond the Standard Model.

But I’m getting a bit ahead of myself.

Neutrinos are the lightest of the fundamental particles, and for a long time they were thought to be completely massless. Their name means “little neutral one”, and it’s probably the last time physicists used “-ino” to mean “little”. Neutrinos are “neutral” because they have no electrical charge. They also don’t interact with the strong nuclear force. Only the weak nuclear force has any effect on them. (Well, gravity does too, but very weakly.)

This makes it very difficult to detect neutrinos: you have to catch them interacting via the weak force, which is, well, weak. Originally, that meant they had to be inferred by their absence: missing energy in nuclear reactions carried away by “something”. Now, they can be detected, but it requires massive tanks of fluid, carefully watched for the telltale light of the rare interactions between neutrinos and ordinary matter. You wouldn’t notice if billions of neutrinos passed through you every second, like an unstoppable army of ghosts. And in fact, that’s exactly what happens!

Visualization of neutrinos from a popular documentary

In the 60’s, scientists began to use these giant tanks of fluid to detect neutrinos coming from the sun. An enormous amount of effort goes into understanding the sun, and these days our models of it are pretty accurate, so it came as quite a shock when researchers observed only half the neutrinos they expected. It wasn’t until the work of Super-Kamiokande in 1998, and SNO in 2001, that we knew the reason why.

As it turns out, neutrinos oscillate. Neutrinos are produced in what are called flavor states, which match up with the different types of leptons. There are electron-neutrinos, muon-neutrinos, and tau-neutrinos.

Radioactive processes usually produce electron-neutrinos, so those are the type that the sun produces. But on their way from the sun to the earth, these neutrinos “oscillate”: they switch between electron-neutrinos and the other types! The older detectors, focused only on electron-neutrinos, couldn’t see this. SNO’s big advantage was that it could detect the other types of neutrinos as well, and tell the difference between them, which allowed it to see that the “missing” neutrinos were really just turning into other flavors! Meanwhile, Super-Kamiokande measured neutrinos coming not from the sun, but from cosmic rays reacting with the upper atmosphere. Some of these neutrinos came from the sky above the detector, while others traveled all the way through the earth below it, from the atmosphere on the other side. By observing “missing” neutrinos coming from below but not from above, Super-Kamiokande confirmed that it wasn’t the sun’s fault we were missing solar neutrinos: neutrinos just oscillate!

What does this oscillation have to do with neutrinos having mass, though?

Here things get a bit trickier. I’ve laid some of the groundwork in older posts. I’ve told you to think about mass as “energy we haven’t met yet”, the energy something has when we leave it entirely to itself. I’ve also mentioned that conservation laws come from symmetries of nature, that energy conservation is a result of symmetry in time.

This should make it a little more plausible when I say that when something has a specific mass, it doesn’t change. It can decay into other particles, or interact with other forces, but left alone, by itself, it won’t turn into something else. To be more specific, it doesn’t oscillate. A state with a fixed mass is symmetric in time.

The only way neutrinos can oscillate between flavor states, then, is if one flavor state is actually a combination (in quantum terms, a superposition) of different masses. The components with different masses move at different speeds, so at any point along their path you can be more or less likely to see certain masses of neutrinos. As the mix of masses changes, the flavor state changes, so neutrinos end up oscillating from electron-neutrino, to muon-neutrino, to tau-neutrino.
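The standard two-flavor version of this is compact enough to sketch in a few lines. To be clear, this is textbook physics rather than anything derived in this post, and the Δm² and mixing-angle values below are rough atmospheric-neutrino numbers chosen for illustration:

```python
import numpy as np

# Two-flavor oscillation probability, in the usual experimentalist's units:
# P = sin^2(2θ) · sin^2(1.267 · Δm²[eV²] · L[km] / E[GeV])
def oscillation_probability(L_km, E_GeV, delta_m2_eV2=2.5e-3, theta=np.pi / 4):
    return np.sin(2 * theta) ** 2 * np.sin(1.267 * delta_m2_eV2 * L_km / E_GeV) ** 2

# Atmospheric neutrinos from directly overhead travel ~15 km to the detector;
# those coming up from below travel ~13,000 km through the Earth.
# Average over a band of energies, as a real detector effectively does:
energies = np.linspace(0.5, 5.0, 1000)
print(oscillation_probability(15.0, energies).mean())     # tiny: too little distance
print(oscillation_probability(13000.0, energies).mean())  # large: plenty of room to oscillate
```

The asymmetry between the two distances is exactly the Super-Kamiokande signature described below: neutrinos from above arrive intact, neutrinos from below arrive partly transformed.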

So because of neutrino oscillation, neutrinos have to have mass. But this presented a problem. Most fundamental particles get their mass from interacting with the Higgs field. But, as it turns out, neutrinos can’t interact with the Higgs field. This has to do with the fact that neutrinos are “chiral”, and only come in a “left-handed” orientation. Only if they had both types of “handedness” could they get their mass from the Higgs.

As-is, they have to get their mass another way, and that way has yet to be definitively shown. Whatever it ends up being, it will be beyond the current Standard Model. Maybe there actually are right-handed neutrinos, but they’re too massive, or interact too weakly, for them to have been discovered. Maybe neutrinos are Majorana particles, getting mass in a novel way that hasn’t been seen yet in the Standard Model.

Whatever we discover, neutrinos are currently our best evidence that something lies beyond the Standard Model. Naturalness may have philosophical problems, dark matter may be explained away by modified gravity…but if neutrinos have mass, there’s something we still have yet to discover. And that definitely seems worthy of a Nobel to me!

Pentaquarks!

Earlier this week, the LHCb experiment at the Large Hadron Collider announced that, after painstakingly analyzing the data from earlier runs, they have decisive evidence of a previously unobserved particle: the pentaquark.

What’s a pentaquark? In simple terms, it’s five quarks stuck together. Stick two up quarks and a down quark together, and you get a proton. Stick a quark and an antiquark together, you get a meson of some sort. Five, you get a pentaquark.

(In this case, if you’re curious: two up quarks, one down quark, one charm quark and one anti-charm quark.)

Artist’s Conception

Crucially, this means pentaquarks are not fundamental particles. Fundamental particles aren’t like species, but composite particles like pentaquarks are: they’re examples of a dizzying variety of combinations of an already-known set of basic building blocks.

So why is this discovery exciting? If we already knew that quarks existed, and we already knew the forces between them, shouldn’t we already know all about pentaquarks?

Well, not really. People definitely expected pentaquarks to exist: they were predicted fifty years ago. But their exact properties, or how likely they were to show up? Largely unknown.

Quantum field theory is hard, and this is especially true of QCD, the theory of quarks and gluons. We know the basic rules, but calculating their large-scale consequences, which composite particles we’re going to detect and which we won’t, is still largely out of our reach. We have to supplement first-principles calculations with experimental data, to take bits and pieces and approximations until we get something reasonably sensible.

This is an important point in general, not just for pentaquarks. Often, people get very excited about the idea of a “theory of everything”. At best, such a theory would tell us the fundamental rules that govern the universe. The thing is, we already know many of these rules, even if we don’t yet know all of them. What we can’t do, in general, is predict their full consequences. Most of physics, most of science in general, is about investigating these consequences, coming up with models for things we can’t dream of calculating from first principles, and it really does start as early as “what composite particles can you make out of quarks?”

Pentaquarks have been a long time coming, long enough that someone occasionally proposed a model that explained that they didn’t exist. There are still other exotic states of quarks and gluons out there, like glueballs, that have been predicted but not yet observed. It’s going to take time, effort, and data before we fully understand composite particles, even though we know the rules of QCD.

What’s the Matter with Dark Matter, Matt?

It’s very rare that I disagree with Matt Strassler. That said, I can’t help but think that, when he criticizes the press for focusing their LHC stories on dark matter, he’s missing an important element.

From his perspective, when the media says that the goal of the new run of the LHC is to detect dark matter, they’re just being lazy. People have heard of dark matter. They might have read that it makes up 23% of the universe, more than regular matter at 4%. So when an LHC physicist wants to explain what they’re working on to a journalist, the easiest way is to talk about dark matter. And when the journalist wants to explain the LHC to the public, they do the same thing.

This explanation makes sense, but it’s a little glib. What Matt Strassler is missing is that, from the public’s perspective, dark matter really is a central part of the LHC’s justification.

Now, I’m not saying that the LHC’s main goal is to detect dark matter! Directly detecting dark matter is pretty low on the LHC’s list of priorities. Even if it detects a new particle with the right properties to be dark matter, it still wouldn’t be able to confirm that it really is dark matter without help from another experiment that actually observes some consequence of the new particle among the stars. I agree with Matt when he writes that the LHC’s priorities for the next run are

  1. studying the newly discovered Higgs particle in great detail, checking its properties very carefully against the predictions of the “Standard Model” (the equations that describe the known apparently-elementary particles and forces)  to see whether our current understanding of the Higgs field is complete and correct, and

  2. trying to find particles or other phenomena that might resolve the naturalness puzzle of the Standard Model, a puzzle which makes many particle physicists suspicious that we are missing an important part of the story, and

  3. seeking either dark matter particles or particles that may be shown someday to be “associated” with dark matter.

Here’s the thing, though:

From the public’s perspective, why do we need to study the properties of the Higgs? Because we think it might be different than the Standard Model predicts.

Why do we think it might be different than the Standard Model predicts? More generally, why do we expect the world to be different from the Standard Model at all? Well there are a few reasons, but they generally boil down to two things: the naturalness puzzle, and the fact that the Standard Model doesn’t have anything that could account for dark matter.

Naturalness is a powerful motivation, but it’s hard to sell to the general public. Does the universe appear fine-tuned? Then maybe it just is fine-tuned! Maybe someone fine-tuned it!

These arguments miss the real problem with fine-tuning, but they’re hard to correct in a short article. Getting the public worried about naturalness is tough, tough enough that I don’t think we can demand it of the average journalist, or accuse them of being lazy if they fail to do it.

That leaves dark matter. And for all that naturalness is philosophically murky, dark matter is remarkably clear. We don’t know what 96% of the universe is made of! That’s huge, and not just in a “gee-whiz-cool” way. It shows, directly and intuitively, that physics still has something it needs to solve, that we still have particles to find. Unless you are a fan of (increasingly dubious) modifications to gravity like MOND, dark matter is the strongest possible justification for machines like the LHC.
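Tallying the round numbers quoted in this post (4% ordinary matter, 23% dark matter, the rest dark energy) makes the point concrete:

```python
# Round-number cosmic energy budget, as quoted in this post (in percent).
ordinary_matter = 4
dark_matter = 23
dark_energy = 100 - ordinary_matter - dark_matter  # everything else

dark_total = dark_matter + dark_energy
print(dark_energy)  # 73
print(dark_total)   # 96: the share of the universe we can't yet identify
```

(Precision cosmology measurements shift these percentages around a little, but the headline number stays the same: the overwhelming majority of the universe is “dark”.)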

The LHC won’t confirm dark matter on its own. It might not directly detect it, that’s still quite up-in-the-air. And even if it finds deviations from the Standard Model, it’s not likely they’ll be directly caused by dark matter, at least not in a simple way.

But the reason that the press is describing the LHC’s mission in terms of dark matter isn’t just laziness. It’s because, from the public’s perspective, dark matter is the only vaguely plausible reason to spend billions of dollars searching for new particles, especially when we’ve already found the Higgs. We’re lucky it’s such a good reason.

Want to Make Something New? Just Turn on the Lights.

Isn’t it weird that you can collide two protons, and get something else?

It wouldn’t be so weird if you collided two protons, and out popped a quark. After all, protons are made of quarks. But how, if you collide two protons together, do you get a tau, or the Higgs boson: things that not only aren’t “part of” protons, but are more massive than a proton by themselves?

It seems weird…but in a way, it’s not. When a particle releases another particle that wasn’t inside it to begin with, it’s actually not doing anything more special than an everyday light bulb.

Eureka!

How does a light bulb work?

You probably know the basics: when an electrical current enters the bulb, the electrons in the filament start to move. They heat the filament up, releasing light.

That probably seems perfectly ordinary. But ask yourself for a moment: where did the light come from?

Light is made up of photons, elementary particles in their own right. When you flip a light switch, where do the photons come from? Were they stored in the light bulb?

Silly question, right? You don’t need to “store” light in a light bulb: light bulbs transform one type of energy (electrical, or the movement of electrons) into another type of energy (light, or photons).

Here’s the thing, though: mass is just another type of energy.

I like to describe mass as “energy we haven’t met yet”. Einstein’s equation, E=mc^2, relates a particle’s mass to its “rest energy”, the energy it would have if it stopped moving around and sat still. Even when a particle seems to be sitting still from the outside, there’s still a lot going on. “Composite” particles like protons have powerful forces between their internal quarks, while particles like electrons interact with the Higgs field. These processes give the particle energy even when it’s not moving, so from our perspective on the outside they’re giving the particle mass.

What does that mean for the protons at the LHC?

The protons at the LHC have a lot of kinetic energy: they’re going 99.9999991% of the speed of light! When they collide, all that energy has to go somewhere. Just like in a light bulb, the fast-moving particles will release their energy in another form. And while some of that energy will add to the speed of the fragments, much of it will go into the mass and energy of new particles. Some of these particles will be photons, some will be tau leptons, or Higgs bosons…pretty much anything that the protons have enough energy to create.
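As a quick sanity check (my arithmetic, not the post’s), that speed really does correspond to the LHC’s design energy of roughly 7 TeV per proton:

```python
import math

# Special relativity: a proton at speed beta·c carries gamma times its
# rest energy, where gamma = 1/sqrt(1 - beta^2). Numbers are approximate.
beta = 0.999999991                 # fraction of the speed of light
gamma = 1 / math.sqrt(1 - beta**2)

proton_rest_energy_GeV = 0.938     # proton rest energy, ~0.938 GeV
total_energy_GeV = gamma * proton_rest_energy_GeV

print(gamma)             # ≈ 7500: the proton's energy is ~7500x its rest energy
print(total_energy_GeV)  # ≈ 7000 GeV, i.e. ~7 TeV per proton
```

With two such protons colliding head-on, that’s about 14 TeV available, which is why a collision can afford particles far heavier than the protons themselves.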

So if you want to understand how to create new particles, you don’t need a deep understanding of the mysteries of quantum field theory. Just turn on the lights.

How to Predict the Mass of the Higgs

Did Homer Simpson predict the mass of the Higgs boson?

No, of course not.

Apart from the usual reasons, he’s off by more than a factor of six.

If you play with the numbers, it looks like Simon Singh (the popular science writer who reported the “discovery” Homer made as a throwaway joke in a 1998 Simpsons episode) made the classic physics mistake of losing track of a factor of 2\pi. In particular, it looks like he mistakenly thought that the Planck constant, h, was equal to the reduced Planck constant, \hbar, divided by 2\pi, when actually it’s \hbar times 2\pi. So while Singh read Homer’s prediction as 123 GeV, surprisingly close to the actual Higgs mass of 125 GeV found in 2012, in fact Homer predicted the somewhat more embarrassing value of 775 GeV.

D’Oh!
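For the curious, the net effect of the mix-up described above is a single factor of 2π on the final number. The arithmetic below is mine, just to check the claim; Homer’s actual blackboard formula isn’t reproduced here:

```python
import math

# Singh's reading of Homer's formula gave 123 GeV. The h-versus-hbar
# mix-up described above shifts the final number by a factor of 2*pi.
singh_reading_GeV = 123.0
corrected_GeV = singh_reading_GeV * 2 * math.pi

print(round(corrected_GeV))  # 773, consistent with the ~775 GeV quoted above
```

(The small gap between 773 and 775 is just rounding in the 123 GeV starting value.)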

That was boring. Let’s ask a more interesting question.

Did Gordon Kane predict the mass of the Higgs boson?

I’ve talked before about how it seems impossible that string theory will ever make any testable predictions. The issue boils down to one of too many possibilities: string theory predicts different consequences for different ways that its six (or seven for M theory) extra dimensions can be curled up. Since there is an absurdly vast number of ways this can be done, anything you might want to predict (say, the mass of the electron) has an absurd number of possible values.

Gordon Kane and collaborators get around this problem by tackling a different one. Instead of trying to use string theory to predict things we already know, like the mass of the electron, they assume these things are already true. That is, they assume we live in a world with electrons that have the mass they really have, and quarks that have the mass they really have, and so on. They assume that we live in a world that obeys all of the discoveries we’ve already made, and a few we hope to make. And, they assume that this world is a consequence of string (or rather M) theory.

From that combination of assumptions, they then figure out the consequences for things that aren’t yet known. And in a 2011 paper, they predicted the Higgs mass would be between 105 and 129 GeV.

I have a lot of sympathy for this approach, because it’s essentially the same thing that non-string-theorists do. When a particle physicist wants to predict what will come out of the LHC, they don’t try to get it from first principles: they assume the world works as we have discovered, make a few mild extra assumptions, and see what new consequences come out that we haven’t observed yet. If those particle physicists can be said to make predictions from supersymmetry, or (shudder) technicolor, then Gordon Kane is certainly making predictions from string theory.

So why haven’t you heard of him? Even if you have, why, if this guy successfully predicted the mass of the Higgs boson, are people still saying that you can’t make predictions with string theory?

Trouble is, making predictions is tricky.

Part of the problem is timing. Gordon Kane’s paper went online in December of 2011. The Higgs mass was announced in July 2012, so you might think Kane got a six month head-start. But when something is announced isn’t the same as when it’s discovered. For a big experiment like the Large Hadron Collider, there’s a long road between the first time something gets noticed and the point where everyone is certain enough that they’re ready to announce it to the world. Rumors fly, and it’s not clear that Kane and his co-authors wouldn’t have heard them.

Assumptions are the other issue. Remember when I said, a couple paragraphs up, that Kane’s group assumed “that we live in a world that obeys all of the discoveries we’ve already made, and a few we hope to make”? That last part is what makes things tricky. There were a few extra assumptions Kane made, beyond those needed to reproduce the world we know. For many people, some of these extra assumptions are suspicious. They worry that the assumptions might have been chosen, not just because they made sense, but because they happened to give the right (rumored) mass of the Higgs.

If you want to predict something in physics, it’s not just a matter of getting in ahead of the announcement with the right number. For a clear prediction, you need to be early enough that the experiments haven’t yet even seen hints of what you’re looking for. Even then, you need your theory to be suitably generic, so that it’s clear that your prediction is really the result of the math and not of your choices. You can trade off aspects of this: more accuracy for a less generic theory, better timing for looser predictions. Get the formula right, and the world will laud you for your prediction. Wrong, and you’re Homer Simpson. Somewhere in between, though, and you end up in that tricky, tricky grey area.

Like Gordon Kane.

Living in a Broken World: Supersymmetry We Can Test

I’ve talked before about supersymmetry. Supersymmetry relates particles with different spins, linking spin 1 force-carrying particles like photons and gluons to spin 1/2 particles similar to electrons, and spin 1/2 particles in turn to spin 0 “scalar” particles, the same general type as the Higgs. I emphasized there that, if two particles are related by supersymmetry, they will have some important traits in common: the same mass and the same interactions.
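As a toy illustration of that spin ladder (not physics code, just bookkeeping): each particle’s superpartner sits exactly half a unit of spin away. The partner names below are the conventional ones from the literature, not something introduced in this post.

```python
# Toy illustration: supersymmetry pairs particles whose spins differ by 1/2.
# Partner names ("photino", "selectron", etc.) are the conventional ones.
from fractions import Fraction

partners = {
    # (particle, spin): (superpartner, spin)
    ("photon", Fraction(1)):      ("photino", Fraction(1, 2)),
    ("gluon", Fraction(1)):       ("gluino", Fraction(1, 2)),
    ("electron", Fraction(1, 2)): ("selectron", Fraction(0)),
    ("Higgs", Fraction(0)):       ("Higgsino", Fraction(1, 2)),
}

for (p, s), (sp, ss) in partners.items():
    # The defining pattern: partner spins always differ by exactly 1/2.
    assert abs(s - ss) == Fraction(1, 2)
    print(f"{p} (spin {s}) <-> {sp} (spin {ss})")
```

In unbroken supersymmetry, each pair would also share a mass and interactions, which is exactly what the next paragraphs explain we don’t see.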

That’s true for the theories I like to work with. In particular, it’s true for N=4 super Yang-Mills. Adding supersymmetry allows us to tinker with neater, cleaner theories, gaining mastery over rice before we start experimenting with the more intricate “sushi” of theories of the real world.

However, it should be pretty clear that we don’t live in a world with this sort of supersymmetry. A quick look at the Standard Model indicates that no two known particles interact in precisely the same way. When people try to test supersymmetry in the real world, they’re not looking for this sort of thing. Rather, they’re looking for broken supersymmetry.

In the past, I’ve described broken supersymmetry as like a broken mirror: the two sides are no longer the same, but you can still predict one side’s behavior from the other. When supersymmetry is broken, related particles still have the same interactions. Now, though, they can have different masses.

The simplest version of supersymmetry, N=1, gives one partner to each particle. Since no two known Standard Model particles can be each other’s partners, broken N=1 supersymmetry in the real world requires a new particle for each existing one…and each of those new particles has a potentially unknown, different mass. And if that sounds rather complicated…

Baroque enough to make Rubens happy.

That, right there, is the Minimal Supersymmetric Standard Model, the simplest thing you can propose if you want a world with broken supersymmetry. If you look carefully, you’ll notice that it’s actually a bit more complicated than just one partner for each known particle: there are a few extra Higgs fields as well!

If we’re hoping to explain anything in a simpler way, we seem to have royally screwed up. Luckily, though, the situation is not quite as ridiculous as it appears. Let’s go back to the mirror analogy.

If you look into a broken mirror, you can still have a pretty good idea of what you’ll see…but in order to do so, you have to know how the mirror is broken.

Similarly, supersymmetry can be broken in different ways, by different supersymmetry-breaking mechanisms.

The general idea is to start with a theory in which supersymmetry is precisely true, and all supersymmetric partners have the same mass. Then, consider some Higgs-like field. Like the Higgs, it can take some constant value throughout all of space, forming a background like the color of a piece of construction paper. While the rules that govern this field would respect supersymmetry, any specific value it takes wouldn’t. Instead, it would be biased: the spin 0, Higgs-like field could take on a constant value, but its spin 1/2 supersymmetric partner couldn’t. (If you want to know why, read my post on the Higgs linked above.)

Once that field takes on a specific value, supersymmetry is broken. That breaking then has to be communicated to the rest of the theory, via interactions between different particles. There are several different ways this can work: perhaps the interactions come from gravity, or are the same strength as gravity. Maybe instead they come from a new fundamental force, similar to the strong nuclear force but harder to discover. They could even come as byproducts of the breaking of other symmetries.

Each one of these options has different consequences, and leads to different predictions for the masses of undiscovered partner particles. They tend to have different numbers of extra parameters (for example, if gravity-based interactions are involved there are four new parameters, and an extra sign, that must be fixed). None of them have an entire standard model-worth of new parameters…but all of them have at least a few extra.
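For concreteness, in the gravity-mediated case the post alludes to, the “four new parameters, and an extra sign” are conventionally the so-called mSUGRA (or CMSSM) set. These names come from the standard literature, not from the post itself:

```python
# The gravity-mediated ("mSUGRA"/CMSSM) parameter set conventionally quoted
# as "four parameters and a sign" (standard names from the literature):
msugra = {
    "m0":       "common scalar mass at the high scale (GeV)",
    "m_half":   "common gaugino mass at the high scale (GeV)",
    "A0":       "common trilinear coupling (GeV)",
    "tan_beta": "ratio of the two Higgs vacuum expectation values",
}
sign_mu = +1  # the extra sign: the sign of the Higgsino mass parameter mu

assert len(msugra) == 4        # four continuous parameters...
assert sign_mu in (+1, -1)     # ...plus one discrete choice of sign
```

Other breaking mechanisms come with their own, different handfuls of parameters, which is why each leads to different predictions for the partner masses.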

(Brief aside: I’ve been talking about the Minimal Supersymmetric Standard Model, but these days people have largely given up on finding evidence for it, and are exploring even more complicated setups like the Next-to-Minimal Supersymmetric Standard Model.)

If we’re introducing extra parameters without explaining existing ones, what’s the point of supersymmetry?

Last week, I talked about the problem of fine-tuning. I explained that when physicists are worried about fine-tuning, what we’re really worried about is whether the sorts of ultimate (low number of parameters) theories that we expect to hold could give rise to the apparently fine-tuned world we live in. In that post, I was a little misleading about supersymmetry’s role in that problem.

The goal of introducing (broken) supersymmetry is to solve a particular set of fine-tuning problems, mostly one specific one involving the Higgs. This doesn’t mean that supersymmetry is the sort of “ultimate” theory we’re looking for; rather, supersymmetry is one of the few ways we know to bridge the gap between “ultimate” theories and a fine-tuned real world.

To explain it in terms of the language of the last post, it’s hard to find one of these “ultimate” theories that gives rise to a fine-tuned world. What’s quite a bit easier, though, is finding one of these “ultimate” theories that gives rise to a supersymmetric world, which in turn gives rise to a fine-tuned real world.
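To get a rough numerical feel for what “fine-tuned” means here, consider a schematic toy (the numbers below are invented for illustration, not taken from any real Standard Model calculation): the observed quantity comes out as a tiny difference between two enormous contributions, so a minuscule nudge to one of them swings the answer wildly.

```python
# Schematic fine-tuning toy: illustrative numbers, not a real calculation.
# The "physical" quantity is a tiny leftover from two huge, nearly
# cancelling contributions. (Exact integer arithmetic, to avoid
# floating-point rounding at these magnitudes.)
bare = 10**32               # "bare" contribution, arbitrary units
correction = bare - 15_000  # corrections, tuned to almost exactly cancel it

physical = bare - correction
print(physical)             # 15000: the small number we actually observe

# Nudge the bare contribution by just one part in 10^28...
nudged = (bare + bare // 10**28) - correction
print(nudged)               # 25000: the answer jumps by roughly two thirds
```

A theory whose parameters must be dialed in to one part in 10^28 to reproduce what we see is what physicists call fine-tuned, and supersymmetry’s appeal is that it can enforce this kind of near-cancellation automatically rather than by hand.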

In practice, these are the sorts of theories that get tested. Very rarely are people able to propose testable versions of the more “ultimate” theories. Instead, one generally finds intermediate theories, theories that can potentially come from “ultimate” theories, and builds general versions of those that can be tested.

These intermediate theories come in multiple levels. Some physicists look for the most general version, theories like the Minimal Supersymmetric Standard Model with a whole host of new parameters. Others look for more specific versions, choices of supersymmetry-breaking mechanisms. Still others try to tie it further up, getting close to candidate “ultimate” theories like M theory (though in practice they generally make a few choices that put them somewhere in between).

The hope is that with a lot of people covering different angles, we’ll be able to make the best use of any new evidence that comes in. If “something” is out there, there are still a lot of choices for what that something could be, and it’s the job of physicists to try to understand whatever ends up being found.

Not bad for working in a broken world, huh?

The Real Problem with Fine-Tuning

You’ve probably heard it said that the universe is fine-tuned.

The Standard Model, our current best understanding of the rules that govern particle physics, is full of lots of fiddly adjustable parameters. The masses of fundamental particles and the strengths of the fundamental forces aren’t the sort of thing we can predict from first principles: we need to go out, do experiments, and find out what they are. And you’ve probably heard it argued that, if these fiddly parameters were even a little different from what they are, life as we know it could not exist.

That’s fine-tuning…or at least, that’s what many people mean when they talk about fine-tuning. It’s not exactly what physicists mean though. The thing is, almost nobody who studies particle physics thinks the parameters of the Standard Model are the full story. In fact, any theory with adjustable parameters probably isn’t the full story.

It all goes back to a point I made a while back: nature abhors a constant. The whole purpose of physics is to explain the natural world, and we have a long history of taking things that look arbitrary and linking them together, showing that reality has fewer parameters than we had thought. This is something physics is very good at. (To indulge in a little extremely amateurish philosophy, it seems to me that this is simply an inherent part of how we understand the world: if we encounter a parameter, we will eventually come up with an explanation for it.)

Moreover, at this point we have a rough idea of what this sort of explanation should look like. We have experience playing with theories that don’t have any adjustable parameters, or that only have a few: M theory is an example, but there are also more traditional quantum field theories that fill this role with no mention of string theory. From our exploration of these theories, we know that they can serve as the kind of explanation we need: in a world governed by one of these theories, people unaware of the full theory would observe what would look at first glance like a world with many fiddly adjustable parameters, parameters that would eventually turn out to be consequences of the broader theory.

So for a physicist, fine-tuning is not about those fiddly parameters themselves. Rather, it’s about the theory that predicts them. Because we have experience playing with these sorts of theories, we know roughly the sorts of worlds they create. What we know is that, while sometimes they give rise to worlds that appear fine-tuned, they tend to only do so in particular ways. Setups that give rise to fine-tuning have consequences: supersymmetry, for example, can give rise to an apparently fine-tuned universe but has to have “partner” particles that show up in powerful enough colliders. In general, a theory that gives rise to apparent fine-tuning will have some detectable consequences.

That’s where physicists start to get worried. So far, we haven’t seen any of these detectable consequences, and it’s getting to the point where we could have, had they been the sort many people expected.

Physicists are worried about fine-tuning, but not because it makes the universe “unlikely”. They’re worried because the more finely-tuned our universe appears, the harder it is to find an explanation for it in terms of the sorts of theories we’re used to working with, and the less likely it becomes that someone will discover a good explanation any time soon. We’re quite confident that there should be some explanation; hundreds of years of scientific progress strongly suggest that to be the case. But the nature of that explanation is becoming increasingly opaque.