Hawking vs. Witten: A Primer

Have you seen the episode of Star Trek where Data plays poker with Stephen Hawking? How about the times he appeared on Futurama or the Simpsons? Or the absurd number of times he has come up in one way or another on The Big Bang Theory?

Stephen Hawking is probably the most recognizable theoretical physicist to laymen. Wheelchair-bound and speaking through a voice synthesizer, Hawking presents a very distinct image, while his work on black holes and the big bang, along with his popular treatments of science in books like A Brief History of Time, has made him synonymous in the public’s mind with genius.

He is not, however, the most recognizable theoretical physicist when talking to physicists. If Sheldon from The Big Bang Theory were a real string theorist he wouldn’t be obsessed with Hawking. He might, however, be obsessed with Edward Witten.

Edward Witten is tall and has an awkwardly high voice (for a sample, listen to the clip here). He’s also smart, smart enough to dabble in basically every subfield of theoretical physics and manage to make important contributions while doing so. He has a knack for digging up ideas from old papers and dredging out the solution to current questions of interest.

And far more than Hawking, he represents a clear target for parody, at least when that parody is crafted by physicists and mathematicians. Abstruse Goose has a nice take on his role in theoretical physics, while his collaboration with another physicist named Seiberg on what came to be known as Seiberg-Witten theory gave rise to the cyber-Witten pun.

If you would look into the mouth of physics-parody madness, let this link be your guide…

So why hasn’t this guy appeared on Futurama? (After all, his dog does!)

Witten is famous among theorists, but he hasn’t done as much as Hawking to endear himself to the general public. He hasn’t written popular science books, and he almost never gives public talks. So when a well-researched show like The Big Bang Theory wants to mention a famous physicist, they go to Hawking, not Witten, because Hawking is the one people know. And unless Witten starts interfacing more with the public (or blog posts like this become more common), that’s not about to change.

Perimeter and Patronage

I’m visiting the Perimeter Institute this week. For the non-physicists in the audience, Perimeter is a very prestigious theoretical physics institute, established by the founder of BlackBerry. It’s quite swanky. Some first impressions:

  • This occurred to me several times: this place is what the Simons Center wants to be when it grows up.
  • You’d think the building would be impossible to navigate because it was designed by a theoretical physicist, but Freddy Cachazo assured us that he actually had to ask the architect to tone down the more ridiculous flourishes. Looks like the only person crazier than a physicist is an artist.
  • Having table service at an institute café feels very swanky at first, but it’s actually a lot less practical than cafeteria-style dining. I think the Simons Center Café has it right on this one, even if they don’t quite understand the concept of hurricane relief (don’t have a link for that joke, but I can explain if you’re curious).
  • Perimeter has some government money, but much of its funding comes from private companies and foundations, particularly Research in Motion (or RIM, now BlackBerry). Incidentally, I’m told that PeRIMeter is supposed to be a reference to RIM.

What interests me is that you don’t see this sort of thing (private support) very often in other fields. Private donors will fund efforts to address some real-world problem, like autism or income inequality. They rarely fund basic research*. When they do, it’s usually at a particular university. Something like Perimeter, a private institute for basic research, is rather unusual. Perimeter itself describes its motivation as something akin to a long-range strategic investment, but I think this also ties back to the concept of patronage.

Like art, physics has a history of being a fashionable thing for wealthy patrons to support, usually when the research topic is in line with their wider interests. Newton, for example, re-cast his research in terms of its implications for an understanding of the tides to interest the nautically-minded King James II, despite the fact that he couldn’t predict the tides any better than anyone else in his day. Much like supporting art, supporting physics can allow someone’s name to linger on through history, while not running a risk of competing with others’ business interests like research in biology or chemistry might.

A man who liked his sailors

*basic research is a term scientists use for research that isn’t done with a particular application in mind. In terms of theoretical physics, this often means theories that aren’t “true”.

Achieving Transcendence: The Physicist Way

I wanted to shed some light on something I’ve been working on recently, but I realized that a little background was needed to explain some of the ideas. As such, this post is going to be a bit more math-y than usual, but I hope it’s educational!

Pi is special. Familiar to all through the area of a circle \pi r^2, pi is particularly interesting in that you cannot write a polynomial equation with whole-number coefficients whose solution is pi. While you can easily get fractions (3x=4 gives x=\frac{4}{3}) and even many irrational numbers (x^2=2 gives x=\sqrt{2}), pi is one of a set of numbers that are impossible to get this way. These special numbers transcend other numbers, in that you cannot use more everyday numbers to get to them, and as such mathematicians call them transcendental numbers.

In addition to transcendental numbers, you can have transcendental functions. Loosely speaking, transcendental functions are functions that can take in an ordinary number and produce a transcendental number. For example, you may be aware of the delightful equation below:

e^{i \pi}=-1

We can manipulate both sides of this equation by taking the natural logarithm, \ln, to find

i\pi=\ln(-1)

This tells us that the natural logarithm function can take a (negative) whole number (-1) and give us a transcendental number (pi). This means that the natural logarithm is a transcendental function.
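You can check this numerically with Python’s complex-math library (a quick illustration of my own, not part of any physics calculation):

```python
import cmath

# Euler's identity: e^{i*pi} should equal -1 (up to floating-point error)
euler = cmath.exp(1j * cmath.pi)
print(euler)  # roughly (-1+0j)

# Taking the natural logarithm of -1 recovers i*pi:
# a whole number goes in, a transcendental number comes out
log_minus_one = cmath.log(-1)
print(log_minus_one)  # roughly 3.14159...j
```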

There are many other transcendental functions. In addition to logarithms, there are a whole host of related functions called the polylogarithms, and even more generally the harmonic polylogarithms. All of these functions can take in whole numbers like -1 or 1 and give transcendental numbers.

Here physicists introduce a concept called degree of transcendentality, or transcendental weight, which we use to measure how transcendental a number or a function is. Pi (and functions that can give pi, like the natural logarithm) have transcendental weight one. Pi squared has transcendental weight two. Pi cubed (and another number called \zeta(3)) have transcendental weight three. And so on.
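For a concrete example of a weight-two number (my own illustration): \zeta(2), the sum of 1/n^2 over all whole numbers n, equals \pi^2/6, which you can check numerically:

```python
import math

# zeta(2) = 1/1^2 + 1/2^2 + 1/3^2 + ... converges to pi^2/6,
# a number of transcendental weight two
zeta2 = sum(1.0 / n**2 for n in range(1, 200_001))
print(zeta2, math.pi**2 / 6)  # agree to about five decimal places
```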

Note here that, according to mathematicians, there is no rigorous way that a number can be “more transcendental” than another number. In the case of some of these numbers, like \zeta(5), it hasn’t even been proven that the number is actually transcendental at all! However, physicists still use the concept of transcendental weight because it allows us to classify and manipulate a common and useful set of functions. This is an example of the differences in methods and standards between physicists and mathematicians, even when they are working on similar things.

In what way are these functions common and useful? Well, it turns out that in N=4 super Yang-Mills many calculated results are not only made up of these polylogarithms, they have a particular (fixed) transcendental weight. In situations where we expect this to be true, we can use that knowledge to guess most, or even all, of the result without doing direct calculations. That’s immensely useful, and it’s a big part of what I’ve been doing recently.

Model-Hypothesis-Experiment: Sure, Just Not All the Same Person!

At some point, we were all taught how science works.

The scientific method gets described differently in different contexts, but it goes something like this:

First, a scientist proposes a model, a potential explanation for how something out in the world works. They then create a hypothesis, predicting some unobserved behavior that their model implies should exist. Finally, they perform an experiment, testing the hypothesis in the real world. Depending on the results of the experiment, the model is either supported or rejected, and the scientist begins again.

It’s a handy picture. At the very least, it’s a good way to fill time in an introductory science course before teaching the actual science.

But science is a big area. And just as no two sports have the same league setup, no two areas of science use the same method. While the central principles behind the method still hold (the idea that predictions need to be made before experiments are performed, the idea that in order to test a model you need to know something it implies that other models don’t, the idea that the question of whether a model actually describes the real world should be answered by actual experiments…), the way they are applied varies depending on the science in question.

In particular, in high-energy particle physics, we do roughly follow the steps of the method: we propose models, we form hypotheses, and we test them out with experiments. We just don’t expect the same person to do each step!

In high energy physics, models are the domain of Theorists. Occasionally referred to as “pure theorists” to distinguish them from the next category, theorists manipulate theories (some intended to describe the real world, some not). “Manipulate” here can mean anything from modifying the principles of the theory to see what works, to attempting to use the theory to calculate some quantity or another, to proving that the theory has particular properties. There’s quite a lot to do, and most of it can happen without ever interacting with the other areas.

Hypotheses, meanwhile, are the province of Phenomenologists. While theorists often study theories that don’t describe the real world, phenomenologists focus on theories that can be tested. A phenomenologist’s job is to take a theory (either proposed by a theorist or another phenomenologist) and calculate its consequences for experiments. As new data comes in, phenomenologists work to revise their theories, computing just how plausible the old proposals are given the new information. While phenomenologists often work closely with those in the next category, they also do large amounts of work internally, honing calculation techniques and looking through models to find explanations for odd behavior in the data.

That data comes, ultimately, from Experimentalists. Experimentalists run the experiments. With experiments as large as the Large Hadron Collider, they don’t actually build the machines in question. Rather, experimentalists decide how the machines are to be run, then work to analyze the data that emerges. Data from a particle collider or a neutrino detector isn’t neatly labeled by particle. Rather, it involves a vast set of statistics, energies and charges observed in a variety of detectors. An experimentalist takes this data and figures out what particles the detectors actually observed, and from that what sorts of particles were likely produced. Like the other areas, much of this process is self-contained. Rather than being concerned with one theory or another, experimentalists will generally look for general signals that could support a variety of theories (for example, leptoquarks).

If experimentalists don’t build the colliders, who does? That’s actually the job of an entirely different class of scientists, the Accelerator Physicists. Accelerator physicists not only build particle accelerators, they study how to improve them, with research just as self-contained as the other groups.

So yes, we build models, form hypotheses, and construct and perform experiments to test them. And we’ve got very specialized, talented people who focus on each step. That means a lot of internal discussion, and many papers published that only belong to one step or another. For our subfield, it’s the best way we’ve found to get science done.

Sound Bite Management; or the Merits of Shock and Awe

First off, for the small demographic who haven’t seen it already (and aren’t reading this because of it), I wrote an article for Ars Technica. Go read it.

After the article went up, a professor from my department told me that he and several others were concerned about the title.

Now before I go on, I’d like to clarify that this isn’t going to be a story about the department trying to “shut me down” or anything paranoid like that. The professor in question was expressing a valid concern in a friendly way, and it deserves some thought.

The concern was the following: isn’t a title like “Earning a PhD by studying a theory that we know is wrong” bad publicity for the field? Regardless of whether the article rebuts the idea that “wrong” is a meaningful descriptor for this sort of theory, doesn’t a title like that give fuel to the fire, sharpening the cleavers of the field’s detractors, as one commenter put it? In other words, even if it’s a good article, isn’t it a bad sound bite?

It’s worryingly easy for a catchy sound bite to eclipse everything else about a piece. As one commenter pointed out, that’s roughly what happened with Palin’s fruit fly comment. And with that in mind, the claim that people are earning PhDs based on “false” theories definitely sounds like the sort of sound bite that could get out of hand in a hurry if the wrong community picked it up.

There is, at least, one major difference between my sound bite and Palin’s. In the political climate of 2008 it was easy to believe that Sarah Palin didn’t understand the concept of fruit fly research. On the other hand, it’s quite a bit less plausible that Ars would air a piece calling most work in theoretical physics useless.

In operation here is the old, powerful technique of using a shocking, dissonant headline to lure people in. A sufficiently out-of-character statement won’t be taken at face value; rather, it will inspire readers to dig in to the full article to figure out what they’re missing. This is the principle behind provocateurs in many fields, and while there are always risks, often this is the only way to get people to think about complex issues (Peter Singer often seems to exemplify the risks and rewards of this tactic, just to give an example).

What’s the alternative here? In referring to the theory I study as “wrong”, I’m attempting to bring readers face to face with a common misconception: the idea that every theory in physics is designed to approximate some part of the real world. For the physicists in the audience, this is the public perception that everything in theoretical physics is phenomenology. If we don’t bring this perception to light and challenge it, then we’re sweeping a substantial amount of theoretical physics under the rug for the sake of a simpler message. And that’s risky, because if people don’t understand what physics really is then they’re likely to balk when they glimpse what they think is “illegitimate” physics.

In my view, shocking people by describing my type of physics as not “true” is the best way to teach people about what physicists actually do. But it is risky, and it could easily give people the wrong impression. Only time will tell.

What’s A Graviton? Or: How I Learned to Stop Worrying and Love Quantum Gravity

I’m four gravitons and a grad student. And despite this, I haven’t bothered to explain what a graviton is. It’s time to change that.

Let’s start like we often do, with a quick answer that will take some unpacking:

Gravitons are the force-carrying bosons of gravity.

I mentioned force-carrying bosons briefly here. Basically, a force can either be thought of as a field, or as particles called bosons that carry the effect of that field. Thinking about a force in terms of particles helps, because it allows you to visualize Feynman diagrams. While most forces come from Yang-Mills fields whose bosons have spin 1, gravity’s boson, the graviton, has spin 2.

Now you may well ask, how exactly does this relate to the idea that gravity, unlike other forces, is a result of bending space and time?

First, let’s talk about what it means for space itself to be bent. If space is bent, distances are different than they otherwise would be.

Suppose we’ve got some coordinates: x and y. How do we find a distance? We use the Pythagorean Theorem:

d^2=x^2+y^2

Where d is the full distance. If space is bent, the formula changes:

d^2=g_{x}x^2+g_{y}y^2

Here g_{x} and g_{y} come from gravity. In general, they depend on x and y, modifying the formula from point to point and thus “bending” space.
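The bent-space distance formula is easy to play with numerically (a toy sketch of my own, using made-up constant values for g_x and g_y rather than the position-dependent functions real gravity would give):

```python
import math

def distance(x, y, g_x=1.0, g_y=1.0):
    """d^2 = g_x * x^2 + g_y * y^2.
    With g_x = g_y = 1 this is the ordinary Pythagorean theorem;
    other values of g_x and g_y "bend" space by rescaling distances."""
    return math.sqrt(g_x * x**2 + g_y * y**2)

print(distance(3, 4))           # flat space: 5.0
print(distance(3, 4, g_x=2.0))  # "bent" space: the same two points are farther apart
```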

Let’s suppose instead of measuring a distance, we want to measure the momentum of some other particle, which we call \phi because physicists are overly enamored of Greek letters. If p_{x,\phi} is its momentum (physicists also really love subscripts), then its total momentum can be calculated using the Pythagorean Theorem as well:

p_\phi^2= p_{x,\phi}^2+ p_{y,\phi}^2

Or with gravity:

p_\phi^2= g_{x}p_{x,\phi}^2+ g_{y} p_{y,\phi}^2

At the moment, this looks just like the distance formula with a bunch of extra stuff in it. Interpreted another way, though, it becomes instructions for the interactions of the graviton. If g_{x} and g_{y} represent the graviton, then this formula says that one graviton can interact with two \phi particles, like so:

[Feynman diagram: one graviton interacting with two \phi particles]

Saying that gravitons can interact with \phi particles ends up meaning the same thing as saying that gravity changes the way we measure the \phi particle’s total momentum. This is one of the more important things to understand about quantum gravity: the idea that when people talk about exotic things like “gravitons”, they’re really talking about the same theory that Einstein proposed in 1916. There’s nothing scary about describing gravity in terms of particles just like the other forces. The scary bit comes later, as a result of the particular way that quantum calculations with gravity end up. But that’s a tale for another day.

What if there’s nothing new?

In the weeks after the folks at the Large Hadron Collider announced that they had found the Higgs, people I met would ask if I was excited. After all, the Higgs was what particle physicists were searching for, right?

As usual in this blog, the answer is “Not really.”

We were all pretty sure the Higgs had to exist; we just didn’t know what its mass would be. And while many people had predictions for what properties the Higgs might have (including some string theorists), fundamentally they were interested for other reasons.

That reason, for the most part, is supersymmetry. If the Higgs had different properties than we expected, it could be evidence for one or another proposed form of supersymmetry. Supersymmetry is still probably the best explanation for dark matter, and it’s necessary in some form or another for string theory. It also helps with other goals of particle physics, like unifying the fundamental forces and getting rid of fine-tuned parameters.

Fundamentally, though, the Higgs isn’t likely to answer these questions. To get enough useful information we’ll need to discover an actual superpartner particle. And so far…we haven’t.

That’s why we’re not all that excited about the Higgs anymore. And that’s why, increasingly, particle physics is falling into doom and gloom.

Sure, when physicists talk about the situation, they’re quick to claim that they’re just as hopeful as ever. We still may well see supersymmetry in later runs of the LHC, as it still has yet to reach its highest energies. But people are starting, quietly and behind closed doors, to ask: what if we don’t?

What happens if we don’t see any new particles in the LHC?

There are good mathematical reasons to think that some form of supersymmetry holds. Even if we don’t see supersymmetric particles in the LHC, they may still exist. We just won’t know anything new about them.

That’s a problem.

We’ve been spinning our wheels for too long, and it’s becoming more and more obvious. With no new information from experiments, it’s not clear what we can do anymore.

And while, yes, many theorists are studying theories that aren’t true, sometimes without even an inkling of a connection to the real world, we’re all part of the same zeitgeist. We may not be studying reality itself, but at least we’re studying parts of reality, rearranged in novel ways. Without the support of experiment the rest of the field starts to decay. And one by one, those who can are starting to leave.

Despite how it may seem, most of physics doesn’t depend on supersymmetry. If you’re investigating novel materials, or the coolest temperatures ever achieved, or doing other awesome things with lasers, then the LHC’s failure to find supersymmetry will mean absolutely nothing to you. It’s only a rather small area of physics that will progressively fall into self-doubt until the only people left are the insane or the desperate.

But those of us in that area? If there really is nothing new? Yeah, we’re screwed.

Physics and its (Ridiculously One-Sided) Search for a Nemesis

Maybe it’s arrogance, or insecurity. Maybe it’s due to viewing themselves as the arbiters of good and bad science. Perhaps it’s just because, secretly, every physicist dreams of being a supervillain.

Physicists have a rivalry, you see. Whether you want to call it an archenemy, a nemesis, or even a kismesis, there is another field of study that physicists find so antithetical to everything they believe in that it crops up in their darkest and most shameful dreams.

What field of study? Well, pretty much all of them, actually.

Won’t you be my Kismesis?

Chemistry

A professor of mine once expressed the following sentiment:

“I have such respect for chemists. They accomplish so many things, while having no idea what they are doing!”

Disturbingly enough, he actually meant this as a compliment. Physicists’ relationship with chemists is a bit like a sibling rivalry. “Oh, isn’t that cute! He’s just playing with chemicals. Little guy doesn’t know anything about atoms, and yet he’s just sluggin’ away…wait, why is it working? What? How did you…I mean, I could have done that. Sure.”

Biology

They study all that weird, squishy stuff. They get to do better mad science. And somehow they get way more funding than us, probably because the government puts “improving lives” over “more particles”. Luckily, we have a solution to the problem.

Mathematics

Saturday Morning Breakfast Cereal has a pretty good take on this. Mathematicians are rigorous…too rigorous. They never let us have any fun, even when it’s totally fine, and everyone thinks they’re better than us. Well they’re not! Neener neener.

Computer Science

I already covered math, didn’t I?

Engineering

Think about how mathematicians think about physicists, and you’ll know how physicists think about engineers. They mangle our formulas, ignoring our pristine general cases for silly criteria like “ease of use” and “describing the everyday world”. Just lazy!

Philosophy

What do these guys even study? I mean, what’s the point of metaphysics? We’ve covered that, it’s called physics! And why do they keep asking what quantum mechanics means?

These guys have an annoying habit of pointing out moral issues with things like nuclear power plants and worry entirely too much about world-destroying black holes. They’re also our top competition for GRE scores.

Economics

So, what do you guys use real analysis for again? Pretending to be math-based science doesn’t make you rigorous, guys.

Psychology

We point out that surveys probably don’t measure anything, and that you can’t take the average of “agree” and “strongly agree”. Plus, if you’re a science, where is your F=ma?

They point out that we don’t actually know anything about how psychology research actually works, and that we seem to think that all psychologists are Freud. Then they ask us to look at just how fuzzy the plots we get from colliders actually are.

The argument escalates from there, often ending with frenzied makeout sessions.

Geology?  Astronomy?

Hey, we want a nemesis, but we’re not that desperate.

A physicist by any other trade

Physicists have a tendency to stick their noses in other people’s work. We’ve conquered Wall Street (and maybe ruined it), studied communication networks and neural networks, and in a surprising number of cases turned from the study of death to the study of life. Pretty much everyone in physics knows someone who left physics to work on something more interesting, or better-funded, or just straight-up more lucrative. Occasionally, they even remember their roots.

What about the reverse, though? Where are the stories of people in other fields taking up physics?

Aside from a few very early-career examples, that just doesn’t happen. You might say that’s just because physics is hard, but that would be discounting the challenges present in other fields. A better point is that physics is hard, and old.

Physics is arguably the oldest science, with only a few fields like mathematics and astronomy having claim to an older pedigree. A freshman physics student spends their first semester studying ideas that would have been recognizable three hundred years ago.

Of course, the same (and more) could be said about philosophy. The difference is that in physics, we teach ideas from three hundred years ago because we need them to teach ideas from two hundred years ago. And the ideas from two hundred years ago are only there so we can refine them with what we learned a hundred years ago. The purpose of an education in physics, in a sense, is to catch students up with the last three hundred years of work in as concise a manner as possible.

Naturally, this leads to a lot of shortcuts, and over the years an enormous amount of notational cruft has built up around the field, to the point where nothing can be understood without understanding the last three hundred years. In a field where just getting students used to the built-up lingo takes an entire undergraduate education, it’s borderline impossible to just pick it up in the middle and expect to make progress.

Of course, this only explains why people who were trained in other fields don’t take up physics mid-career. What about physicists who go over to other fields? Do they ever come back?

I can’t think of any examples, but I can’t think of a good reason either. Maybe it’s hard to get back into physics after you’ve been gone for a while. Maybe other fields are just so fun, or physics so miserable, that no one ever wants to come back. We shall never know.

There’s something about Symmetry…

Physicists talk a lot about symmetry. Listen to an article about string theory and you might get the idea that symmetry is some sort of mysterious, mystical principle of beauty, inexplicable to the common man or woman.

Well, if it were inexplicable, I wouldn’t be blogging about it, now would I?

Symmetry in physics is dead simple. At the same time, it’s a bit misleading.

When you think of symmetry, you probably think of objects: symmetric faces, symmetric snowflakes, symmetric sculptures. Symmetry in physics can be about objects, but it can also be about places: symmetry is the idea that if you do an experiment from a different point of view, you should get the same results. In a way, this is what makes all of physics possible: two people in two different parts of the world can do the same experiment, but because of symmetry they can compare results and agree on how the world works.

Of course, if that was all there was to symmetry then it would hardly have the mystical reputation it does. The exciting, beautiful, and above all useful thing about symmetry is that, whenever there is a symmetry, there is a conservation law.

A conservation law is a law of physics that states that some quantity is conserved, that is, cannot be created or destroyed, but merely changed from one form to another. Energy is the classic example: you can’t create energy out of nothing, but you can turn the potential energy of gravity on top of a hill into the kinetic energy of a rolling ball, or the chemical energy of coal into the electrical energy in your power lines.
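The hill example is easy to check with numbers (a textbook-style sketch of my own, with made-up values for the mass and the height):

```python
import math

m = 2.0    # mass of the ball in kg (made-up value)
g = 9.81   # gravitational acceleration in m/s^2
h = 5.0    # height of the hill in m (made-up value)

potential_top = m * g * h         # energy stored at the top of the hill
v_bottom = math.sqrt(2 * g * h)   # speed at the bottom, ignoring friction
kinetic_bottom = 0.5 * m * v_bottom**2

# The total is conserved: what was potential energy is now kinetic energy
print(potential_top, kinetic_bottom)
```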

The fact that every symmetry creates a conservation law is not obvious. Proving it in general and describing how it works required a major breakthrough in mathematics. It was worked out by Emmy Noether, one of the greatest minds of her time, which given that her time included Einstein says rather a lot. Noether struggled for most of her life with the male-dominated establishment of academia, and spent many years teaching unpaid and under the names of male faculty, forbidden from being a professor because of her gender.

Why must women always be banished to the Noether regions of physics?

Noether’s proof is remarkable, but if you’re not familiar with the mathematics it won’t mean much to you. If you want to get a feel for the connection between symmetries and conservation laws, you need to go back a bit further. For the best example, we need to go all the way back to the dawn of physics.

Christiaan Huygens was a contemporary of Isaac Newton, and like Noether he was arguably as smart as, if not smarter than, his more famous colleague. Huygens could be described as the first theoretical physicist. Long before Newton wrote down his three laws of motion, Huygens used thought experiments to prove deep facts about physics, and he did it using symmetry.

In one of Huygens’ thought experiments, two men face each other, one standing on a boat and the other on the bank of a river. The men grab onto each other’s hands, and dangle a ball on a string from each pair of hands. In this way, it is impossible to tell which man is moving each ball.

Stop hitting yourself!

From the man on the bank’s perspective, he moves the two balls toward each other at the same speed, which happens to be the same speed as the river. The balls are the same size (and mass), so as far as he can see they should bounce back with the same speed afterwards as well.

On the other hand, the man in the boat thinks that he’s only moving one ball. Since the man on the bank is moving one of the balls along at the same speed as the river, from the man on the boat’s perspective that ball is just staying still, while the other ball is moving with twice the speed of the river. If the man on the bank sees the balls bounce off of each other at equal speed, then the man on the boat will see the moving ball stop, and the ball that was staying still start to move with the same speed as the original ball. From what he could see, a moving ball hit a ball at rest, and transferred its entire momentum to the new ball.
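The two viewpoints can be sketched in a few lines of code (my own toy version: equal masses, with speeds measured in units of the river’s speed):

```python
def boost(velocities, frame_velocity):
    """Shift a list of velocities into a frame moving at frame_velocity
    (an ordinary Galilean change of viewpoint)."""
    return [v - frame_velocity for v in velocities]

# Bank frame: by symmetry, two identical balls approaching at equal speeds
# bounce back, simply exchanging their velocities.
before_bank = [+1.0, -1.0]
after_bank = [-1.0, +1.0]

# Boat frame: the boat drifts with the river at speed +1. Before the
# collision one ball sits at rest; afterwards the moving ball has stopped
# and handed its entire momentum to the other.
before_boat = boost(before_bank, +1.0)  # [0.0, -2.0]
after_boat = boost(after_bank, +1.0)    # [-2.0, 0.0]

# With equal masses (m = 1), total momentum is conserved in both frames.
print(sum(before_bank), sum(after_bank))  # 0.0 0.0
print(sum(before_boat), sum(after_boat))  # -2.0 -2.0
```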

Using arguments like these, Huygens developed the idea of conservation of momentum, the idea of a number related to an object’s mass and speed that can never be created or destroyed, only transferred from one object to another. And he did it using symmetry. At heart, his arguments showed that momentum, the mysterious “quantity of motion”, was merely a natural consequence of the fact that two people can look at a situation in two different ways. And it is that fact, and the power that fact has to explain the world, that makes physicists so obsessed with symmetry.