Monthly Archives: March 2019

Hexagon Functions V: Seventh Heaven

I’ve got a new paper out this week, a continuation of a story that has threaded through my career since grad school. With a growing collaboration (now Simon Caron-Huot, Lance Dixon, Falko Dulat, Andrew McLeod, and Georgios Papathanasiou) I’ve been calculating six-particle scattering amplitudes in my favorite theory-that-does-not-describe-the-real-world, N=4 super Yang-Mills. We’ve been pushing to more and more “loops”: tougher and tougher calculations that approximate the full answer better and better, using the “word jumble” trick I talked about in Scientific American. And each time, we learn something new.

Now we’re up to seven loops for some types of particles, and six loops for the rest. In older blog posts I talked in megabytes: half a megabyte for three loops, 15 MB for four loops, 300 MB for five loops. I don’t have a number like that for six and seven loops: we don’t store the result that way anymore; it got too cumbersome. We have to store it in a simplified form, and even that takes 80 MB.

Some of what we learned has to do with the types of mathematical functions that we need: our “guess” for the result at each loop. We’ve honed that guess down a lot, and discovered some new simplifications along the way. I won’t tell that story here (except to hint that it has to do with “cosmic Galois theory”) because we haven’t published it yet. It will be out in a companion paper soon.

This paper focused on the next step, going from our guess to the correct six- and seven-loop answers. Here too there were surprises. For the last several loops, we’d observed a surprisingly nice pattern: different configurations of particles with different numbers of loops were related, in a way we didn’t know how to explain. The pattern stuck around at five loops, so we assumed it was the real deal, and guessed the new answer would obey it too.

Yes, in our field this counts as surprisingly nice

Usually when scientists tell this kind of story, the pattern works, it’s a breakthrough, everyone gets a Nobel prize, etc. This time? Nope!

The pattern failed. And it failed in a way that was surprisingly difficult to detect.

The way we calculate these things, we start with a guess and then add what we know. If we know something about how the particles behave at high energies, or when they get close together, we use that to pare down our guess, getting rid of pieces that don’t fit. We kept adding these pieces of information, and each time the pattern seemed ok. It was only when we got far enough into one of these approximations that we noticed a piece that didn’t fit.

That piece was a surprisingly stealthy mathematical function, one that hid from almost every test we could perform. There aren’t any functions like that at lower loops, so we never had to worry about this before. But now, in the rarefied land of six-loop calculations, they finally start to show up.

We have another pattern, similar to the old one, that isn’t broken yet. But at this point we’re cautious: things get strange as calculations get more complicated, and sometimes the nice simplifications we notice are just accidents. It’s always important to check.

Deep physics or six-loop accident? You decide!

This result was a long time coming. Coordinating a large project with such a widely spread collaboration is difficult, and sometimes frustrating. People get distracted by other projects, they disagree about what the paper should say, and even scheduling Skype calls across everyone’s time zones is a challenge. I’m more than a little exhausted, but happy that the paper is out, and that we’re close to finishing the companion paper as well. It’s good to have results that we’ve been hinting at in talks finally out where the community can see them. Maybe they’ll notice something new!


Amplitudes in String and Field Theory at NBI

There’s a conference at the Niels Bohr Institute this week, on Amplitudes in String and Field Theory. Like the conference a few weeks back, this one was funded by the Simons Foundation, as part of Michael Green’s visit here.

The first day featured a two-part talk by Michael Green and Congkao Wen. They are looking at the corrections that string theory adds on top of theories of supergravity. These corrections are difficult to calculate directly from string theory, but one can figure out a lot about them from the kinds of symmetry and duality properties they need to have, using the mathematics of modular forms. While Michael’s talk introduced the topic with a discussion of older work, Congkao talked about their recent progress looking at this from an amplitudes perspective.

Francesca Ferrari’s talk on Tuesday also related to modular forms, while Oliver Schlotterer and Pierre Vanhove talked about a different corner of mathematics, single-valued polylogarithms. These single-valued polylogarithms are of interest to string theorists because they seem to connect two parts of string theory: the open strings that describe Yang-Mills forces and the closed strings that describe gravity. In particular, it looks like you can take a calculation in open string theory and just replace numbers and polylogarithms with their “single-valued counterparts” to get the same calculation in closed string theory. Interestingly, there is more than one way that mathematicians can define “single-valued counterparts”, but only one such definition, the one due to Francis Brown, seems to make this trick work. When I asked Pierre about this he quipped it was because “Francis Brown has good taste…either that, or String Theory has good taste.”

Wednesday saw several talks exploring interesting features of string theory. Nathan Berkovits discussed his new paper, which makes a certain instance of AdS/CFT (a duality between string theory in certain curved spaces and field theory on the boundary of those spaces) particularly manifest. By writing string theory in five-dimensional AdS space in the right way, he can show that if the AdS space is small it will generate the same Feynman diagrams that one would use to do calculations in N=4 super Yang-Mills. In the afternoon, Sameer Murthy showed how localization techniques can be used in gravity theories, including to calculate the entropy of black holes in string theory, while Yvonne Geyer talked about how to combine the string theory-like CHY method for calculating amplitudes with supersymmetry, especially in higher dimensions where the relevant mathematics gets tricky.

Thursday ended up focused on field theory. Carlos Mafra was originally going to speak but he wasn’t feeling well, so instead I gave a talk about the “tardigrade” integrals I’ve been looking at. Zvi Bern talked about his work applying amplitudes techniques to make predictions for LIGO. This subject has advanced a lot in the last few years, and now Zvi and collaborators have finally done a calculation beyond what others had been able to do with older methods. They still have a way to go before they beat the traditional methods overall, but they’re off to a great start. Lance Dixon talked about two-loop five-particle non-planar amplitudes in N=4 super Yang-Mills and N=8 supergravity. These are quite a bit trickier than the planar amplitudes I’ve worked on with him in the past, in particular it’s not yet possible to do this just by guessing the answer without considering Feynman diagrams.

Today was the last day of the conference, and the emphasis was on number theory. David Broadhurst described some interesting contributions from physics to mathematics, in particular emphasizing information that the Weierstrass formulation of elliptic curves omits. Eric D’Hoker discussed how the concept of transcendentality, previously used in field theory, could be applied to string theory. A few of his speculations seemed a bit farfetched (in particular, his setup needs to treat certain rational numbers as if they were transcendental), but after his talk I’m a bit more optimistic that there could be something useful there.

Pi Day Alternatives

On Pi Day, fans of the number pi gather to recite its digits and eat pies. It is the most famous of numerical holidays, but not the only one. Have you heard of the holidays for other famous numbers?

Tau Day: Celebrated on June 28. Observed by sitting around gloating about how much more rational one is than everyone else, then getting treated with high-energy tau leptons for terminal pedantry.

Canadian Modular Pi Day: Celebrated on February 3. Observed by confusing your American friends.

e Day: Celebrated on February 7. Observed in middle school classrooms, explaining the wonders of exponential functions and eating foods like eggs and eclairs. Once the students leave, drop tabs of ecstasy instead.

Golden Ratio Day: Celebrated on January 6. Rub crystals on pyramids and write vaguely threatening handwritten letters to every physicist you’ve heard of.

Euler Gamma Day: Celebrated on May 7 by dropping on the floor and twitching.

Riemann Zeta Daze: The first year, forget about it. The second, celebrate on January 6. The next year, January 2. After that, celebrate on New Year’s Day earlier and earlier in the morning each year until you can’t tell the difference any more.
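For the curious, those dates track the actual values of the Riemann zeta function: ζ(1) is the divergent harmonic series, ζ(2) = π²/6 ≈ 1.6449, ζ(3) ≈ 1.2021, and ζ(s) creeps toward 1 as s grows. A minimal sketch using plain partial sums (no special libraries) shows the drift:

```python
def zeta(s, terms=1_000_000):
    """Crude partial-sum approximation of the Riemann zeta function, s > 1."""
    return sum(1 / n**s for n in range(1, terms + 1))

# zeta(2) = pi^2/6 ~ 1.6449 -> January 6
# zeta(3) ~ 1.2021          -> January 2
# As s grows, zeta(s) -> 1: the holiday slides toward New Year's Day.
for s in [2, 3, 4, 10]:
    print(s, round(zeta(s), 4))
```

(The partial sum converges slowly for s = 2, so the million terms are doing real work there; for larger s almost any cutoff would do.)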

A Field That Doesn’t Read Its Journals

Last week, the University of California system ended negotiations with Elsevier, one of the top academic journal publishers. UC had been trying to get Elsevier to switch to a new type of contract, one in which instead of paying for access to journals they pay for their faculty to publish, then make all the results openly accessible to the public. In the end they couldn’t reach an agreement and thus didn’t renew their contract, cutting Elsevier off from millions of dollars and their faculty from reading certain (mostly recent) Elsevier journal articles. There’s a nice interview here with one of the librarians who was sent to negotiate the deal.

I’m optimistic about what UC was trying to do. Their proposal sounds like it addresses some of the concerns raised here with open-access systems. Currently, journals that offer open access often charge fees directly to the scientists publishing in them, fees that have to be scrounged up from somebody’s grant at the last minute. By setting up a deal for all their faculty together, UC would have avoided that. While the deal fell through, having an organization as big as the whole University of California system advocating open access (and putting the squeeze on Elsevier’s profits) seems like it can only lead to progress.

The whole situation feels a little surreal, though, when I compare it to my own field.

At the risk of jinxing it, my field’s relationship with journals is even weirder than xkcd says.

arXiv.org is a website that hosts what are called “preprints”, which originally meant papers that haven’t been published yet. They’re online, freely accessible to anyone who wants to read them, and will be for as long as arXiv exists to host them. Essentially everything anyone publishes in my field ends up on arXiv.

Journals don’t mind, in part, because many of them are open-access anyway. There’s an organization, SCOAP3, that runs what is in some sense a large-scale version of what UC was trying to set up: instead of paying for subscriptions, university libraries pay SCOAP3 and it covers the journals’ publication costs.

This means that there are two coexisting open-access systems, the journals themselves and arXiv. But in practice, arXiv is the one we actually use.

If I want to show a student a paper, I don’t send them to the library or the journal website, I tell them how to find it on arXiv. If I’m giving a talk, there usually isn’t room for a journal reference, so I’ll give the arXiv number instead. In a paper, we do give references to journals…but they’re most useful when they have arXiv links as well. I think the only times I’ve actually read an article in a journal were for articles so old that arXiv didn’t exist when they were published.

We still submit our papers to journals, though. Peer review still matters, we still want to determine whether our results are cool enough for the fancy journals or only good enough for the ordinary ones. We still put journal citations on our CVs so employers and grant agencies know not only what we’ve done, but which reviewers liked it.

But the actual copy-editing and formatting and publishing, that the journals still employ people to do? Mostly, it never gets read.

In my experience, that editing isn’t too impressive. Often, it’s about changing things to fit the journal’s preferences: its layout, its conventions, its inconvenient proprietary document formats. I haven’t seen them try to fix grammar, or improve phrasing. Maybe my papers have unusually good grammar, maybe they do more for other papers. And maybe they used to do more, when journals had a more central role. But now, they don’t change much.

Sometimes the journal version ends up on arXiv, if the authors put it there. Sometimes it doesn’t. And sometimes the result is in between. For my last paper about Calabi-Yau manifolds in Feynman diagrams, we got several helpful comments from the reviewers, but the journal also weighed in to get us to remove our more whimsical language, down to the word “bestiary”. For the final arXiv version, we updated for the reviewer comments, but kept the whimsical words. In practice, that version is the one people in our field will read.

This has some awkward effects. It means that sometimes important corrections don’t end up on arXiv, and people don’t see them. It means that technically, if someone wanted to insist on keeping an incorrect paper online, they could, even if a corrected version was published in a journal. And of course, it means that a large amount of effort is dedicated to publishing journal articles that very few people read.

I don’t know whether other fields could get away with this kind of system. Physics is small. It’s small enough that it’s not so hard to get corrections from authors when one needs to, small enough that social pressure can get wrong results corrected. It’s small enough that arXiv and SCOAP3 can exist, funded by universities and private foundations. A bigger field might not be able to do any of that.

For physicists, we should keep in mind that our system can and should still be improved. For other fields, it’s worth considering whether you can move in this direction, and what it would cost to do so. Academic publishing is in a pretty bizarre place right now, but hopefully we can get it to a better one.

Hadronic Strings and Large-N Field Theory at NBI

One of string theory’s early pioneers, Michael Green, is currently visiting the Niels Bohr Institute as part of a program by the Simons Foundation. The program includes a series of conferences. This week we are having the first such conference, on Hadronic Strings and Large-N Field Theory.

The bulk of the conference focused on new progress on an old subject, using string theory to model the behavior of quarks and gluons. There were a variety of approaches on offer, some focusing on particular approximations and others attempting to construct broader, “phenomenological” models.

The other talks came from a variety of subjects, loosely tied together by the topic of “large N field theories”. “N” here is the number of colors: while the real world has three “colors” of quarks, you can imagine a world with more. This leads to simpler calculations, and often to connections with string theory. Some talks dealt with attempts to “solve” certain large-N theories exactly. Others ranged farther afield, even to discussions of colliding black holes.