Monthly Archives: July 2018

Journalists Need to Adapt to Preprints, Not Ignore Them

Nature has an article making the rounds this week, decrying the dangers of preprints.

On the surface, this is a bit like an article by foxes decrying the dangers of henhouses. There’s a pretty big conflict of interest when a journal like Nature, which makes huge amounts of money out of research that scientists would be happy to publish for free, gets snippy about scientists sharing their work elsewhere. I was expecting an article about how “important” the peer review process is, how we can’t just “let anyone” publish, and the like.

Instead, I was pleasantly surprised. The article is about a real challenge, the weakening of journalistic embargoes. While this is still a problem I think journalists can think their way around, it’s a bit subtler than the usual argument.

For the record, peer review is usually presented as much more important than it actually is. When a scientific article gets submitted to a journal, it gets sent to two or three experts in the field for comment. In the best cases, these experts read the paper carefully and send criticism back. They don’t replicate the experiments, they don’t even (except for a few heroic souls) reproduce the calculations. That kind of careful reading is important, but it’s hardly unique: it’s something scientists do on their own when they want to build off of someone else’s paper, and it’s what good journalists get when they send a paper to experts for comments before writing an article. If peer review in a journal is important, it’s to ensure that this careful reading happens at least once, a sort of minimal evidence that the paper is good enough to appear on a scientist’s CV.

The Nature article points out that peer review serves another purpose, specifically one of delay. While a journal is preparing to publish an article they can send it out to journalists, after making them sign an agreement (an embargo) that they won’t tell the public until the journal publishes. This gives the journalists a bit of lead time, so the more responsible ones can research and fact-check before publishing.

Open-access preprints cut out the lead time. If the paper just appears online with no warning and no embargoes, journalists can write about it immediately. The unethical journalists can skip fact-checking and publish first, and the ethical ones have to follow soon after, or risk publishing “old news”. Nobody gets the time to properly vet, or understand, a new paper.

There’s a simple solution I’ve seen from a few folks on Twitter: “Don’t be an unethical journalist!” That doesn’t actually solve the problem though. The question is, if you’re an ethical journalist, but other people are unethical journalists, what do you do?

Apparently, what some ethical journalists do is to carry on as if preprints didn’t exist. The Nature article describes journalists who, after a preprint has been covered extensively by others, wait until a journal publishes it and then cover it as if nothing had happened. The article frames this as virtuous, but doomed: journalists sticking to their ethics even if it means publishing “old news”.

To be 100% clear here, this is not virtuous. If you present a paper’s publication in a journal as news, when it was already released as a preprint, you are actively misleading the public. I can’t count the number of times I’ve gotten messages from readers, confused because they saw a scientific result covered again months later and thought it was new. It leads to a sort of mental “double-counting”, where the public assumes that the scientific result was found twice, and therefore that it’s more solid. Unless the publication itself is unexpected (something that wasn’t expected to pass peer review, or something controversial like Mochizuki’s proof of the ABC conjecture), mere publication in a journal of an already-public result is not news.

What science journalists need to do here is to step back, and think about how their colleagues cover stories. Current events these days don’t have embargoes; they aren’t fed through carefully managed press releases. There’s a flurry of initial coverage, and it gets things wrong and misses details and misleads people, because science isn’t the only field that’s complicated: real life is complicated. Journalists have adapted to this schedule, mostly, by specializing. Some journalists and news outlets cover breaking news as it happens; others cover it later with more in-depth analysis. Crucially, the latter journalists don’t present the topic as new. They write explicitly in the light of previous news, as a response to existing discussion. That way, the public isn’t misled, and their existing misunderstandings can be corrected.

The Nature article brings up public health, and other topics where misunderstandings can do lasting damage, as areas where embargoes are useful. While I agree, I would hope many of these areas would figure out embargoes on their own. My field certainly does: the big results of scientific collaborations aren’t just put online as preprints; they’re released only after the collaboration sets up its own journalistic embargoes and prepares its own press releases. In a world of preprints, this sort of practice needs to happen for important, controversial public health and environmental results as well. Unethical scientists might still release too fast, to keep journalists from fact-checking, but they could do that anyway, without preprints. You don’t need a preprint to call a journalist on the phone and claim you cured cancer.

As open-access preprints become the norm, journalists will have to adapt. I’m confident they will be able to, but only if they stop treating science journalism as unique, and start treating it as news. Science journalism isn’t teaching: you’re not just passing down facts someone else has vetted. You’re asking the same questions as any other journalist: who did what? And what really happened? If you can do that, preprints shouldn’t be scary.

The Physics Isn’t New, We Are

Last week, I mentioned the announcement from the IceCube, Fermi-LAT, and MAGIC collaborations of high-energy neutrinos and gamma rays detected from the same source, the blazar TXS 0506+056. Blazars are intense sources of gamma rays, thought to be powered by enormous spinning black holes that act like particle colliders vastly more powerful than the LHC. This one, near Orion’s elbow, is “aimed” roughly at Earth, allowing us to detect the light and particles it emits. On September 22, 2017, a neutrino with energy around 300 TeV was detected by IceCube (a kilometer-wide block of Antarctic ice stuffed with detectors), coming from the direction of TXS 0506+056. Soon after, the satellite Fermi-LAT and the ground-based telescope MAGIC were able to confirm that the blazar TXS 0506+056 was flaring at the time. The IceCube team then looked back, and found more neutrinos coming from the same source in earlier years. There are still lingering questions (why didn’t they see this kind of behavior from other, closer blazars?), but it’s still a nice development in the emerging field of “multi-messenger” astronomy.

It also got me thinking about a conversation I had a while back, before one of Perimeter’s Public Lectures. An elderly fellow was worried about the LHC. He wondered if putting all of that energy in the same place, again and again, might do something unprecedented: weaken the fabric of space and time, perhaps, until it breaks? He acknowledged this didn’t make physical sense, but what if we’re wrong about the physics? Do we really want to take that risk?

At the time, I made the same point that gets made to counter fears of the LHC creating a black hole: that the energy of the LHC is less than the energy of cosmic rays, particles from space that collide with our atmosphere on a regular basis. If there was any danger, it would have already happened. Now, knowing about blazars, I can make a similar point: there are “galactic colliders” with energies so much higher than any machine we can build that there’s no chance we could screw things up on that kind of scale. If we could, they already would have.
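For a rough sense of scale (these are my own back-of-envelope numbers, not figures from any of the announcements): the highest-energy cosmic rays ever observed carry around $10^{20}$ eV, or $10^{11}$ GeV. When one of them hits a proton at rest in the atmosphere, the energy that matters for “doing something unprecedented” is the center-of-mass energy,

\[
\sqrt{s} \;\approx\; \sqrt{2\,E_{\text{cosmic}}\, m_p c^2} \;\approx\; \sqrt{2 \times 10^{11}\ \text{GeV} \times 0.94\ \text{GeV}} \;\approx\; 4 \times 10^{5}\ \text{GeV} \;\approx\; 400\ \text{TeV},
\]

compared to the LHC’s roughly 13 TeV. Nature has been running collisions like these for billions of years without breaking anything.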

This connects to a broader point, about how to frame particle physics. Each time we build an experiment, we’re replicating something that’s happened before. Our technology simply isn’t powerful enough to do something truly unprecedented in the universe: we’re not even close! Instead, the point of an experiment is to reproduce something where we can see it. It’s not the physics itself, but our involvement in it, our understanding of it, that’s genuinely new.

The IceCube experiment itself is a great example of this: throughout Antarctica, neutrinos collide with ice. The only difference is that in IceCube’s ice, we can see them do it. More broadly, I have to wonder how much this is behind the “unreasonable effectiveness of mathematics”: if mathematics is just the most precise way humans have to communicate with each other, then of course it will be effective in physics, since the goal of physics is to communicate the nature of the world to humans!

There may well come a day when we’re really able to do something truly unprecedented, that has never been done before in the history of the universe. Until then, we’re playing catch-up, taking laws the universe has tested extensively and making them legible, getting humanity that much closer to understanding physics that, somewhere out there, already exists.

Conferences Are Work! Who Knew?

I’ve been traveling for over a month now, from conference to conference, with a bit of vacation thrown in at the end.

(As such, I haven’t had time to read up on the recent announcement of the detection of neutrinos and high-energy photons from a blazar; Matt Strassler has a nice piece on it.)

One thing I didn’t expect was how exhausting going to three conferences in a row would be. I didn’t give any talks this time around, so I thought I was skipping the “work” part. But sitting in a room for talk after talk, listening and taking notes, turns out to still be work! There’s effort involved in paying attention, especially in a scientific talk where the details matter. You assess the talks in your head, turning concepts around and thinking about what you might do with them. It’s the kind of thing you don’t notice for a seminar or two, but at a conference, after a while, it really builds up. After three, let’s just say I’ve really needed this vacation. I’ll be back at work next week, and maybe I’ll have a longer blog post for you folks. Until then, I ought to get some rest!

Why a New Particle Matters

A while back, when the MiniBooNE experiment announced evidence for a sterile neutrino, I was excited. It’s still not clear whether they really found something; here’s an article laying out the current status. If they did, it would be a new particle beyond those predicted by the Standard Model, something like the neutrinos but which doesn’t interact with any of the fundamental forces except gravity.

At the time, someone asked me why this was so exciting. Does it solve the mystery of dark matter, or any other long-standing problems?

The sterile neutrino MiniBooNE is suggesting isn’t, as far as I’m aware, a plausible candidate for dark matter. It doesn’t solve any long-standing problems (for example, it doesn’t explain why the other neutrinos are so much lighter than other particles). It would even introduce new problems of its own!

It still matters, though. One reason, which I’ve talked about before, is that each new type of particle implies a new law of nature, a basic truth about the universe that we didn’t know before. But there’s another reason why a new particle matters.

There’s a malaise in particle physics. For most of the twentieth century, theory and experiment were tightly linked. Unexpected experimental results would demand new theory, which would in turn suggest new experiments, driving knowledge forward. That mostly stopped with the Standard Model. There are a few lingering anomalies, like the phenomena we attribute to dark matter, that show the Standard Model can’t be the full story. But as long as every other experiment fits the Standard Model, we have no useful hints about where to go next. We’re just speculating, and too much of that warps the field.

Critics of the physics mainstream pick up on this, but I’m not optimistic about what I’ve seen of their solutions. Peter Woit has suggested that physics should emulate the culture of mathematics, caring more about rigor and being more careful to confirm things before speaking. The title of Sabine Hossenfelder’s “Lost in Math” might suggest the opposite, but I get the impression she’s arguing for something similar: that particle physicists have been using sloppy arguments and should clean up their act, taking foundational problems seriously and talking to philosophers to help clarify their ideas.

Rigor and clarity are worthwhile, but the problems they’ll solve aren’t the ones causing the malaise. If there are problems we can expect to solve just by thinking better, they’re problems that we found by thinking in the first place: quantum gravity theories that stop making sense at very high energies, paradoxical thought experiments with black holes. There, rigor and clarity can matter: to some extent they’re already present, but I can appreciate the argument that it’s still not nearly enough.

What rigor and clarity won’t do is make physics feel (and function) like it did in the twentieth century. For that, we need new evidence: experiments that disobey the Standard Model, and do it in a clear enough way that we can’t just chalk it up to predictable errors. We need a new particle, or something like it. Without that, our theories are most likely underdetermined by the data, and anything we propose is going to be subjective. Our subjective judgements may get better, we may get rid of the worst-justified biases, but at the end of the day we still won’t have enough information to actually make durable progress.

That’s not a popular message, in part, because it’s not something we can control. There’s a degree of helplessness in realizing that if nature doesn’t throw us a bone then we’ll probably just keep going in circles forever. It’s not the kind of thing that lends itself to a pithy blog post.

If there’s something we can do, it’s to keep our eyes as open as possible, to make sure we don’t miss nature’s next hint. It’s why people are getting excited about low-energy experiments, about precision calculations, about LIGO. Even this seemingly clickbaity proposal that dark matter killed the dinosaurs is motivated by the same sort of logic: if the only evidence for dark matter we have is gravitational, what can gravitational evidence tell us about what it’s made of? In each case, we’re trying to widen our net, to see new phenomena we might have missed.

I suspect that’s why this reviewer was disappointed that Hossenfelder’s book lacked a vision for the future. It’s not that the book lacked any proposals whatsoever. But it lacked this kind of proposal, of a new place to look, where new evidence, and maybe a new particle, might be found. Without that we can still improve things, we can still make progress on deep fundamental mathematical questions, we can kill off the stupidest of the stupid arguments. But the malaise won’t lift, and we won’t get back to the health of twentieth century physics. For that, we need to see something new.