Tag Archives: condensed matter

Some FAQ for Microsoft’s Majorana 1 Chip

Recently, Microsoft announced a fancy new quantum computing chip called Majorana 1. I’ve noticed quite a bit of confusion about what they actually announced, and while there’s a great FAQ page about it on the quantum computing blog Shtetl-Optimized, the post there aims at a higher level, assuming you already know the basics. You can think of this post as a complement to that one, one that tries to cover some basic things Shtetl-Optimized took for granted.

Q: In the announcement, Microsoft said:

“It leverages the world’s first topoconductor, a breakthrough type of material which can observe and control Majorana particles to produce more reliable and scalable qubits, which are the building blocks for quantum computers.”

That sounds wild! Are they really using particles in a computer?

A: All computers use particles. Electrons are particles!

Q: You know what I mean!

A: You’re asking if these are “particle physics” particles, like the weird types they try to observe at the LHC?

No, they’re not.

Particle physicists use a mathematical framework called quantum field theory, where particles are ripples in things called quantum fields that describe properties of the universe. But they aren’t the only people to use that framework. Instead of studying properties of the universe, you can study properties of materials: strange alloys and layers of metal and crystal that do weird and useful things. The properties of these materials can be approximately described with the same math, with quantum fields. Just as the properties of the universe ripple to produce particles, the properties of materials ripple to produce what are called quasiparticles. Ultimately, these quasiparticles come down to movements of ordinary matter, usually electrons in the original material. They’re just described with a kind of math that makes them look like their own particles.

Q: So, what are these Majorana particles supposed to be?

A: In quantum field theory, most particles come with an antimatter partner. Electrons, for example, have partners called positrons, with a positive electric charge instead of a negative one. These antimatter partners have to exist due to the math of quantum field theory, but there is a way out: some particles are their own antimatter partner, letting one particle cover both roles. This happens for some “particle physics particles”, but all the examples we’ve found are a type of particle called a “boson”, particles related to forces. In 1937, the physicist Ettore Majorana figured out the math you would need for a particle like this that was instead a fermion, the other main type of particle, the type that includes electrons and protons. So far, we haven’t found one of these Majorana fermions in nature, though some people think the elusive neutrino particles could be an example. Others, though, have tried instead to find a material described by Majorana’s theory. This should in principle be easier; you can build a lot of different materials, after all. But it’s proven quite hard to do. Back in 2018, Microsoft claimed they’d managed this, but had to retract the claim. This time, they seem more confident, though the scientific community is still not convinced.
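If you’re curious what “its own antimatter partner” looks like in equations, here’s a rough sketch of the standard condensed matter version (a generic textbook formulation, not necessarily exactly what Microsoft uses). Start from an operator c that removes an electron from the material and its conjugate c^\dagger that adds one, and combine them into two new operators:

    \gamma_1 = c + c^\dagger, \qquad \gamma_2 = i\,(c^\dagger - c)

Each of these is its own conjugate, \gamma_1^\dagger = \gamma_1 and \gamma_2^\dagger = \gamma_2: for these quasiparticles, “adding” one and “removing” one are the same operation, which is the material-world version of a particle being its own antiparticle.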

Q: And what’s this topoconductor they’re talking about?

A: Topoconductor is short for topological superconductor. Superconductors are materials that, below a certain temperature, conduct electricity with no resistance at all, far better than ordinary metals.

Q: And, topological means? Something about donuts, right?

A: If you’ve heard anything about topology, you’ve heard that it’s a type of mathematics where donuts are equivalent to coffee cups. You might have seen an animation of a coffee cup being squished and mushed around until the ring of the handle becomes the ring of a donut.

This isn’t actually the important part of topology. The important part is that, in topology, a ball is not equivalent to a donut.

Topology is the study of which things can change smoothly into one another. If you want to change a donut into a ball, you have to slice through the donut’s ring or break the surface inside. You can’t smoothly change one into the other. Topologists study the shapes of many different kinds of things, figuring out which ones can be changed into each other smoothly and which can’t.

Q: What does any of that have to do with quantum computers?

A: The shapes topologists study aren’t always as simple as donuts and coffee cups. They can also study the shape of quantum fields, figuring out which types of quantum fields can change smoothly into each other and which can’t.

The idea of topological quantum computation is to use those rules about what can change into each other to encode information. You can imagine a ball encoding zero, and a donut encoding one. A coffee cup would then also encode one, because it can change smoothly into a donut, while a box would encode zero because you can squash the corners to make it a ball. This helps, because it means that you don’t screw up your information by making smooth changes. If you accidentally drop your box that encodes zero and squish a corner, it will still encode zero.
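If you like code, here’s a toy sketch in Python (my own illustration with made-up numbers, nothing like how real topological qubits are built). It stores a bit in the Euler characteristic V − E + F of a surface cut into triangles, a number topology guarantees can’t change under smooth deformations. The “deformation” below just splits every triangle into four: every count changes, but the encoded bit survives.

    # Toy example: store a bit in a topological invariant.
    # A closed surface triangulated into V vertices, E edges, F faces
    # has Euler characteristic V - E + F: 2 for a sphere, 0 for a torus.

    def encode_bit(v, e, f):
        """Genus of the surface: 0 for ball-like shapes, 1 for donuts."""
        return (2 - (v - e + f)) // 2

    def deform(v, e, f):
        """Stand-in for a smooth change: split every triangle into four.
        Every count changes, but V - E + F stays the same."""
        return v + e, 2 * e + 3 * f, 4 * f

    cube = (8, 12, 6)    # surface of a cube: topologically a ball
    torus = (7, 21, 14)  # Csaszar polyhedron: a triangulated donut

    for shape in (cube, torus):
        deformed = deform(*deform(*shape))
        print(encode_bit(*shape), encode_bit(*deformed))  # bit unchanged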

This matters in quantum computing because it is very easy to screw up quantum information. Quantum computers are very delicate, and making them work reliably has been immensely challenging, requiring people to build much bigger quantum computers so they can do each calculation with many redundant backups. The hope is that topological superconductors would make this easier, by encoding information in a way that is hard to accidentally change.

Q: Cool. So does that mean Microsoft has the best quantum computer now?

A: The machine Microsoft just announced has only a single qubit, the quantum equivalent of just a single bit of computer memory. At this point, it can’t do any calculations. It can just be read, giving one or zero. The hope is that the power of the new method will let Microsoft catch up with companies that have computers with hundreds of qubits, and help them arrive faster at the millions of qubits that will be needed to do anything useful.

Q: Ah, ok. But it sounds like they accomplished some crazy Majorana stuff at least, right?

A: Umm…

Read the Shtetl-Optimized FAQ if you want more details. The short answer is that this is still controversial. So far, the evidence they’ve made public isn’t enough to show that they found these Majorana quasiparticles, or that they made a topological superconductor. They say they have more recent evidence that they haven’t published yet. We’ll see.

Congratulations to John Hopfield and Geoffrey Hinton!

The 2024 Physics Nobel Prize was announced this week, awarded to John Hopfield and Geoffrey Hinton for using physics to propose foundational ideas in the artificial neural networks used for machine learning.

This is the first time since 2015 that the Physics Nobel has been given to two, rather than three, people. Since several past prizes bundled together disparate ideas in order to make a full group of three, it’s noteworthy that this year the committee decided that each of these people deserved 1/2 the prize amount, without trying to find one more person to water it down further.

Hopfield was trained as a physicist, working in the broad area known as “condensed matter physics”. Condensed matter physicists use physics to describe materials, from semiconductors to crystals to glass. Over the years, Hopfield started using this training less for the traditional subject matter of the field and more to study the properties of living systems. He moved from a position in the physics department at Princeton to chemistry and biology at Caltech. While at Caltech he started studying neuroscience and proposed what are now known as Hopfield networks as a model for how neurons store memory. Hopfield networks have very similar properties to a more traditional condensed matter system called a “spin glass”, and from what he knew about those systems Hopfield could make predictions for how his networks would behave. Those networks would go on to be a major inspiration for the artificial neural networks used for machine learning today.
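For a taste of how simple the core idea is, here’s a minimal Hopfield network sketch in Python (my own toy version with made-up numbers, not Hopfield’s original notation). The units behave like spins that point up (+1) or down (−1): storing a pattern strengthens the couplings between units that agree in it, and recalling means letting each unit align with the pull of the others, like a spin glass settling toward low energy.

    import numpy as np

    def train(patterns):
        """Hebbian rule: couple pairs of units that agree in the
        stored patterns (rows of +1/-1 values)."""
        n = patterns.shape[1]
        w = patterns.T @ patterns / n
        np.fill_diagonal(w, 0)  # no unit couples to itself
        return w

    def recall(w, state, steps=10):
        """Let each unit repeatedly align with its local field."""
        for _ in range(steps):
            state = np.where(w @ state >= 0, 1.0, -1.0)
        return state

    memory = np.array([[1., -1., 1., 1., -1., -1., 1., -1.]])
    w = train(memory)
    corrupted = memory[0].copy()
    corrupted[0] *= -1           # flip one unit
    print(recall(w, corrupted))  # recovers the stored memory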

Hinton was not trained as a physicist, and in fact has said that he didn’t pursue physics in school because the math was too hard! Instead, he got a bachelor’s degree in psychology, and a PhD in the then-nascent field of artificial intelligence. In the 1980s, shortly after Hopfield published his network, Hinton proposed a network inspired by a closely related area of physics, one that describes temperature in terms of the statistics of moving particles. His network, called a Boltzmann machine, would be modified and made more efficient over the years, eventually becoming a key part of how artificial neural networks are “trained”.
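In rough terms (glossing over plenty of detail), the physics connection is the Boltzmann distribution. A Boltzmann machine assigns every configuration s of its units an “energy”

    E(s) = -\sum_{i<j} w_{ij}\, s_i s_j - \sum_i b_i s_i

and asks that configuration s occur with probability proportional to e^{-E(s)/T}, the same formula statistical physics uses for a collection of particles at temperature T.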

These people obviously did something impressive. Was it physics?

In 2014, the Nobel prize in physics was awarded to the people who developed blue LEDs. Some of these people were trained as physicists, some weren’t: Wikipedia describes them as engineers. At the time, I argued that this was fine, because these people were doing “something physicists are good at”, studying the properties of a physical system. Ultimately, the thing that ties together different areas of physics is training: physicists are the people who study under other physicists, and go on to collaborate with other physicists. That can evolve in unexpected directions, from more mathematical research to touching on biology and social science…but as long as the work benefits from being linked to physics departments and physics degrees, it makes sense to say it “counts as physics”.

By that logic, we can probably call Hopfield’s work physics. Hinton’s case is less clear: his work was inspired by a physical system, but so are other ideas in computer science, like simulated annealing. Other ideas, like genetic algorithms, are inspired by biological systems: does that mean they count as biology?

Then there’s the question of the Nobel itself. If you want to get a Nobel in physics, it usually isn’t enough to transform the field. Your idea has to actually be tested against nature. Theoretical physics is its own discipline, with several ideas that have had an enormous influence on how people investigate new theories but have never gotten Nobels, because those ideas were not intended, by themselves, to describe the real world. Hopfield networks and Boltzmann machines, similarly, do not exist as physical systems in the real world. They exist as computer simulations, and it is those computer simulations that are useful. But one can simulate many ideas in physics, and that doesn’t tend to be enough by itself to get a Nobel.

Ultimately, though, I don’t think this way of thinking about things is helpful. The Nobel isn’t capable of being “fair”, there’s no objective standard for Nobel-worthiness, and not much reason for there to be. The Nobel doesn’t determine which new research gets funded, nor does it incentivize anyone (except maybe Brian Keating). Instead, I think the best way of thinking about the Nobel these days is a bit like Disney.

When Disney was young, its movies had to stand or fall on their own merits. Now, with so many iconic movies in its history, Disney movies are received in the context of that history. Movies like Frozen or Moana aren’t just trying to be a good movie by themselves, they’re trying to be a Disney movie, with all that entails.

Similarly, when the Nobel was young, it was just another award, trying to reward things that Alfred Nobel might have thought deserved rewarding. Now, though, each Nobel prize is expected to be “Nobel-like”, an analogy between each laureate and the laureates of the past. When new people are given Nobels the committee is on some level consciously telling a story, saying that these people fit into the prize’s history.

This year, the Nobel committee clearly wanted to say something about AI. There is no Nobel prize for computer science, or even a Nobel prize for mathematics. (Hinton already has the Turing award, the most prestigious award in computer science.) So to say something about AI, the Nobel committee gave awards in other fields. In addition to physics, this year’s chemistry award went in part to the people behind AlphaFold2, a machine learning tool to predict what shapes proteins fold into. For both prizes, the committee had a reasonable justification. AlphaFold2 genuinely is an amazing advance in the chemistry of proteins, a research tool like nothing that came before. And the work of Hopfield and Hinton did lead ideas in physics to have an enormous impact on the world, an impact that is worth recognizing. Ultimately, though, whether or not these people should have gotten the Nobel doesn’t depend on that justification. It’s an aesthetic decision, one that (unlike Disney’s baffling decision to make live-action remakes of their most famous movies) doesn’t even need to impress customers. It’s a question of whether the choice is “Nobel-ish” enough, according to the tastes of the Nobel committee. The Nobel is essentially expensive fanfiction of itself.

And honestly? That’s fine. I don’t think there’s anything else they could be doing at this point.

Congratulations to Syukuro Manabe, Klaus Hasselmann, and Giorgio Parisi!

The 2021 Nobel Prize in Physics was announced this week, awarded to Syukuro Manabe and Klaus Hasselmann for climate modeling and Giorgio Parisi for understanding a variety of complex physical systems.

Before this year’s prize was announced, I remember a few “water cooler chats” about who might win. No guess came close, though. The Nobel committee seems to have settled into a strategy of prizes on a loosely linked “basket” of topics, with half the prize going to a prominent theorist and the other half going to two experimental, observational, or (in this case) computational physicists. It’s still unclear why they’re doing this, but regardless, it makes it hard to predict what they’ll do next!

When I read the announcement, my first reaction was, “surely it’s not that Parisi?” Giorgio Parisi is known in my field for the Altarelli-Parisi equations (more properly known as the DGLAP equations, the longer acronym because, as is often the case in physics, the Soviets got there first). These equations are in some sense why the scattering amplitudes I study are ever useful at all. I calculate collisions of individual fundamental particles, like quarks and gluons, but a real particle collider like the LHC collides protons. Protons are messy, interacting combinations of quarks and gluons. When they collide you need not merely the equations describing colliding quarks and gluons, but those that describe their messy dynamics inside the proton, and in particular how those dynamics look different for experiments with different energies. The equation that describes that is the DGLAP equation.
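Schematically (conventions for the coupling vary), the DGLAP equation tracks f_i(x, \mu^2), the distribution of quarks or gluons of type i carrying a fraction x of the proton’s momentum, as the energy scale \mu changes:

    \mu^2 \frac{\partial}{\partial \mu^2} f_i(x, \mu^2) = \sum_j \int_x^1 \frac{dz}{z}\, P_{ij}(z, \alpha_s)\, f_j\!\left(\frac{x}{z}, \mu^2\right)

where the “splitting functions” P_{ij} describe the chance of finding a particle of type i inside one of type j.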

As it turns out, Parisi is known for a lot more than the DGLAP equation. He is best known for his work on “spin glasses”, models of materials where quantum spins try to line up with each other, never quite settling down. He also worked on a variety of other complex systems, including flocks of birds!

I don’t know as much about Manabe and Hasselmann’s work. I’ve only seen a few talks on the details of climate modeling. I’ve seen plenty of talks on other types of computer modeling, though, from people who model stars, galaxies, or black holes. And from those, I can appreciate what Manabe and Hasselmann did. Based on those talks, I recognize the importance of those first one-dimensional models, a single column of air, especially back in the 1960s when computer power was limited. Even more, I recognize how impressive it is for someone to stay on the forefront of that kind of field, upgrading models for forty years to stay relevant into the 2000s, as Manabe did. Those talks also taught me about the challenge of coupling different scales: how small effects in churning fluids can add up and affect the simulation, and how hard it is to model different scales at once. To use these effects to discover which models are reliable, as Hasselmann did, is a major accomplishment.

The Many Worlds of Condensed Matter

Physics is the science of the very big and the very small. We study the smallest scales, the fundamental particles that make up the universe, and the largest, stars on up to the universe as a whole.

We also study the world in between, though.

That’s the domain of condensed matter, the study of solids, liquids, and other medium-sized arrangements of stuff. And while it doesn’t make the news as often, it’s arguably the biggest field in physics today.

(In case you’d like some numbers, the American Physical Society has divisions dedicated to different sub-fields. Condensed Matter Physics is almost twice the size of the next biggest division, Particles & Fields. Add in other sub-fields that focus on medium-sized stuff, like solid state physics, optics, or biophysics, and you get a majority of physicists focused on the middle of the distance scale.)

When I started grad school, I didn’t pay much attention to condensed matter and related fields. Beyond the courses in quantum field theory and string theory, my “breadth” courses were on astrophysics and particle physics. But over and over again, from people in every sub-field, I kept hearing the same recommendation:

“You should take Solid State Physics. It’s a really great course!”

At the time, I never understood why. It was only later, once I had some research under my belt, that I realized:

Condensed matter uses quantum field theory!

The same basic framework, describing the world in terms of rippling quantum fields, doesn’t just work for fundamental particles. It also works for materials. Rather than describing the material in terms of its fundamental parts, condensed matter physicists “zoom out” and talk about overall properties, like sound waves and electric currents, treating them as if they were the particles of quantum field theory.
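The textbook example is sound. Take a chain of atoms of mass m, a distance a apart, connected by spring-like forces of stiffness K. The chain’s vibrations organize into waves with frequency

    \omega(k) = 2\sqrt{K/m}\,\left|\sin\left(\frac{k a}{2}\right)\right|

and quantum mechanics makes each wave come in discrete lumps of energy \hbar\omega(k), called phonons: “particles” of sound that can be calculated with the same tools as the particles of particle physics.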

This tends to confuse the heck out of journalists. Not used to covering condensed matter (and sometimes egged on by hype from the physicists), they mix up the metaphorical particles of these systems with the sort of particles made by the LHC, with predictably dumb results.

Once you get past the clumsy journalism, though, this kind of analogy has a lot of value.

Occasionally, you’ll see an article about string theory providing useful tools for condensed matter. This happens, but it’s less widespread than some of the articles make it out to be: condensed matter is a huge and varied field, and string theory applications tend to be of interest to only a small piece of it.

It doesn’t get talked about much, but the dominant trend is actually in the other direction: increasingly, string theorists need to have at least a basic background in condensed matter.

String theory’s curse/triumph is that it can give rise not just to one quantum field theory, but many: a vast array of different worlds obtained by twisting extra dimensions in different ways. Particle physicists tend to study a fairly small range of such theories, looking for worlds close enough to ours that they still fit the evidence.

Condensed matter, in contrast, creates its own worlds. Pick the right material, take the right slice, and you get quantum field theories of almost any sort you like. While you can’t go to higher dimensions than our usual four, you can certainly look at lower ones, at the behavior of currents on a sheet of metal or atoms arranged in a line. This has led some condensed matter theorists to examine a wide range of quantum field theories with one strange behavior or another, theories that wouldn’t have occurred to particle physicists but that, in many cases, are part of the cornucopia of theories you can get out of string theory.

So if you want to explore the many worlds of string theory, the many worlds of condensed matter offer a useful guide. Increasingly, tools from that community, like integrability and tensor networks, are migrating over to ours.

It’s gotten to the point where I genuinely regret ignoring condensed matter in grad school. Parts of it are ubiquitous enough, and useful enough, that they’re now an expected part of a string theorist’s background. The many worlds of condensed matter, as it turned out, were well worth a look.

Congratulations to Thouless, Haldane, and Kosterlitz!

I’m traveling this week in sunny California, so I don’t have time for a long post, but I thought I should mention that the 2016 Nobel Prize in Physics has been announced. Instead of going to LIGO, as many had expected, it went to David Thouless, Duncan Haldane, and Michael Kosterlitz. LIGO will have to wait for next year.

Thouless, Haldane, and Kosterlitz are condensed matter theorists. While particle physics studies the world at the smallest scales and astrophysics at the largest, condensed matter physics lives in between, explaining the properties of materials on an everyday scale. This can involve inventing new materials, or unusual states of matter, with superconductors being probably the most well-known to the public. Condensed matter gets a lot less press than particle physics, but it’s a much bigger field: overall, the majority of physicists study something under the condensed matter umbrella.

This year’s Nobel isn’t for a single discovery. Rather, it’s for methods developed over the years that introduced topology into condensed matter physics.

Topology often gets described in terms of coffee cups and donuts. In topology, two shapes are the same if you can smoothly change one into another, so a coffee cup and a donut are really the same shape.

Most explanations stop there, which makes it hard to see how topology could be useful for physics. The missing part is that topology studies not just which shapes can smoothly change into each other, but which things, in general, can change smoothly into each other.

That’s important, because in physics most changes are smooth. If two things can’t change smoothly into each other, something special needs to happen to bridge the gap between them.

This can have many different sorts of implications. Topology means that some materials can be described by a number that’s conserved no matter what (smooth) changes occur, leading to experiments that see specific “levels” rather than a continuous range of outcomes. It means that certain physical setups can’t change smoothly into other ones, which protects those setups from changing: an idea people are investigating in the quest to build a quantum computer, where extremely delicate quantum states can be disrupted by even the slightest change.
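The most famous of those “levels” comes from the quantum Hall effect, which is at the heart of Thouless’s prize-winning work. Measure the sideways (Hall) conductance of a thin sheet of electrons in a strong magnetic field, and instead of a continuous range you find plateaus at

    \sigma_{xy} = \nu\, \frac{e^2}{h}, \qquad \nu = 1, 2, 3, \ldots

where e is the electron’s charge and h is Planck’s constant. The integer \nu is exactly one of those topologically conserved numbers, which is why the plateaus are so sharp.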

Overall, topology has been enormously important in physics, and Thouless, Haldane, and Kosterlitz deserve a significant chunk of the credit for bringing it into the spotlight.