William Thomson, Lord Kelvin, was a towering genius of 19th-century physics. He is often quoted as saying,
There is nothing new to be discovered in physics now. All that remains is more and more precise measurement.
As it happens, he never actually said this. It’s a paraphrase of a quote from Albert Michelson, of the Michelson–Morley experiment:
While it is never safe to affirm that the future of Physical Science has no marvels in store even more astonishing than those of the past, it seems probable that most of the grand underlying principles have been firmly established and that further advances are to be sought chiefly in the rigorous application of these principles to all the phenomena which come under our notice. It is here that the science of measurement shows its importance — where quantitative work is more to be desired than qualitative work. An eminent physicist remarked that the future truths of physical science are to be looked for in the sixth place of decimals.
In hindsight, this quote looks pretty silly. When Michelson said that “it seems probable that most of the grand underlying principles have been firmly established” he was leaving out special relativity, general relativity, and quantum mechanics. From our perspective, the grandest underlying principles had yet to be discovered!
And yet, I think we should give Michelson some slack.
Someone asked me on Twitter recently what I would choose if given the opportunity to unravel one of the secrets of the universe. At the time, I went for the wishing-for-more-wishes answer: I’d ask for a procedure to discover all of the other secrets.
I was cheating, to some extent. But I do think that the biggest and most important mystery isn’t black holes or the big bang, isn’t asking what will replace space-time or what determines the constants in the Standard Model. The most critical, most important question in physics, rather, is to find the consequences of the principles we actually know!
We know our world is described fairly well by quantum field theory. We’ve tested it, not just to the sixth decimal place, but to the tenth. And while we suspect it’s not the full story, it should still describe the vast majority of our everyday world.
If we knew not just the underlying principles, but the full consequences of quantum field theory, we’d understand almost everything we care about. But we don’t. Instead, we’re forced to calculate with approximations. When those approximations break down, we fall back on experiment, trying to propose models that describe the data without precisely explaining it. This is true even for something as “simple” as the distribution of quarks inside a proton. Once you start trying to describe materials, or chemistry or biology, all bets are off.
This is what the vast majority of physics is about. Even more, it’s what the vast majority of science is about. And that’s true even back to Michelson’s day. Quantum mechanics and relativity were revelations…but there are still large corners of physics in which neither matters very much, and even larger parts of the more nebulous “physical science”.
New fundamental principles get a lot of press, but you shouldn’t discount the physics of “the sixth place of decimals”. Most of the big mysteries don’t ask us to challenge our fundamental paradigm: rather, they’re challenges to calculate or measure better, to get more precision out of rules we already know. If a genie gave me the solution to any of physics’ mysteries I’d choose to understand the full consequences of quantum field theory, or even of the physics of Michelson’s day, long before I’d look for the answer to a trendy question like quantum gravity.
In a lot of respects we are there again. And, while there are breakthroughs left to make (dark matter, inflation, quantum gravity, a deeper understanding of the whys of the Standard Model, maybe some extremely high-energy physics), the revolutions ahead will probably be less paradigm-shifting than those of GR, SR and QM.
I would argue that it’s likely that the opposite is true. Since the SM, strings, QFT, etc. have been studied for decades with absolutely no progress on anything that matters (such as predicting the mass of the electron), those avenues are likely dead ends. 10,000 PhDs are not wrong; there is nothing more in those directions.
Accuracy – like the oft-quoted 10 digits of QED – means little. Newtonian gravity is also extremely accurate, and also conceptually weak, with instant effects, Euclidean space, etc. Indeed, the history of science shows that new theories don’t topple older ones because they are instantly more accurate or useful. Old theories have huge inertia. This inertia is amplified by the single global village that the internet has created. If one looks at what Newton, Faraday, Maxwell, Einstein, etc. studied – 80% of it was pure bunk. Pure bunk is no longer considered good for an academic career – but what if it is actually required to move forward?
This effect is well known: brainstorming sessions are meant to allow ‘crazy wrong’ ideas – in a good brainstorming session, bad ideas are not to be criticized, as they may in the end show a way forward.
The internet causes dangerous groupthink.
Condensed Matter has been studied for decades, centuries if you count back to Maxwell and the like, and it has also made no progress on predicting the mass of the electron. It’s almost as if people whose fields have nothing to do with predicting the mass of the electron are unlikely to predict the mass of the electron. Crazy!
Tom Andersen is well known for trolling about modern physics in the whole blogosphere.
He can be safely ignored or banned, and by no means should one feed him 😈
Once, long ago, while chasing “decimals”, I noticed that my perturbation series differed from the expansion of the exact solution starting from the third order. It was a big surprise to me. Finally I figured out how one should build the perturbation theory in that problem, and I found the correct expansion (http://arxiv.org/abs/0906.3504 , Chapter 2.1 and Appendix 3). Without having the exact solution, I would have stayed unaware of some dangers of applying the usual perturbation-theory rules. It may be important to QFT, where we compare our numerical values with experiment alone. Worse, we “measure” the fundamental constants by comparing our series with experimental data – but what if our series is somewhat wrong?
Not sure if this is what you’re going for, but I’d definitely say that understanding nonperturbative physics is extremely important. I think of it as part of the “physics of decimals”, at least in this context, since it’s still a consequence of the rules of QFT, but it wouldn’t have been the sort of thing Michelson would have been thinking of.
In my case, the exact solution exists – it is an analytical function with a transcendental equation for the eigenvalues. The eigenfunctions behave differently from the zeroth-order eigenfunctions, so the naive spectral summation gives, roughly speaking, a wrong result. Fortunately, in my problem one can calculate the “action” of the perturbation exactly (but symbolically), and then one can represent it as a spectral sum. This gives the correct formula for the expansion parameter.
In case of QFT we do not know whether the exact solution exists, and if it exists, whether it is physical or not. Renormalization and “fitting” the couplings in the truncated perturbative series to the experimental data may not guarantee that the resulting series is a correct solution to a physical problem.
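The pitfall in this thread can be sketched with a toy model (this is my own illustration, not the problem from the linked paper): take a fake “exact energy” E(g) = √(1 + 2g), whose Taylor series in the coupling g converges only for |g| < 1/2. For small g, adding orders buys you more decimals; past the radius of convergence, adding orders makes the answer *worse*, and without the exact solution nothing in the series itself warns you.

```python
import math

def exact(g):
    # toy "exact solution": E(g) = sqrt(1 + 2g)
    return math.sqrt(1.0 + 2.0 * g)

def series(g, order):
    # truncated Taylor series of sqrt(1 + 2g):
    # sum over n of C(1/2, n) * (2g)^n, using the generalized
    # binomial coefficient C(1/2, n)
    total = 0.0
    for n in range(order + 1):
        c = 1.0
        for k in range(n):
            c *= (0.5 - k) / (k + 1)
        total += c * (2.0 * g) ** n
    return total

# Inside the radius of convergence (g = 0.1), higher order means
# more correct decimals; outside it (g = 0.6), higher order diverges.
for g in (0.1, 0.4, 0.6):
    errors = [abs(series(g, order) - exact(g)) for order in (4, 16, 40)]
    print(f"g = {g}: truncation errors at orders 4, 16, 40 -> {errors}")
```

Of course QFT series are believed to be asymptotic rather than convergent, which is a subtler failure mode than this toy, but the moral is the same: a truncated series can look perfectly healthy while quietly disagreeing with the exact answer.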
Unless you have experiments that reach the fine decimals, you can’t even know whether your theory is flawed. And it does take fine decimals.
For example, in a recent arXiv preprint, it took something like 2 billion proton collisions to gather enough data on a rare decay mode of a bottom lambda hadron for the measured branching fraction to have error bars small enough to meaningfully compare to the Standard Model prediction – even though the Standard Model calculation was comparatively straightforward, and there was nothing particularly special about the decay mode except that it was a rare decay of a very heavy hadron.
A very large subset of all Standard Model predictions simply cannot be tested without incredibly precise and monotonous experimental effort down to the finest decimals.
I agree. But sometimes it is clear that the theory’s formulation is bad. QM is about “predicting” probabilities, and if the probability of something is high (soft-photon emission, for example) but the first Born approximation does not predict it, that means a bad start – a physically bad initial approximation. I explained it here: http://arxiv.org/abs/1110.3702