Apocalypses in math and theory, plus a complexity question
Douglas Adams wrote the watchword for today:
Don’t Panic.
Still, his novel and series The Hitchhiker’s Guide to the Galaxy begins with the impending destruction of the Earth, which goes ahead 5 minutes too soon. The remainder is post-apocalyptic. It is also pre-apocalyptic, because there are multiple Earths that face destruction at different times. At least having multiple days of prophesied doom is something we’ve recently been dealing with.
Today—the last time we can use this word?—we wish to cover real apocalypses in mathematical and scientific theories.
We have already blogged about what would happen to complexity theory if $\mathsf{P} = \mathsf{NP}$ were true and proved. As we said, “Most of the 573 pages of [the] Arora-Barak [textbook] would be gone…” Well this is still hypothetical. Now we will look at cases in the past where whole theories were blown up by a surprise result.
This is different from theories going out of fashion and dying out, even if it was from internal causes. Likewise we don’t consider the fadeouts of past civilizations to be catastrophes, only ones destroyed by things like volcanoes. Ironically, the branch of mathematics called catastrophe theory itself is said to be one of the fadeouts. As mathematical historian David Aubin wrote in “Chapter III: Catastrophes” of his 1998 Princeton PhD thesis:
Catastrophe Theory is dead. Today very, very few scientists identify themselves as ‘catastrophists’; the theory has no institutional basis, department, institute, or journal totally or even partly devoted to it. But do mathematics die?
He goes on to cite an article by Charles Fisher that proclaimed the death of Invariant Theory. To be sure, theories like that sometimes get revived. But first a word about the Mayans and ultimate catastrophe.
Baktun The Future
All the fuss is about today’s ticking over of a Mayan unit of time called a baktun, or more properly b’ak’tun. It’s not even a once-in-5,000-years event like everyone says, but rather once-in-144,000 days, making just over 394 years. The point is there have been 13 of them since the inception of the Mayan creation date according to their “Long Count” calendar, making about 5,125 years in all. So the 14th b’ak’tun starts today—big whoop. The buzz comes from many Mayan inscriptions seeming to max out at 13, but others go as far as 19 and it is known that they counted by 20. Hence the real epoch will be when the 20th and final baktun ticks over to initiate the next piktun. That will be on October 13, 4772. If human civilization lasts that long, that is.
This still has us thinking, what if Earth really were suddenly blown up by Vogons or by Vegans or by a space rock a little bigger than last week’s? What would be left? Anything? The reason is that according to a recently-agreed principle in fundamental physical theory, the answer should be everything.
The principle, as enunciated in small capitals by popular science author Charles Seife in his 2007 book Decoding the Universe, states:
Information can be neither created nor destroyed.
As we mentioned last March, the agreement was symbolized by Stephen Hawking conceding a bet to John Preskill, who has graced these pages. Hawking underscored the point by making it a main part of the plot of a children’s novel written with his daughter Lucy. The father falls into a black hole, but is resurrected by a computer able to piece back all the information because it was all recoverable. At least in theory.
Hence even if Earth really is swallowed up later today, or if we disappear—leaving all our stored information and literary artefacts to decay within a 50,000-year time-span estimated for The History Channel’s series Life After People—all the information would in principle still exist. Is this comforting?
Perhaps not. It could be that while all natural processes conserve information, the more violent ones might embody the computation of a one-way function. It then becomes an issue of complexity theory whether the output of that function could be reverted to its pre-apocalyptic state.
Apocalypses In Math
Here are a few examples from mathematics of “extinction” events: usually the extinction was of a theory or whole approach to math.
Bertrand Russell and Gottlob Frege:
Frege was just finishing his tome on logic when the letter from Russell arrived showing that Frege’s system was inconsistent. The letter basically noticed that the set

$$R = \{\, x \mid x \notin x \,\}$$

was not well-defined. This destroyed the whole book that Frege had worked so hard on for years. Frege’s reaction was recorded in his revised preface:
A scientist can hardly meet with anything more undesirable than to have the foundations give way just as the work is finished. I was put in this position by a letter from Mr. Bertrand Russell when the work was nearly through the press.
A study in being low-key as we might say today.
David Hilbert and Paul Gordan:
Gordan was known as “the king of invariant theory.” His most famous result is that the ring of invariants of binary forms of fixed degree is finitely generated. Hilbert proved his famous theorem that replaced “binary” by forms in any number of variables, replacing horribly complex arguments with a beautiful existence proof. To quote Wikipedia:
[This] almost put an end to classical invariant theory for several decades, though the classical epoch in the subject continued to the final publications of Alfred Young, more than 50 years later.
Gordan was less low-key than Frege, since his comment on Hilbert’s brilliant work was:
“This is not mathematics; this is theology.”
Oh well.
Kurt Gödel and David Hilbert:
Hilbert, again, wanted to create a formal foundation of all mathematics based on an axiomatic approach. He had already done this for geometry in his famous work of 1899. Now, Euclid did have an axiomatic system thousands of years earlier, but it was not really formal. Some proofs relied on looking at diagrams and other “obvious” facts, so Hilbert added extra notions that made geometry based on a complete system. For example, Hilbert added the notion of betweenness of three points: point $B$ is between $A$ and $C$.
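For instance, the first of Hilbert’s order axioms can be paraphrased in modern notation as:

$$\text{If } B \text{ lies between } A \text{ and } C, \text{ then } A, B, C \text{ are three distinct points of a line, and } B \text{ also lies between } C \text{ and } A.$$

Euclid never states anything like this; it is exactly the kind of “obvious from the diagram” fact that a fully formal system must make explicit.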
Of course Gödel proved via his famous Incompleteness Theorem that what Hilbert could do for geometry was impossible to do for number theory.
Stephen Kleene and Barkley Rosser:
Once Alonzo Church’s lambda calculus and Haskell Curry’s combinators were discovered in the 1930s, it seemed natural to build systems of logic around them. That was the original intent of both Curry and Church. It was therefore a shock when Kleene and Rosser, as students of Church, showed at a stroke that they were inconsistent. The reason is that the theories’ standard of “well-defined” claimed too extensive a reach, as with Frege’s formalization of the notion of “set.” It essentially allowed defining an exhaustive countable list of well-defined real numbers, for which the Cantor diagonal number was well-defined within the system, a contradiction. Ken likens this paradox phenomenon to the collapse of the Tower of Babel.
Riemann’s Non-Conjecture Refuted:
At the very end of his famous 1859 paper which included the Riemann Hypothesis, Bernhard Riemann made a carefully-worded statement about the relationship between the prime-counting function $\pi(x)$ and the logarithmic integrals $\mathrm{li}(x)$ and $\mathrm{Li}(x)$:
Indeed, in the comparison of $\mathrm{Li}(x)$ with the number of prime numbers less than $x$, undertaken by Gauss and Goldschmidt and carried through up to $x =$ three million, this number has shown itself out to be, in the first hundred thousand, always less than $\mathrm{Li}(x)$; in fact the difference grows, with many fluctuations, gradually with $x$.
Further calculations were consistent with the inequality $\pi(x) < \mathrm{Li}(x)$ holding in general, until in 1914, John Littlewood refuted this not just once, but infinitely often. That is, he did not find a counterexample by computation, but rather proved that $\pi(x) - \mathrm{Li}(x)$ must change sign infinitely often. In fact, the first number $x$ giving a sign flip is still unknown, though it must be below $1.4 \times 10^{316}$.
Although this is included on Wikipedia’s short list of disproved mathematical ideas, its significance is not the inequality hypothesis itself, but the fallible nature of numerical evidence. Michael Rubinstein and Peter Sarnak showed an opposite surprise: the set of integers $x$ giving $\pi(x) > \mathrm{Li}(x)$ has non-vanishing logarithmic density, in fact about 0.00000026, so it is disturbing that no such $x$ is within the current range of calculation.
Mertens Conjecture Refuted:
The conjecture, which was first made in 1885 by Thomas Stieltjes, not Franz Mertens, states that the sum of the first $n$ values of the Möbius function has absolute value at most $\sqrt{n}$. That is,

$$|M(n)| = \left|\sum_{k=1}^{n} \mu(k)\right| \le \sqrt{n}.$$
Despite the fact that all computer calculations still support this, Andrew Odlyzko and Herman te Riele disproved it theoretically in 1985. At least it has an exponentially bigger leeway than the previous one: the best known upper bound on a bad $n$ is currently

$$e^{1.59 \times 10^{40}}.$$
Moreover the following weaker statement, which Stieltjes thought he had proved, is still open:

$$M(n) = O(\sqrt{n}).$$
The reason this is portentous is that the following slight further weakening,

$$M(n) = O(n^{1/2 + \epsilon}) \text{ for every } \epsilon > 0,$$
is actually equivalent to the Riemann Hypothesis.
The failure of Riemann would have a definite apocalyptic effect: it would wipe away all the many papers that assume it. It is not clear whether those papers could even be saved by the kind of “relativization” we have in complexity theory, whereby results obtained assuming $\mathsf{P} \neq \mathsf{NP}$ and so on may still be valid relative to oracle languages $A$ such that $\mathsf{P}^A \neq \mathsf{NP}^A$.
Our Scientific Neighbors’ Houses
Still, the loss of papers assuming Riemann would be nothing compared to what would happen in physics if supersymmetry were disproved, as its failure could take all of string theory down with it. The Standard Model of particle physics seems also to have survived problems the absence of the Higgs Boson would have caused, although issues with the Higgs are still causing apocalyptic reactions from some physicists. At least news today is that other bosons are behaving well according to Scott Aaronson and Alexander Arkhipov’s boson-sampling protocol, which is related to our kind of hierarchy collapse.
Perhaps we in computer science theory and mathematics are fortunate to experience less peril. Even so, we are left with this quotation attributed to Hilbert by Howard Eves:
One can measure the importance of a scientific work by the number of earlier publications rendered superfluous by it.
Open Problems
Do you have other favorite examples of results in the mathematical and general sciences that caused the collapse of theories?
Does Nature compute complexity-theoretic one-way functions?
[fixed Seife quote, minor tweaks]