These are some notes I wrote a couple of years ago on Judith Grabiner’s paper ‘Is Mathematical Truth Time Dependent?’ David Chapman suggested I put them up somewhere public, so here they are with a few tweaks and comments. They’re still *notes*, though – don’t expect proper sentences everywhere! I’m not personally hugely interested in the framing question about mathematical truth, but I really enjoyed the main part of the paper, which compares the ‘if it works, do it’ culture of eighteenth-century mathematics to the focus on rigour that came later.

I haven’t read all that much history of mathematics, so I don’t have a lot of context to put this in. If something looks off or oversimplified let me know.

I found this essay in an anthology called *New Directions in the Philosophy of Mathematics*, edited by Thomas Tymoczko. I picked this book up more or less by luck when I was a PhD student and a professor was having a clearout, and I didn’t have high hopes – nothing I’d previously read about philosophy of mathematics had made much sense to me. Platonism, logicism, formalism and the rest all seemed equally bad, and I wasn’t too interested in formal logic and foundations. However, this book promised something different:

The origin of this book was a seminar in the philosophy of mathematics held at Smith College during the summer of 1979. An informal group of mathematicians, philosophers and logicians met regularly to discuss common concerns about the nature of mathematics. Our meetings were alternately frustrating and stimulating. We were frustrated by the inability of traditional philosophical formulations to articulate the actual experience of mathematicians. We did not want yet another restatement of the merits and vicissitudes of the various foundational programs – platonism, logicism, formalism and intuitionism. However, we were also frustrated by the difficulty of articulating a viable alternative to foundationalism, a new approach that would speak to mathematicians and philosophers about their common concerns. Our meetings were most exciting when we managed to glimpse an alternative.

There’s plenty of other good stuff in the book, including some famous pieces like Thurston’s classic *On proof and progress in mathematics* and a couple of reprinted sections of Lakatos’s *Proofs and Refutations*.

Anyway, here are the notes. Anything I’ve put in quotes is Grabiner. Anything in square brackets is some random tangent I’ve gone off on.

Two “generalizations about the way many eighteenth-century mathematicians worked”:

- “… the primary emphasis was on getting results”. Huge explosion in creativity, but “the chances are good that these results were originally obtained in ways utterly different from the ways we prove them today. It is doubtful that Euler and his contemporaries would have been able to derive their results if they had been burdened with our standards of rigor”.
- “… mathematicians placed great reliance on the power of symbols. Sometimes it seems to have been assumed that if one could just write down something which was symbolically coherent, the truth of the statement was guaranteed.” This extended to e.g. manipulating infinite power series just like very long polynomials.

Euler’s Taylor expansion of the exponential function, starting from the binomial expansion, as one example. He takes an increment ω as infinitely small and the exponent N = x/ω as infinitely large, and is happy to assume their product Nω = x is finite without worrying too much. “The modern reader may be left slightly breathless”, but he gets the right answer.
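Roughly, the flavour of the argument is as follows (my own reconstruction in modern notation, not Grabiner’s exact presentation – the symbols ω and N are my choice):

```latex
% Euler-style derivation of the exponential series: take \omega infinitely
% small, N = x/\omega infinitely large, with N\omega = x finite.
\begin{align*}
e^x = (e^{\omega})^{x/\omega}
  &= (1+\omega)^N
     && \text{since } e^{\omega} = 1+\omega \text{ for infinitesimal } \omega \\
  &= \sum_{n \ge 0} \frac{N(N-1)\cdots(N-n+1)}{n!}\,\omega^n
     && \text{binomial expansion} \\
  &= \sum_{n \ge 0} \frac{(N\omega)\,((N-1)\omega)\cdots((N-n+1)\omega)}{n!}
     && \text{distributing the } n \text{ factors of } \omega \\
  &= \sum_{n \ge 0} \frac{x^n}{n!}
     && \text{treating each } (N-j)\omega \text{ as } N\omega = x.
\end{align*}
```

The last step is exactly the unworried move: for finite j, the difference between (N−j)ω and Nω is the infinitesimal jω, so it is simply discarded.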

Trust in symbol manipulation was “somewhat anomalous in the history of mathematics”. Grabiner suggests it came from the recent success of algebra and the calculus. E.g. Leibniz’s notation, which “does the thinking for us” (chain rule as example). This also extended out of maths, e.g. Lavoisier’s idea of ‘chemical algebra’.
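The chain rule is a good illustration of notation “doing the thinking”: written in Leibniz’s style it looks like the du’s just cancel, even though nothing is literally being divided:

```latex
% Chain rule in Leibniz notation, for y a function of u and u a function of x:
\frac{dy}{dx} = \frac{dy}{du} \cdot \frac{du}{dx}
```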

18th c *was* interested in foundations (e.g. Berkeley on calculus being insufficiently rigorous) but this was “not the basic concern” and generally was “relegated to Chapter I of textbooks, or found in popularizations”, not in research papers.

This changed in the 19th c beginning with Cauchy and Bolzano – beginnings of rigorous treatments of limits, continuity etc.

## Why did standards change?

“The first explanation which may occur to us is like the one we use to justify rigor to our students today: the calculus was made rigorous to avoid errors, and to correct errors already made.” Doesn’t really hold up – there were surprisingly few mistakes in the 18th c stuff as they “had an almost unerring intuition”.

[I’ve been meaning to look into this for a while, as I get sick of that particular justification being trotted out, always with the same dubious examples. One of these is Weierstrass’s everywhere-continuous, nowhere-differentiable function. This is a genuine example of something the less rigorous approach failed to find, but it came much later, so isn’t what got people started on rigour.

The other example normally given is about something called “the Italian school of algebraic geometry”, which apparently went off the rails in the early 20th c and published false stuff. There’s some information on that in the answers to a MathOverflow question by Kevin Buzzard and the linked email from David Mumford – from a quick read it looks like it was one guy, Severi, who really lost it. Anyway, this is also a lot later than the 18th century.]
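For concreteness, the Weierstrass function mentioned above is W(x) = Σₙ aⁿ cos(bⁿπx): continuous everywhere but differentiable nowhere for suitable a and b. A minimal sketch of a truncated-series approximation – the parameters a = 0.5 and b = 13 are my choice, picked to satisfy Weierstrass’s original condition (0 < a < 1, b an odd integer, ab > 1 + 3π/2):

```python
import math

def weierstrass(x, a=0.5, b=13, terms=30):
    """Partial sum of the Weierstrass function W(x) = sum_n a^n cos(b^n pi x).

    With 0 < a < 1 each term is bounded by a^n, so the series converges
    uniformly and the truncation error is at most a^terms / (1 - a).
    """
    return sum(a**n * math.cos(b**n * math.pi * x) for n in range(terms))

# At x = 0 every cosine is 1, so the series sums to 1/(1 - a) = 2.
```

Each partial sum is a perfectly smooth finite sum of cosines; the pathology only appears in the limit, which is exactly why intuition built on finite manipulations missed it.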

It is true though that by the end of the 18th c they were getting into topics – complex functions, multivariable calculus – where “there are many plausible conjectures whose truth is relatively difficult to evaluate intuitively”, so rigour was more useful.

Second possible explanation – need to unify the mass of results thrown up in the 18th c. Probably some truth to this: current methods were hitting diminishing returns, time to “sit back and reflect”.

Third explanation – prior existence of rigour in Euclid’s geometry. Berkeley’s attack on calculus was on this line.

One other interesting factor she suggests – an increasing need for mathematicians to teach (as they became employees of government-sponsored institutions rather than being attached to royal courts). École Polytechnique as model for these.

“Teaching always makes the teacher think carefully about the basis for the subject”. Moving from self-educated or apprentice-master set-ups, where you learn piecemeal from examples of successful thinking, to a more formalised ‘here are the foundations’ approach.

Her evidence – origins of foundational work often emerged from lecture courses. This was true for Lagrange, Cauchy, Weierstrass and Dedekind.

[I don’t know how strong this evidence is, but it’s a really interesting theory. I’ve had literalbanana’s blog post on indexicality thoroughly stuck in my head for the last month, so I’m seeing that idea everywhere – this is one example. Teaching a large class forces you to take knowledge that was previously highly situated and indexical – ‘oh yes, you need to do this’ – and pull it out into a public form that makes sense to people not deeply immersed in that context. Compare Thurston’s quote in *On proof and progress in mathematics*: “When a significant theorem is proved, it often (but not always) happens that the solution can be communicated in a matter of minutes from one person to another within the subfield. The same proof would be communicated and generally understood in an hour talk to members of the subfield. It would be the subject of a 15- or 20-page paper, which could be read and understood in a few hours or perhaps days by members of the subfield.”]

## How did standards change?

Often 18th c methods were repurposed/generalised. E.g. haphazard comparisons of particular series to the geometric series became Cauchy’s general convergence tests. Or old methods of computing the error term epsilon for the nth approximation get turned round, so that we are *given* epsilon and show we can always find n to beat that error term. This is essentially the definition of convergence we still use today.
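In modern notation, the turned-around version is the familiar definition (my paraphrase, not Grabiner’s wording): a sequence of approximations sₙ converges to s when, *given* any ε, we can find an N beating that error term from then on:

```latex
% Modern definition of convergence: epsilon is given first,
% then we must produce an N that beats it.
s_n \to s
\quad\Longleftrightarrow\quad
\forall \varepsilon > 0 \;\, \exists N \;\, \forall n \ge N :\; |s_n - s| < \varepsilon.
```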

## Conclusion

Goes back to original question: is mathematical truth time-dependent? Sets up two bad options to knock down…

- Relativism. “‘Sufficient unto the day is the rigor thereof.’ Mathematical truth is just what the editors of the *Transactions* say it is.” This wouldn’t explain why Cauchy and Weierstrass were ever unsatisfied in the first place.
- MAXIMAL RIGOUR AT ALL TIMES. The 18th c was just sloppy. “According to this high standard, which textbooks sometimes urge on students, Euler would never have written a line.”

[A lot of my grumpiness about rigour is because it was exactly what I *didn’t* need as a first year maths student. I was just exploring the 18th century explosion myself and discovering the power of mathematics, and what I needed right then was to be able to run with that and learn *more cool shit*, without fussing over precise epsilon-delta definitions. Maybe it would have worked for me a couple of years later, if I’d seen enough examples to have come across a situation where rigour was useful. This seems to vary a lot though – David Chapman replied that *lack of rigour* was what he was annoyed by at that age, and he was driven to the library to read about Dedekind cuts.]

… then suggests “a third possibility”:

- A Kuhnian picture where mathematics grows “not only by successive increments, but also by occasional revolutions”. “We can be consoled that most of the old bricks will find places somewhere in the new structure”.
