Into the Kiln: Insights from Tao's 'Analysis I'
post by TurnTrout · 2018-06-01T18:16:32.616Z · LW · GW · 8 comments
Note: real analysis is not on the MIRI reading list (although I think it should be).
Foreword
As a young boy, mathematics captivated me.
In elementary school, I'd happily while away entire weekends working through the next grade's math book. I was impatient.
In middle school, I'd lazily estimate angles of incidence that would result if I shot lasers from my eyes, tracing their trajectories within the classroom and out down the hallway. I was restless.
In high school, I'd daydream about what would happen to integrals as I twisted functions in my mind. I was curious.
And now, I get to see how it's all put together. Imagine being fascinated by some thing, continually glimpsing beautiful new facets and sampling exotic flavors, yet being resigned to not truly pursuing this passion. After all, I chose to earn a computer science degree.
Wait.
Analysis I
As in Linear Algebra Done Right, I completed every single exercise in the book - this time, without looking up any solutions (although I did occasionally ask questions on Discord). Instead, I came back to problems if I couldn't solve them after half an hour of effort.
A sampling of my proofs can be found here.
1: Introduction
2: The Natural Numbers
In which the Peano axioms are introduced, allowing us to define addition and multiplication on the natural numbers $\mathbb{N}$.
3: Set Theory
In which functions and Cartesian products are defined, among other concepts.
Recursive Nesting
How can you apply the axiom of foundation if sets are nested in each other? That is, how can the axiom of foundation "reach into" sets that contain one another as elements?
Show that if $A$ and $B$ are two sets, then either $A \notin B$ or $B \notin A$ (or both).
Proof. Suppose $A \in B$ and $B \in A$. By the pairing axiom, we know that there exists a set $\{A, B\}$. We see that there does not exist an $x \in \{A, B\}$ such that $x \cap \{A, B\} = \emptyset$. That is, if we choose $x = A$, one of its elements is $B$, which is also an element of $\{A, B\}$ - this violates the axiom of foundation. The same reasoning applies if we choose $x = B$. Then we have a contradiction, so either $A$ or $B$ (or both) is not an element of the other.
4: Integers and Rationals
In which the titular sets are constructed, allowing the exploration of absolute value, exponentiation, and the incompleteness (colloquially, the "gaps") of the rationals.
Readers newer to mathematics may find it interesting that even though there are (countably) infinitely many rational numbers between any two distinct rationals, the rationals still contain gaps.
5: The Real Numbers
In which Cauchy sequences allow us to formally construct the reals.
6: Limits of Sequences
In which we meet convergence and its lovely limit laws, extend the reals to cover infinities, experience the delightfully-counterintuitive $\limsup$ and $\liminf$, and complete our definition of real exponentiation.
Upper-Bounded Monotonic Sequence Convergence
I tried to come up with a clever title here - I really did. Apparently even my punmaking abilities are bounded.
Suppose you have a monotonically increasing sequence $(a_n)_{n=0}^{\infty}$ with an upper bound $M$. Then the sequence converges; this also applies to lower-bounded monotonic decreasing sequences.
Weird, right? I mean, even though the sequence monotonically increases and there's an upper bound, there are still uncountably infinitely many "places" the sequence can "choose" to go. So what gives?
Proof. Let $L$ be a real number and $\varepsilon > 0$. Suppose that the sequence is not eventually $\varepsilon$-close to $L$. Let $N$ be such that for all $n \geq N$, either $a_n > L + \varepsilon$ or $a_n < L - \varepsilon$; we know that $N$ exists because the sequence is monotone increasing. By the Archimedean principle, there exists some $k$ such that $k\varepsilon > M - a_0$.
Since the sequence is monotone increasing, by repeating the above argument $k$ times in the first case, we have that $a_n > M$ for some $n$, which is contradictory. By repeating the argument $k$ times in the second case, we have $a_n < a_0$ for some $n$, which contradicts the fact that the sequence is monotone increasing. Then for any $\varepsilon > 0$, the sequence must be eventually $\varepsilon$-close to some $L_\varepsilon$. Intuitively, for any given $\varepsilon$, the sequence can only "escape" a limit a finite number of times before it runs out of room and has to be $\varepsilon$-close.
Next, we show that the $L_\varepsilon$'s form a Cauchy sequence. Let $\varepsilon > 0$, and set $\varepsilon_1, \varepsilon_2 > 0$ such that $\varepsilon_1 + \varepsilon_2 \leq \varepsilon$. The sequence is eventually $\varepsilon_1$-close to $L_{\varepsilon_1}$, so there exists an $N_1$ such that for all $n \geq N_1$ we have $|a_n - L_{\varepsilon_1}| \leq \varepsilon_1$. Similar arguments hold for $L_{\varepsilon_2}$. Set $N = \max(N_1, N_2)$; now $|L_{\varepsilon_1} - L_{\varepsilon_2}| \leq |L_{\varepsilon_1} - a_N| + |a_N - L_{\varepsilon_2}| \leq \varepsilon_1 + \varepsilon_2 \leq \varepsilon$. But $\varepsilon$ is arbitrary, so we can easily see that the sequence is Cauchy.
As the real numbers are complete, this sequence converges to some $L$. Since the main sequence is eventually $\varepsilon$-close to $L_\varepsilon$, and $L_\varepsilon$ converges to $L$, by the triangle inequality we have that the main sequence converges to $L$.
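To make the theorem concrete, here is a tiny numerical sketch (an illustration of mine, not from the book): the increasing sequence $a_n = 1 - 2^{-n}$ is bounded above by $1$, and its terms creep up toward that bound.

```python
# A small numerical illustration of the monotone convergence theorem:
# a_n = 1 - 2^(-n) is monotonically increasing and bounded above by 1,
# and its terms approach that least upper bound.
for n in [1, 2, 5, 10, 20, 50]:
    a_n = 1 - 2 ** (-n)
    print(f"a_{n} = {a_n:.15f}")  # creeps upward toward 1
```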
7: Series
In which we finally reach concepts from pre-calculus.
Condensation
The Cauchy condensation test says that for a sequence $(a_n)_{n=1}^{\infty}$ where $a_n \geq 0$ and $a_{n+1} \leq a_n$ for all $n$, the series $\sum_{n=1}^{\infty} a_n$ converges iff the series $\sum_{k=0}^{\infty} 2^k a_{2^k}$ converges. Using this result, we have that the harmonic series $\sum_{n=1}^{\infty} \frac{1}{n}$ diverges, since its condensed series is $\sum_{k=0}^{\infty} 2^k \cdot \frac{1}{2^k} = \sum_{k=0}^{\infty} 1$, which clearly diverges.
What was initially counterintuitive is that even though $\lim_{n \to \infty} \frac{1}{n} = 0$, the series doesn't converge. The best intuition I've come up with is that the harmonic series doesn't "deplete" its domain quickly enough, so you can get arbitrarily large partial sums.
If you want proofs, here are twenty!
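If you'd rather watch the divergence happen (very slowly), here is a quick numerical sketch of my own - the partial sums grow roughly like $\ln n$, so they eventually pass any bound, just at a glacial pace:

```python
import math

# Partial sums of the harmonic series grow roughly like ln(n), so they
# exceed any fixed bound eventually -- just very, very slowly.
partial_sum = 0.0
checkpoints = {10, 100, 1_000, 10_000, 100_000, 1_000_000}
for n in range(1, 1_000_001):
    partial_sum += 1.0 / n
    if n in checkpoints:
        print(f"n = {n:>9,}: partial sum = {partial_sum:.4f}, ln(n) = {math.log(n):.4f}")
```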
8: Infinite Sets
In which uncountable sets, the axiom of choice, and ordered sets brighten our lives.
9: Continuous Functions on $\mathbb{R}$
In which continuity, the maximum principle, and the intermediate value theorem make their debut.
Lipschitz Continuity ⇎ Uniform Continuity
If a function ($f: X \to \mathbb{R}$, for some $X \subseteq \mathbb{R}$) is Lipschitz-continuous for some Lipschitz constant $M > 0$, then by definition we have that for every $x, y \in X$, $|f(x) - f(y)| \leq M|x - y|$.
The definition of uniform continuity is
For every $\varepsilon > 0$, there exists a $\delta > 0$ such that for all $x, y \in X$ such that $|x - y| < \delta$, $|f(x) - f(y)| < \varepsilon$.
Lipschitz continuity implies uniform continuity (do you see why?), but the converse is not true. I mean, what kind of twisted person would come up with the kind of function you need as a counterexample?
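For a concrete feel for the gap between the two notions, here is a small sketch of my own using the standard easy example $f(x) = \sqrt{x}$ on $[0, 1]$: it is uniformly continuous, yet its difference quotients blow up near $0$, so no single Lipschitz constant $M$ can work.

```python
import math

# f(x) = sqrt(x) on [0, 1] is uniformly continuous but not Lipschitz:
# the difference quotient |f(x) - f(0)| / |x - 0| = 1 / sqrt(x) is unbounded near 0.
for x in [1e-2, 1e-4, 1e-6, 1e-8]:
    quotient = abs(math.sqrt(x) - math.sqrt(0.0)) / abs(x - 0.0)
    print(f"x = {x:.0e}: difference quotient = {quotient:,.0f}")

# Uniform continuity still holds: |sqrt(x) - sqrt(y)| <= sqrt(|x - y|),
# so delta = eps**2 works for every eps, independently of where x and y sit.
```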
10: Differentiation of Functions
In which the basic rules of differential calculus are proven.
You know, I actually thought that I wouldn't have too much to explain in this post - the book went very smoothly up to this point. On the upside, we get to spend even more time together!
Differential Intuitions
Let me simply direct you to this excellent StackExchange answer.
pǝɹᴉʌɐʇᴉʌǝs
We can understand $(f^{-1})'$ by simply thinking about $\frac{1}{f'}$ (i.e., $\frac{dx}{dy} = \frac{1}{dy/dx}$), which makes sense for the derivative of the inverse!
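As a quick sanity check (an example of my own, not the book's), take $f(x) = x^3$ on $(0, \infty)$, so that $f^{-1}(y) = y^{1/3}$:

$$(f^{-1})'(y) = \frac{1}{f'(f^{-1}(y))} = \frac{1}{3(y^{1/3})^2} = \frac{1}{3}y^{-2/3},$$

which agrees with differentiating $y^{1/3}$ directly - and highlights that $f'$ gets evaluated at $f^{-1}(y)$, not at $y$.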
L'Hôpital's Rule
Consider $f, g: [a, b] \to \mathbb{R}$ differentiable on $[a, b]$ (for real numbers $a < b$). Then if $g'(x) \neq 0$ for $x \in [a, b]$, $f(a) = g(a) = 0$, and $\lim_{x \to a;\, x \in (a, b]} \frac{f'(x)}{g'(x)} = L$, we have that $g(x) \neq 0$ for $x \in (a, b]$ and $\lim_{x \to a;\, x \in (a, b]} \frac{f(x)}{g(x)} = L$.
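For instance (a quick example of my own, not one of the book's), take $f(x) = \sin x$ and $g(x) = x$ on $[0, 1]$: then $f(0) = g(0) = 0$ and $g'(x) = 1 \neq 0$, so

$$\lim_{x \to 0;\, x \in (0, 1]} \frac{\sin x}{x} = \lim_{x \to 0;\, x \in (0, 1]} \frac{\cos x}{1} = 1.$$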
As a neat exercise, let's see how this rule breaks if we violate preconditions:
- If $f(a) \neq 0$ or $g(a) \neq 0$, then the ratio $\frac{f(x)}{g(x)}$ is "messed up" and not necessarily indicative of the functions' slopes as $a$ is approached.
- If $f$ or $g$ is not differentiable on $[a, b]$, then perhaps the limit does not exist.
- No, but really - you would use L'Hôpital's rule to analytically determine that the limit in question ($\lim_{x \to 0} \frac{\ln(1 - x) - \sin x}{1 - \cos^2 x}$) does not exist.
- If $g'(x) = 0$ for some $x \in [a, b]$, then we have division by zero (unless $x = a$, in which case we find more twisted counterexamples which necessitate the closure of this interval).
11: The Riemann Integral
In which partitions and piecewise constant functions help us define the Riemann integral, which later leads to the Riemann-Stieltjes integral and the fundamental theorems of calculus.
Having taken care of the exposition, we arrive at the Rivendell of real analysis, preparing ourselves for the arduous journey to Mt. Lebesgue.
Pointless Integration
There is zero area under a point (or even under infinitely many points) due to how we define length, which in turn allows us to build from integrals of piecewise constant functions to the real deal.
Infinite Partitions?
The upper and lower Riemann integrals can be defined as the infimum and supremum of the upper and lower Riemann sums, respectively. It is important to note that even though, for many functions, further refinement of the partition always gets you closer to the extremum, the result is not an "infinite" partition (which is not defined according to this construction).
Consider the curried function $P \mapsto U(f, P)$, which takes a partition $P$ and computes its corresponding upper Riemann sum with respect to the predefined function $f$ (likewise for lower sums). Then clearly this function is monotonic with respect to the refinement of the partition; the extremum is not necessarily achieved by any given partition in the refinement sequence, but rather is the closest bound on what you can get with any partition.
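Here is a small numerical sketch of my own making this concrete: the upper and lower Riemann sums of $f(x) = x^2$ on $[0, 1]$ under uniform partitions tighten toward the integral $\frac{1}{3}$ as the partition is refined, but no finite partition ever achieves it exactly.

```python
import numpy as np

# Upper and lower Riemann sums of f(x) = x^2 on [0, 1] for uniform partitions.
# Refinement squeezes both bounds toward the integral 1/3, but never reaches it.
f = lambda x: x ** 2

for n in [4, 16, 64, 256, 1024]:
    edges = np.linspace(0.0, 1.0, n + 1)
    widths = np.diff(edges)
    # f is increasing on [0, 1], so its sup/inf on each subinterval sit at the endpoints.
    upper = np.sum(f(edges[1:]) * widths)
    lower = np.sum(f(edges[:-1]) * widths)
    print(f"n = {n:5d}: lower = {lower:.6f}, upper = {upper:.6f}")
```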
Riemann-Stieltjes Confusion
The book doesn't lay it out cleanly, so I will: the Riemann-Stieltjes integral $\int_{[a,b]} f \, d\alpha$ allows us to use a custom length function $\alpha$ to weight different parts of the domain differently. I recommend working through a simple case in your head: how do the piecewise constant Riemann-Stieltjes integrals of majorizing and minorizing functions change as you iteratively refine the coarsest partition possible?
This is particularly useful in defining expectation in statistics: $\mathbb{E}[X] = \int_{-\infty}^{\infty} x \, dF_X(x)$, where $F_X$ is the cumulative distribution function of $X$.
The Riemann integral is recovered as the special case where $\alpha(x) = x$.
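As a rough numerical sketch (mine, not the book's), you can approximate such an expectation directly from its Stieltjes sums, weighting each subinterval by how much the CDF increases across it. Here the CDF is that of an exponential distribution with rate $1$, whose expectation is $1$:

```python
import numpy as np

# Approximate E[X] = ∫ x dF(x) by the Stieltjes sum  Σ x_i * (F(x_{i+1}) - F(x_i)).
# F is the CDF of an Exponential(1) random variable, so the true expectation is 1.
F = lambda x: 1.0 - np.exp(-x)

edges = np.linspace(0.0, 50.0, 200_001)  # truncate the infinite tail at 50
weights = np.diff(F(edges))              # the "alpha-length" of each subinterval
stieltjes_sum = np.sum(edges[:-1] * weights)
print(stieltjes_sum)                     # ≈ 1.0
```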
Final Verdict
Terence Tao is both an incredible mathematician and writer, and it shows. There simply weren't many things which confused me, and that says more about his writing than it does about me. The exercises are appropriate and hew closely to each chapter's content; often, the reader proves key results.
My only complaint is that results are frequently referred to as "from Proposition 3.6.4 and Theorem 3.6.12" many chapters later, forcing the reader to infer the referents or backtrack all of the way. In a sense, this may be helpful - you don't want to backtrack a billion pages, so you try to fill in the blanks. In another sense, it's annoying.
In my opinion, Analysis I belongs near the very beginning of the research guide. It's a wonderful introduction to proof-based mathematics, with a helpful appendix clearing up concepts I had previously only picked up via osmosis. Additionally, I met with a deep learning professor at my university, asking them "if I want to be able to understand and potentially make progress on some of the fundamental issues in machine learning, do I need to know real analysis?". Their answer: a definitive yes.
Forwards
Next up is Analysis II, which works from metric spaces all the way up to the Lebesgue integral. Additionally, I'm going to start working through Sutton and Barto's Reinforcement Learning: An Introduction to increase the rigor with which I think about RL and in preparation for my work this summer.
I also think I need to run through some applied Calculus to refresh my reflexes (so I don't have to rederive every identity that I can't remember).
Tips
- With respect to my gripe about result numbering, writing down both the results and their numbers in your notes may save you some headaches.
- Appendix A is extremely useful for those new to proofs.
- When reading textbooks, your priors should be towards your being wrong - following this intuition will allow you to unlock new abilities, rather than glossing over your (probable) incorrectness and having it blow up in your face later.
Marginal Attention
In the last pages of CFAR's participant handbook is an entry on marginal attention. Essentially, each bit of extra attention contributes more than the last. If you're totally dialed in on a task and get slightly distracted, that is far more disastrous than getting slightly more distracted while your attention is already somewhat unfocused.
I often simply left my phone at home and accompanied my Kindle to an empty classroom. This worked wonderfully; I suspect it more than doubled my hourly learning efficiency.
Proving Myself
Just over three months ago, I wrote [LW · GW]:
Proofs remain inordinately difficult for me, although I have noticed a small improvement. To do MIRI-relevant math, proofs will need to become second nature. Depending on how I feel as I progress through my next book (which will likely be a proof-centric linear algebra tome), I'll start trying different supplemental approaches for improving my proof prowess.
I have resolved that by the completion of my next book review, proofs will be one of my strong suits.
And now, I think that my proofs are getting pretty good. When confronting a problem, I feel a certain mental lightness - a readiness to use my full repertoire of knowledge, to strike true down those paths likely to bring the proof to completion. Although my capacities remain faint shadows of what I hope to achieve within the coming year, my progress has been substantial.
Talk is cheap, and you probably don't feel like navigating to the selection of proofs I provided, so let me share with you my favorite proof from this book.
The following problem was admittedly confusing at first, but I had an overwhelmingly strong sense that the statement had to be true. I thought again and again about why; once that came to me, I wrote it all at once, and beamed.
Local Extrema are Stationary: let $a < b$ be real numbers, and let $f: (a, b) \to \mathbb{R}$ be a function. If $x_0 \in (a, b)$, $f$ is differentiable at $x_0$, and $f$ attains either a local maximum or local minimum at $x_0$, then $f'(x_0) = 0$.
Proof. Suppose $f(x_0)$ is a local maximum; thus, for all $x$ in some neighborhood $(x_0 - \delta, x_0 + \delta)$, $f(x) \leq f(x_0)$. Let $L = f'(x_0)$; we know that $L$ exists and is a real number since $f$ is differentiable at $x_0$. By the trichotomy of real numbers, $L$ is either negative, positive, or zero.
Suppose that $L$ is negative - then consider the left limit $\lim_{x \to x_0;\, x < x_0} \frac{f(x) - f(x_0)}{x - x_0}$ (this is permissible as the left and right limits of a convergent limit are equal); we have a sequence of $x_n < x_0$ which converges to $x_0$, so every term in $(x_n - x_0)_n$ is negative.
If $(f(x_n) - f(x_0))_n$ has no positive terms, then each term in the sequence of difference quotients must be nonnegative, so $L \geq 0$, contradicting our assumption that $L < 0$. Then by the properties of convergent limits, there must be infinitely many $n$ such that $f(x_n) - f(x_0)$ is positive. Therefore, these $f(x_n) > f(x_0)$ for $x_n$ arbitrarily close to $x_0$, contradicting the fact that $f(x_0)$ is a local maximum on $(x_0 - \delta, x_0 + \delta)$. Then $L$ cannot be negative.
A similar proof holds for positive $L$ (using the right limit), so $L = 0$.
To solve for a local minimum $f(x_0)$, define $g := -f$ and use the above result on the local maximum $g(x_0)$.
If you are interested in working with me or others on the task of learning MIRI-relevant math, if you have a burning desire to knock the alignment problem down a peg - I would be more than happy to work with you. Messaging me [LW · GW] may also net you an invitation to the MIRIx Discord server.
Constructing the reals from first principles was profoundly enjoyable. Working through this book gave me a sense of certitude when dealing with math. Numbers are no longer simply familiar friends following familiar rules, but rather objects I know how to construct. It feels great.
Gather round, gather round - for it is on this blessed morn/day/evening that I recount my youthful dalliances with uncountable infinities!
In my first term of college, I read Gödel, Escher, Bach as part of a wonderful tutorial class. I came across sizes of infinities, and, like many, my mind absolutely refused. However, I wanted to understand what was really going on so intensely that I spent many hours working through the intuitions on my own (not knowing much of anything about formal mathematics). This period of discovery was one of my favorite experiences of my undergraduate career; I can't tell you how many nights I just sat under the stars, thinking. Eventually, I came to the conclusion that of course there are multiple sizes of infinities.
Now, maybe it would have been faster to just learn the math behind diagonalization or some other method of proof, but I think there was tremendous value in learning to fall in love with the process - to commit yourself fully to the joy of discovery and thought.
I can certainly tell you that I wouldn't have made it so far so quickly down the research list if this journey didn't feel like one of the most beautiful things I've ever done.
8 comments
Comments sorted by top scores.
comment by daozaich · 2018-06-02T13:41:30.245Z · LW(p) · GW(p)
What was initially counterintuitive is that even though $\lim_{n \to \infty} \frac{1}{n} = 0$, the series doesn't converge.
This becomes much less counterintuitive if you instead ask: How would you construct a sequence with $\lim_{n \to \infty} a_n = 0$ whose series diverges?
Obviously, take a divergent series, e.g. $1 + 1 + 1 + \dotsb$, and then split the $n$th term into $n$ pieces of size $\frac{1}{n}$.
comment by Dacyn · 2018-06-02T11:51:48.738Z · LW(p) · GW(p)
Technically the Peano axioms don't define addition and multiplication, they only provide a framework for analyzing them.
The sentence "Using the axiom of foundation [...]" seems to me to be poorly worded; the subsequent clause does not follow from the axiom of foundation, but rather contradicts it (enabling one to complete a proof by contradiction).
In your upper bounded monotonic sequences proof you are using $L$ to mean two different things. Also, the existence of $N$ in the second sentence doesn't follow, because $a_n$ could be less than $L + \varepsilon$ for all $n$. It looks like there is a step missing in the argument somewhere. Also, I don't see where you have used the completeness axiom, and you certainly can't prove the result without using completeness.
Typo: should be
The sentence about Lipschitz continuity should probably be worded so as to make clear that this is the definition of Lipschitz continuity, and not just a property of it.
There is a much easier example of a function that is uniformly continuous but not Lipschitz continuous: $\sqrt{x}$.
The formula is wrong, it should be $(f^{-1})'(y) = \frac{1}{f'(f^{-1}(y))}$.
The assumption that $f$ and $g$ are differentiable on $[a, b]$ is too strong for L'Hôpital's rule; you don't want to assume that they are differentiable at the limit point $a$. Many applications involve the derivative at $a$ being either zero or infinity. Also, I am not sure what you are trying to say about the limit not existing.
↑ comment by TurnTrout · 2018-06-02T14:07:39.216Z · LW(p) · GW(p)
Technically the Peano axioms don't define addition and multiplication, they only provide a framework for analyzing them.
I know. What I wrote does not particularly imply that they define addition and multiplication: "allowing us to define addition and multiplication".
In your upper bounded monotonic sequences proof you are using $L$ to mean two different things. Also, the existence of $N$ in the second sentence doesn't follow, because $a_n$ could be less than $L + \varepsilon$ for all $n$.
Good catch - I stayed up pretty late writing the review, and I suppose I should watch out for that next time! However, you can still repair the proof by setting the upper bound to that $L$ and repeating up to $k$ times, since it can only escape upwards (or refine its upper bound downwards) a finite number of times.
It looks like there is a step missing in the argument somewhere. Also, I don't see where you have used the completeness axiom, and you certainly can't prove the result without using completeness.
And why is that? The proof I provided (with the typo fixed and the step filled in) seems sufficient?
There is a much easier example of a function that is uniformly continuous but not Lipschitz continuous.
I found it less interesting.
The assumption that $f$ and $g$ are differentiable on $[a, b]$ is too strong for L'Hôpital's rule
I'd like to say that I literally took the definition from the book here, but that's good to know!
I am not sure what you are trying to say about the limit not existing.
It's a pop culture reference, related to the GIF I had just used. A little bit of trivia.
↑ comment by Dacyn · 2018-06-02T16:25:25.724Z · LW(p) · GW(p)
The Peano axioms aren't what allows you to define addition and multiplication, though. For example, the induction axiom bears no relation to the definitions of addition and multiplication. (Some of the Peano axioms follow directly from the definitions of addition and multiplication, but really it is the axioms that follow from the definitions and not the other way around.)
You now appear to be proving that for all $\varepsilon$, there exists $L$ such that the sequence is eventually $\varepsilon$-close to $L$. Whereas to prove that the sequence converges, you would need to show that there exists an $L$ such that for all $\varepsilon$, the sequence is eventually $\varepsilon$-close to $L$.
Regarding the GIF, perhaps I am still not getting the joke, but what I meant was that I don't think the rule "if $f$ and $g$ are not differentiable on $[a, b]$, then perhaps the limit does not exist" is the rule that you would use to determine that the limit does not exist in the example from the movie, since $f$ and $g$ are in fact differentiable on $[a, b]$ in that example.
↑ comment by TurnTrout · 2018-06-02T16:41:15.383Z · LW(p) · GW(p)
The Peano axioms aren't what allows you to define addition and multiplication, though. For example, the induction axiom bears no relation to the definitions of addition and multiplication. (Some of the Peano axioms follow directly from the definitions of addition and multiplication, but really it is the axioms that follow from the definitions and not the other way around.)
I don't really view what I wrote as saying "you can't formalize addition and multiplication without (all of) these specific axioms". All I was saying is that these axioms provide a framework by which addition and multiplication can be coherently defined, and with which we can prove properties. Sure, the induction axiom is not related, but I don't see how you can make the same case for the axioms defining the successor function for the natural numbers. That is, in this book, we actually used these axioms to define addition and multiplication. Maybe I'm still missing your point!
You now appear to be proving that for all $\varepsilon$, there exists $L$ such that the sequence is eventually $\varepsilon$-close to $L$. Whereas to prove that the sequence converges, you would need to show that there exists an $L$ such that for all $\varepsilon$, the sequence is eventually $\varepsilon$-close to $L$.
But I fixed $L$ before doing anything with $\varepsilon$?
Regarding ...
I see how that's confusing! I was saying that L'Hôpital's rule is what you would use - not the sentence you had in mind. I'll clear that up.
↑ comment by Dacyn · 2018-06-02T20:42:54.231Z · LW(p) · GW(p)
Regarding Peano arithmetic, my point is that S(a)+b := S(a+b) (for example) is not the same thing as S(a)+b = S(a+b). The former is a definition, whereas the latter is a statement (which can be used as an axiom). In order to make sense of the definition you need to understand the concept of recursion, whereas you can make sense of the statement just by thinking of addition as "some binary operation that we don't know what it is, but this is a fact we know about it". (Let me reiterate that this is a rather technical distinction, I don't think it makes too much difference either way but I thought it was worth pointing out.)
So, your proof is actually kind of confusingly worded and I am not sure exactly what you mean by "repeating the above argument $k$ times". It looks like you want to keep the same $\varepsilon$ but use a different $L$ each time. But you can't do that if you fix $L$ before introducing $\varepsilon$.
Another way to look at it is to consider the same proof but with all variables assumed to be rational. In that case the conclusion of the theorem would be false (because there are bounded monotonic sequences of rationals that don't converge to any rational) and so you should try to figure out which step of your proof would be wrong in the new context. Ordinarily, it should be where you use the completeness axiom, because the completeness axiom doesn't hold for the rationals.
↑ comment by TurnTrout · 2018-06-03T01:14:08.331Z · LW(p) · GW(p)
Regarding Peano arithmetic, my point is that S(a)+b := S(a+b) (for example) is not the same thing as S(a)+b = S(a+b). The former is a definition, whereas the latter is an statement (which can be used as an axiom). In order to make sense of the definition you need to understand the concept of recursion, whereas you can make sense of the statement just by thinking of addition as "some binary operation that we don't know what it is, but this is a fact we know about it". (Let me reiterate that this is a rather technical distinction, I don't think it makes too much difference either way but I thought it was worth pointing out.)
Especially in math, it's great to have concepts or diction which I think "clearly have the same shade of connotation" (but actually don't) brought to my attention. I understand the difference you're pointing to here, and I'll be more cognizant of it in the future. Precision of thought and speech is important, so thanks!
So, your proof is actually kind of confusingly worded...
That's a good point, and one I realized soon after leaving my computer - $L$ doesn't stay fixed in my argument. I think that one should still be able to show that all these different $L$'s have to be the same, right? That is (this is just an outline), take any distinct $\varepsilon_1, \varepsilon_2 > 0$; then suppose for distinct $L_1, L_2$, the sequence is eventually $\varepsilon_1$-close to $L_1$ and $\varepsilon_2$-close to $L_2$. Then we have that $|L_1 - L_2| \leq \varepsilon_1 + \varepsilon_2$. Since $\varepsilon_1, \varepsilon_2$ are arbitrary, we can expand this argument to an arbitrary number of arbitrarily small $\varepsilon$'s to show that all the $L$'s must be equal.
My intuition is just so strong on this one, and I want to see whether I can extend my proof to cover it, or whether I actually need to just change approaches.
↑ comment by Dacyn · 2018-06-03T07:52:29.579Z · LW(p) · GW(p)
So, the way that I understand your argument, $L$ ranges over the values $L, L + \varepsilon, L + 2\varepsilon, \ldots, L + k\varepsilon$. (Of course there is a circular dependency here but as far as I can tell this was what you were thinking.) Of course, all of these possible values of $L$ are clearly distinct. I'm not really sure what your point is about what happens if you assume that a sequence is eventually close to two different numbers. I think part of the issue is that your proof is a proof by contradiction, but in a proof by contradiction you are only allowed to introduce the contradiction hypothesis once, not an arbitrary number of times.
The fundamental question that any proof of this theorem should answer is: given a bounded monotonic sequence, how do we find the $L$ that it converges to? Your proof doesn't appear to address this question at all, which makes me think you need a completely different idea.
Note: to address your "strong intuition" perhaps I should say that of course if you have two real numbers $L_1$ and $L_2$ such that for all $\varepsilon > 0$, there is a number that is $\varepsilon$-close to $L_1$ and $\varepsilon$-close to $L_2$, then $L_1 = L_2$. But I don't think this fact has the significance you want it to have in the broader argument.