It took a while for me to grasp that epsilon-delta proofs are about control. We want to control the output of a function by restricting the input. It's about whether bad behavior can be excluded by shrinking neighborhoods.
In an ε–δ proof, ε represents how much error is allowed in the output, and δ represents how tightly we restrict the input.
Proving Existence of a Limit
ε is fixed first, and it's arbitrary. δ is chosen in response to ε, and we must handle all x in the δ-neighborhood of a.
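To make this concrete, here is a minimal numeric sketch (my own example, not from the text) for the claim lim_{x→2} x² = 4. Since |x² − 4| = |x − 2|·|x + 2|, and |x − 2| < 1 forces |x + 2| < 5, the response δ = min(1, ε/5) works; the code samples points in the δ-neighborhood and checks the ε-condition.

```python
import random

def delta_for(eps):
    # Response for lim_{x->2} x^2 = 4:
    # |x^2 - 4| = |x - 2| * |x + 2|; if |x - 2| < 1 then |x + 2| < 5,
    # so delta = min(1, eps/5) forces |x^2 - 4| < eps.
    return min(1.0, eps / 5.0)

def check(eps, trials=10_000):
    # Sample many x in the delta-neighborhood and verify the eps-condition.
    d = delta_for(eps)
    for _ in range(trials):
        x = 2.0 + random.uniform(-d, d)
        if x != 2.0 and not abs(x * x - 4.0) < eps:
            return False
    return True

print(all(check(eps) for eps in (1.0, 0.1, 1e-3, 1e-6)))  # True
```

Note the direction of dependence: `delta_for` sees ε but never sees x, exactly as the quantifier order demands.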
Proving a Limit Doesn't Exist
L is arbitrary, ε is chosen, δ is arbitrary, and x is chosen to witness failure. Specifically, "choosing an ε" means asserting that the function fails by at least this much, no matter how close you go.
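As an illustration (my own example), take the sign-like step function that jumps from −1 to 1 at 0, which has no limit at 0. For any claimed L, choosing ε = 1 works: since f takes both values 1 and −1 arbitrarily close to 0, and those values are 2 apart, at least one of them sits at distance ≥ 1 from L. The witness x can then be produced for every δ:

```python
def f(x):
    # Step function: jumps from -1 to 1 at x = 0.
    return 1.0 if x > 0 else -1.0

def witness(L, delta, eps=1.0):
    # For any claimed limit L and any delta, return an x with
    # 0 < |x| < delta but |f(x) - L| >= eps.  Since 1 and -1 are
    # 2 apart, at least one of them is at distance >= 1 from L.
    x = delta / 2 if abs(1.0 - L) >= eps else -delta / 2
    assert 0 < abs(x) < delta and abs(f(x) - L) >= eps
    return x

for L in (-1.0, 0.0, 0.3, 1.0):
    for delta in (1.0, 1e-3, 1e-9):
        witness(L, delta)
print("every (L, delta) pair has a witness")
```

The role reversal is visible in the signature: here δ is an input handed to us, and x is our move.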
On Restricting Neighborhoods and Boundaries
In a continuity proof you often need two different conditions to be true at the same time:
- A “geometry/safety” condition (trap x in a nice region): Keep denominators away from 0, keep f(x) positive, keep derivatives bounded, etc.
- The actual ε-condition: This is usually a simple inequality once the region is safe.
Each condition gives you an upper bound on how small δ must be:
- Condition 1 says “choose δ ≤ δ₁”
- Condition 2 says “choose δ ≤ δ₂”
To satisfy both, you take δ = min(δ₁, δ₂). You typically pick a convenient requirement like δ ≤ 1 whose only purpose is to force x into a region where you can bound the ugly term by a constant. This is not guessing but building a controlled local universe around a.
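A minimal sketch of this two-condition pattern (my own example): for lim_{x→1} 1/x = 1 we have |1/x − 1| = |x − 1|/|x|. The safety condition δ₁ = 1/2 traps x above 1/2, bounding the ugly term 1/|x| by the constant 2; the ε-condition then gives δ₂ = ε/2.

```python
import random

def delta_for(eps):
    # For lim_{x->1} 1/x = 1: |1/x - 1| = |x - 1| / |x|.
    delta1 = 0.5        # safety: keeps x > 1/2, so 1/|x| < 2
    delta2 = eps / 2.0  # actual eps-condition once 1/|x| < 2
    return min(delta1, delta2)

def check(eps, trials=10_000):
    # Sample the delta-neighborhood of 1 and verify the eps-condition.
    d = delta_for(eps)
    return all(
        abs(1.0 / x - 1.0) < eps
        for x in (1.0 + random.uniform(-d, d) for _ in range(trials))
        if x != 1.0
    )

print(all(check(eps) for eps in (2.0, 0.5, 1e-4)))  # True
```

For large ε the safety bound δ₁ is the binding one; for small ε it is δ₂ — the `min` handles both regimes.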
On the Structure of Logical Quantifiers
When we are given: ∀ε > 0, ∃δ > 0, ∀x: 0 < |x − a| < δ ⟹ |f(x) − L| < ε, this literally means:
- Someone gives you any ε > 0. So ε is chosen first.
- You must respond with a δ > 0. So δ can depend on ε.
- Then nature chooses any x satisfying 0 < |x − a| < δ. x is chosen after δ, so δ cannot depend on x.
- The inequality |f(x) − L| < ε must hold.
It boils down to logic: a variable can depend on everything quantified before it, and on nothing quantified after it. Fundamentally, in real analysis we study different notions of continuity to express different notions of strictness, i.e., control over what can depend on what, and what cannot. In uniform continuity, for example, δ may depend only on ε and not on the point, which is a strictly harder game to win.
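The dependence rule is visible in a concrete pointwise bound (my own example): for f(x) = x² at a point a, |x² − a²| = |x − a|·|x + a|, and restricting |x − a| < 1 gives |x + a| < 2|a| + 1, so δ = min(1, ε/(2|a| + 1)) works — but it depends on a.

```python
def delta_at(a, eps):
    # Pointwise continuity of f(x) = x^2 at a:
    # |x^2 - a^2| = |x - a| * |x + a|; with |x - a| < 1 we get
    # |x + a| < 2|a| + 1, so this delta works -- but it depends on a.
    return min(1.0, eps / (2 * abs(a) + 1))

eps = 0.1
for a in (1.0, 10.0, 1000.0):
    print(a, delta_at(a, eps))
# The required delta shrinks as |a| grows: no single delta serves every
# point at once, which is why x^2 is continuous on the real line but
# not uniformly continuous there.
```

On a bounded interval like [0, 1], by contrast, 2|a| + 1 ≤ 3 uniformly, so one δ = min(1, ε/3) serves all points — the dependence on a disappears, and continuity becomes uniform.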