Ever since I learned elementary calculus, I have been wondering about the proof of the unique self-differentiation property of $Ce^x$, where $C$ is a constant: the only functions that are their own derivative are those of the form $f(x) = Ce^x$. One of the proofs below I was led to by a calculus textbook; the other I figured out by myself.
Proof 1: Using the Mean Value Theorem
Let $f$ be a function so that $f'(x) = f(x)$. Define a new function

$$g(x) = f(x)e^{-x}.$$

Now, find $g'(x)$:

$$g'(x) = f'(x)e^{-x} - f(x)e^{-x} = e^{-x}\left(f'(x) - f(x)\right).$$

But our base assumption is that $f'(x) = f(x)$, or that $f'(x) - f(x) = 0$. In other words,

$$g'(x) = e^{-x}\left(f'(x) - f(x)\right) = e^{-x} \cdot 0 = 0.$$

Now, the Mean Value Theorem implies that only constant functions have zero derivative. This means that $g(x) = C$ for some constant $C$. Plugging this into the definition of $g$ yields:

$$f(x)e^{-x} = C \quad\Longrightarrow\quad f(x) = Ce^x.$$
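As a quick sanity check (my addition, not part of the proof itself), the key computation above can be confirmed symbolically with SymPy; the variable names here are my own.

```python
import sympy as sp

x = sp.symbols('x')
f = sp.Function('f')

# Auxiliary function from the proof: g(x) = f(x) * e^(-x)
g = f(x) * sp.exp(-x)

# Its derivative should factor as e^(-x) * (f'(x) - f(x))
g_prime = sp.diff(g, x)
factored = sp.exp(-x) * (sp.Derivative(f(x), x) - f(x))

# Prints 0, confirming that g'(x) = e^(-x) * (f'(x) - f(x))
print(sp.simplify(g_prime - factored))
```

With that factorization in hand, the assumption $f'(x) = f(x)$ makes it immediate that $g'$ vanishes everywhere.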
Proof 2: Using the Taylor Expansion of $e^x$
Note: I would like to thank Alexander Gibson for pointing out that this proof only applies to the set of analytic functions.
Once again, let $f$ be a function so that $f'(x) = f(x)$. Notice that this implies the following:

$$f''(x) = f'(x) = f(x), \qquad f'''(x) = f''(x) = f(x),$$

and so forth: every derivative of $f$ equals $f$ itself. We can also say that

$$f(0) = C$$

for some constant $C$. Use the theory of Taylor series to expand $f$:

$$f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}(x - a)^n$$

for $x$ around any point $a$. Inputting our initial assumption into the Taylor expansion of $f$ about $a = 0$ yields:

$$f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(0)}{n!}x^n = \sum_{n=0}^{\infty} \frac{C}{n!}x^n = C\sum_{n=0}^{\infty} \frac{x^n}{n!}.$$

Divide both sides by $C$ (if $C = 0$, every Taylor coefficient vanishes and $f \equiv 0 = 0 \cdot e^x$, so assume $C \neq 0$):

$$\frac{f(x)}{C} = \sum_{n=0}^{\infty} \frac{x^n}{n!}.$$

Notice that the right-hand side is the Taylor expansion of $e^x$ about the point $a = 0$. We can conclude that

$$\frac{f(x)}{C} = e^x \quad\Longrightarrow\quad f(x) = Ce^x.$$
Note: It was okay to assume that the Taylor series converges, because it always does in a small interval about $x = 0$.
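As another optional check (again my addition), SymPy can confirm that the series obtained above really is the Taylor expansion of $Ce^x$ about $x = 0$, term by term up to a chosen truncation order.

```python
import sympy as sp

x, C = sp.symbols('x C')

# Taylor polynomial of C * e^x about x = 0, truncated at order 8
taylor_of_Cexp = sp.series(C * sp.exp(x), x, 0, 8).removeO()

# The series obtained in the proof, C * sum x^n / n!, truncated to match
partial_sum = C * sum(x**n / sp.factorial(n) for n in range(8))

# Prints 0: the two truncations agree term by term
print(sp.expand(taylor_of_Cexp - partial_sum))
```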
Whew! That was a lot of theory. If you want to share your own proof of the unique self-differentiation property of $e^x$, then post it down below in the comments.