
Fundamental theorem of calculus generalized

by John D. Cook

The first fundamental theorem of calculus says that integration undoes differentiation.

The second fundamental theorem of calculus says that differentiation undoes integration.

This post looks at the fine print of these two theorems, in their basic forms and when generalized to Lebesgue integration.

Second fundamental theorem of calculus

We'll start with the second fundamental theorem because it's simpler. In its basic form, it says that if f is a continuous function on an open interval I, and a is a point in I, then the function F defined by

F(x) = \int_a^x f(t) \, dt

is an antiderivative for f on the interval I, i.e.

F'(x) = f(x)

for all x in I. In that sense differentiation undoes integration.
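
As a quick sanity check, here's a minimal numerical sketch (not from the original post, and assuming SciPy is available): approximate F by numerical integration, approximate its derivative with a central difference, and compare against f.

    # Second fundamental theorem, numerically: differentiating
    # F(x) = integral of f from a to x recovers f(x).
    import numpy as np
    from scipy.integrate import quad

    f = np.cos      # a continuous integrand
    a = 0.0         # left endpoint of the interval

    def F(x):
        # F(x) = integral of f from a to x
        return quad(f, a, x)[0]

    x, h = 1.2, 1e-5
    dF = (F(x + h) - F(x - h)) / (2 * h)   # central difference estimate of F'(x)
    print(dF, f(x))                        # both approximately cos(1.2) = 0.362...

The two printed numbers agree to several decimal places, which is all a finite-difference check can promise.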

If we remove the requirement that f be continuous, we still have F' = f almost everywhere as long as f is absolutely integrable, i.e. the integral of |f| over I is finite. In more detail,

F'(x) = f(x)

at every Lebesgue point x, i.e. every point x that satisfies

\lim_{h \to 0} \frac{1}{2h} \int_{x-h}^{x+h} |f(t) - f(x)| \, dt = 0
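
To see the generalized statement in action, here's a small sketch (again not from the original post) with a discontinuous but integrable f, a unit step at t = 1/2. The integral F(x) works out to max(0, x - 1/2), and F'(x) = f(x) at every Lebesgue point of f, which is every point except the jump.

    # A discontinuous but absolutely integrable integrand: a unit step at 0.5.
    def f(t):
        return 0.0 if t < 0.5 else 1.0

    def F(x):
        # integral of f from 0 to x, computed in closed form
        return max(0.0, x - 0.5)

    h = 1e-6
    for x in [0.2, 0.8]:                        # Lebesgue points of f
        dF = (F(x + h) - F(x - h)) / (2 * h)    # central difference
        print(x, dF, f(x))                      # F'(x) matches f(x)

    # x = 0.5 is not a Lebesgue point: the difference quotient gives 0.5,
    # which matches neither the left-hand value 0 nor the right-hand value 1.
    print((F(0.5 + h) - F(0.5 - h)) / (2 * h))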

First fundamental theorem of calculus

The first fundamental theorem of calculus says that if the derivative of F is f and f is continuous on an interval [a, b], then

\int_a^b f(x) \, dx = F(b) - F(a)

So if F has a continuous derivative, then integration undoes differentiation. What if F is continuous but differentiable only almost everywhere rather than everywhere? Then the theorem doesn't necessarily hold. But it does hold if we require F to be absolutely continuous rather than merely continuous.
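
Before turning to counterexamples, here's a quick numerical check of the continuous-derivative case, sketched under the assumption that SciPy is available: take F = sin, so F' = cos, and compare the integral of F' over [a, b] with F(b) - F(a).

    # First fundamental theorem, numerically: integrating F' over [a, b]
    # recovers F(b) - F(a).
    import numpy as np
    from scipy.integrate import quad

    F, Fprime = np.sin, np.cos
    a, b = 0.0, 2.0

    integral = quad(Fprime, a, b)[0]
    print(integral, F(b) - F(a))   # both approximately sin(2) = 0.909...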

Absolutely continuous functions map sets of measure zero to sets of measure zero. It's not easy to imagine continuous functions that are not absolutely continuous, but Cantor's function, a.k.a. the Devil's staircase, takes the Cantor set, a set of measure zero, onto a set of measure one.
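
For a concrete feel for the Devil's staircase, here's a short sketch (not from the original post) that evaluates the Cantor function through the ternary expansion of its argument. The function climbs from 0 to 1, yet it is constant on every interval removed in the construction of the Cantor set, so its derivative is 0 almost everywhere and integrating that derivative gives 0 rather than c(1) - c(0) = 1.

    def cantor(x, depth=40):
        # Cantor (Devil's staircase) function on [0, 1], via the ternary
        # expansion of x: ternary digits 0 and 2 become binary digits 0 and 1;
        # the first ternary digit equal to 1 ends the expansion.
        if x >= 1.0:
            return 1.0
        result, scale = 0.0, 0.5
        for _ in range(depth):
            x *= 3
            digit = int(x)
            x -= digit
            if digit == 1:                  # x lies in a removed middle third
                return result + scale
            result += scale * (digit // 2)
            scale /= 2
        return result

    print(cantor(0.0), cantor(1.0))   # 0.0 and 1.0: the function rises by 1
    print(cantor(0.4), cantor(0.5))   # both 0.5: constant on (1/3, 2/3), so c' = 0 there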

The usual definition of absolute continuity takes the ordinary definition of continuity and chops it into finitely many pieces. That is, for every ε > 0 there exists a δ > 0 such that for any finite collection of disjoint intervals of total length less than δ, the sum of the changes |f(b_i) - f(a_i)| over the intervals (a_i, b_i) is less than ε. If we only allow a single interval, this is the definition of uniform continuity, so absolute continuity is a more demanding criterion than uniform continuity.
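
The Cantor function shows how this definition can fail. At stage n of the Cantor construction there are 2^n remaining intervals of total length (2/3)^n, which can be made as small as we like, yet the staircase rises by exactly 2^-n on each of them, so the sum of the changes over the collection is always 1. A short sketch of this, using the self-similar construction of the staircase:

    # Stage n of the Cantor construction: 2^n intervals of total length (2/3)^n.
    # The Cantor function c rises by 2^-n on each one, so the total change over
    # the collection stays 1 no matter how small the total length gets.
    def stage(n):
        # each entry is (a, b, c(a), c(b)) for one remaining interval
        pieces = [(0.0, 1.0, 0.0, 1.0)]
        for _ in range(n):
            new_pieces = []
            for a, b, ca, cb in pieces:
                third, mid = (b - a) / 3, (ca + cb) / 2
                new_pieces.append((a, a + third, ca, mid))   # left third
                new_pieces.append((b - third, b, mid, cb))   # right third
            pieces = new_pieces
        return pieces

    for n in [1, 5, 10, 15]:
        pieces = stage(n)
        total_length = sum(b - a for a, b, _, _ in pieces)
        variation = sum(cb - ca for _, _, ca, cb in pieces)
        print(n, total_length, variation)   # length shrinks toward 0, variation stays 1

No matter how small the total length of the intervals, the total change in the function does not shrink, which is exactly what absolute continuity forbids.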
