Dominated convergence theorem

In measure theory, Lebesgue's dominated convergence theorem gives a mild sufficient condition under which limits and integrals of a sequence of functions can be interchanged. More technically, it says that if a sequence of functions is bounded in absolute value by an integrable function and converges almost everywhere pointwise to a function, then the sequence converges in L¹ to its pointwise limit; in particular, the integral of the limit is the limit of the integrals. Its power and utility are two of the primary theoretical advantages of Lebesgue integration over Riemann integration.
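The statement above can be written out precisely as follows (a standard formulation on a general measure space; the notation is supplied here, not taken from this article):

```latex
Let $(X, \Sigma, \mu)$ be a measure space and let $(f_n)$ be a sequence of
measurable functions $f_n \colon X \to \mathbb{R}$ such that
$f_n(x) \to f(x)$ for $\mu$-almost every $x \in X$. Suppose there exists an
integrable function $g$ (i.e. $\int_X g \, d\mu < \infty$) with
\[
  |f_n(x)| \le g(x) \quad \text{for all } n \text{ and } \mu\text{-almost every } x.
\]
Then $f$ is integrable, and
\[
  \lim_{n \to \infty} \int_X |f_n - f| \, d\mu = 0,
  \qquad\text{hence}\qquad
  \lim_{n \to \infty} \int_X f_n \, d\mu = \int_X f \, d\mu .
\]
```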

In addition to its frequent appearance in mathematical analysis and partial differential equations, it is widely used in probability theory, since it gives a sufficient condition for the convergence of expected values of random variables.
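As a concrete sketch of the interchange of limit and integral, the sequence f_n(x) = xⁿ on [0, 1] is dominated by the integrable function g(x) = 1 and converges pointwise to 0 almost everywhere, so the theorem predicts that the integrals tend to 0. The helper below (a simple midpoint rule, chosen here for illustration and not part of the theorem) checks this numerically:

```python
# f_n(x) = x**n on [0, 1] is dominated by g(x) = 1 and converges
# pointwise to 0 on [0, 1); dominated convergence predicts
# that the integrals of f_n tend to the integral of the limit, 0.

def integrate(f, a=0.0, b=1.0, steps=100_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

# Exact values are 1/(n+1), which decrease to 0 as n grows.
integrals = [integrate(lambda x, n=n: x**n) for n in (1, 10, 100, 1000)]
```

The exact integrals are 1/(n+1), so the computed sequence 0.5, ≈0.0909, ≈0.0099, ≈0.001 illustrates the convergence the theorem guarantees.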

