Unconditional convergence
In mathematics, specifically functional analysis, a series is unconditionally convergent if all reorderings of the series converge to the same value. In contrast, a series is conditionally convergent if it converges but different orderings do not all converge to that same value. Unconditional convergence is equivalent to absolute convergence in finite-dimensional vector spaces, but is a weaker property in infinite dimensions.
Definition
Let X be a topological vector space, let I be an index set, and let x_i \in X for all i \in I. The series \sum_{i \in I} x_i is called unconditionally convergent to x \in X, if the indexing set I_0 := \{ i \in I : x_i \neq 0 \} is countable, and for every permutation (bijection) \sigma : I_0 \to I_0 of I_0 = \{ i_k \}_{k=1}^\infty the following relation holds: \sum_{k=1}^\infty x_{\sigma(i_k)} = x.
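As a small numerical illustration (not part of the original article): in \R, an absolutely convergent series such as \sum_{n=1}^\infty 1/n^2 is unconditionally convergent, so any reordering of its terms yields the same sum. The sketch below checks this for a finite truncation, using math.fsum so that the (order-independent) exact sum is reported identically before and after shuffling.

```python
import math
import random

# An absolutely (hence unconditionally) convergent series: sum 1/n^2.
# Any reordering of the terms gives the same value.
terms = [1.0 / n**2 for n in range(1, 100_001)]

s_original = math.fsum(terms)   # fsum returns the correctly rounded exact sum
random.shuffle(terms)           # an arbitrary reordering of the same terms
s_shuffled = math.fsum(terms)

# The exact sum of a finite collection of terms does not depend on order,
# so the two correctly rounded results coincide exactly.
print(s_original == s_shuffled)
print(abs(s_original - math.pi**2 / 6) < 1e-4)  # partial sums approach pi^2/6
```

Plain floating-point `sum` would show tiny rounding differences between orderings; `math.fsum` avoids that artifact so the order-independence of the mathematical sum is visible exactly.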
Alternative definition
Unconditional convergence is often defined in an equivalent way: A series \sum_{n=1}^\infty x_n is unconditionally convergent if for every sequence (\varepsilon_n)_{n=1}^\infty with \varepsilon_n \in \{-1, +1\}, the series \sum_{n=1}^\infty \varepsilon_n x_n converges. If X is a Banach space, every absolutely convergent series is unconditionally convergent, but the converse implication does not hold in general. Indeed, if X is an infinite-dimensional Banach space, then by the Dvoretzky–Rogers theorem there always exists an unconditionally convergent series in this space that is not absolutely convergent. However, when X = \R^n, by the Riemann series theorem, the series \sum_n x_n is unconditionally convergent if and only if it is absolutely convergent.
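The situation in \R can also be seen numerically: the alternating harmonic series \sum_{n=1}^\infty (-1)^{n+1}/n converges conditionally (to \ln 2), and, as the Riemann series theorem asserts, a greedy rearrangement of its terms can steer the partial sums toward any prescribed target. The sketch below is illustrative (the function name and term counts are choices made here, not from the article):

```python
import math

def rearranged_sum(target, n_terms=100_000):
    """Greedy Riemann rearrangement of the alternating harmonic series.

    Take positive terms 1, 1/3, 1/5, ... while the partial sum is at or
    below `target`, otherwise negative terms -1/2, -1/4, ...; the partial
    sums then oscillate around `target` with shrinking amplitude.
    """
    pos, neg = 1, 2  # next odd (positive) / even (negative) denominator
    s = 0.0
    for _ in range(n_terms):
        if s <= target:
            s += 1.0 / pos
            pos += 2
        else:
            s -= 1.0 / neg
            neg += 2
    return s

# The natural ordering converges (conditionally) to ln 2 ...
natural = sum((-1) ** (k + 1) / k for k in range(1, 100_001))
print(abs(natural - math.log(2)) < 1e-4)

# ... but a rearrangement of the very same terms approaches a different value.
print(abs(rearranged_sum(1.0) - 1.0) < 1e-3)
```

Since both computations use exactly the same set of terms, the differing limits demonstrate why convergence here is conditional rather than unconditional.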
This article is derived from Wikipedia and licensed under CC BY-SA 4.0.