Bregman divergence
In mathematics, specifically statistics and information geometry, a Bregman divergence or Bregman distance is a measure of difference between two points, defined in terms of a strictly convex function; they form an important class of divergences. When the points are interpreted as probability distributions – notably as either values of the parameter of a parametric model or as a data set of observed values – the resulting distance is a statistical distance. The most basic Bregman divergence is the squared Euclidean distance. Bregman divergences are similar to metrics, but satisfy neither the triangle inequality (ever) nor symmetry (in general). However, they satisfy a generalization of the Pythagorean theorem, and in information geometry the corresponding statistical manifold is interpreted as a (dually) flat manifold. This allows many techniques of optimization theory to be generalized to Bregman divergences, geometrically as generalizations of least squares. Bregman divergences are named after Russian mathematician Lev M. Bregman, who introduced the concept in 1967.
Definition
Let $F : \Omega \to \mathbb{R}$ be a continuously-differentiable, strictly convex function defined on a convex set $\Omega$. The Bregman distance associated with $F$ for points $p, q \in \Omega$ is the difference between the value of $F$ at point $p$ and the value of the first-order Taylor expansion of $F$ around point $q$ evaluated at point $p$:
$$D_F(p, q) = F(p) - F(q) - \langle \nabla F(q),\, p - q \rangle.$$
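To make the definition concrete, here is a minimal sketch in Python (not part of the original article): a generic helper that takes a convex generator and its gradient as inputs. The specific generator, gradient, and test points below are illustrative assumptions.

```python
import numpy as np

def bregman_divergence(F, grad_F, p, q):
    """D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>: the gap between F at p
    and the first-order Taylor expansion of F around q, evaluated at p."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return F(p) - F(q) - np.dot(grad_F(q), p - q)

# Illustrative generator: F(x) = ||x||^2, whose Bregman divergence is the
# squared Euclidean distance.
F = lambda x: np.dot(x, x)
grad_F = lambda x: 2 * x
p, q = np.array([1.0, 2.0]), np.array([3.0, -1.0])
print(bregman_divergence(F, grad_F, p, q))   # 13.0, equal to ||p - q||^2
```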
Properties
Bregman balls are bounded, and compact if $\Omega$ is closed. Define the Bregman ball centered at $x$ with radius $r$ by $B_F(x, r) := \{ y \in \Omega : D_F(y, x) \le r \}$.

Law of cosines: for any $p, q, z$,
$$D_F(p, q) = D_F(p, z) + D_F(z, q) - \langle \nabla F(q) - \nabla F(z),\, p - z \rangle.$$

Bregman projection: for any closed convex set $W \subseteq \Omega$, define the Bregman projection of $q$ onto $W$ by $P_W(q) = \operatorname{arg\,min}_{\omega \in W} D_F(\omega, q)$. Then $P_W(q)$ exists and is unique.

Generalized Pythagorean theorem: for any $v \in \Omega$ and $a \in W$,
$$D_F(a, v) \ge D_F(a, P_W(v)) + D_F(P_W(v), v).$$
This is an equality if $P_W(v)$ is in the relative interior of $W$. In particular, this always happens when $W$ is an affine set.
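As a quick numeric sanity check of the generalized Pythagorean theorem, the following sketch uses the squared Euclidean generator $F(x) = \lVert x \rVert^2$, for which the Bregman projection coincides with the ordinary Euclidean projection. The hyperplane $W$ and the points are illustrative choices, and since $W$ is affine the inequality becomes an equality.

```python
import numpy as np

# Squared Euclidean case: F(x) = ||x||^2 gives D_F(p, q) = ||p - q||^2,
# and the Bregman projection onto a convex set is the Euclidean projection.
D = lambda p, q: float(np.dot(p - q, p - q))

# W is the affine set (hyperplane) {x : <n, x> = c}, chosen for illustration.
n, c = np.array([1.0, 2.0, -1.0]), 3.0
project = lambda v: v - (np.dot(n, v) - c) / np.dot(n, n) * n  # Euclidean projection onto W

v = np.array([2.0, -1.0, 4.0])   # point being projected
a = np.array([1.0, 2.0, 2.0])    # a point of W: <n, a> = 1 + 4 - 2 = 3
p_w = project(v)

# Generalized Pythagorean theorem; equality because W is affine.
print(D(a, v), D(a, p_w) + D(p_w, v))   # the two numbers agree (both 14.0)
```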
Proofs
Boundedness and closedness of Bregman balls: Fix $x \in \Omega$. Subtract from $F$ its tangent affine function at $x$ (this leaves $D_F$ unchanged), so that $F(x) = 0$ and $\nabla F(x) = 0$. Take some $\varepsilon > 0$ such that the sphere $\partial B(x, \varepsilon) \subset \Omega$, and consider the radial-directional derivative of $F$ on this Euclidean sphere, $\langle \nabla F(y),\, (y - x)/\lVert y - x \rVert \rangle$ for all $y \in \partial B(x, \varepsilon)$. Since $\partial B(x, \varepsilon)$ is compact, this derivative achieves a minimal value $\delta$ at some $y_0$. Since $F$ is strictly convex, $\delta > 0$. Then, by convexity, $D_F(y, x) = F(y) \ge \delta (\lVert y - x \rVert - \varepsilon)$ whenever $\lVert y - x \rVert \ge \varepsilon$, so $B_F(x, r) \subseteq B(x, \varepsilon + r/\delta)$ is bounded. Since $D_F(y, x)$ is $C^1$ in $y$, $D_F$ is continuous in $y$, thus $B_F(x, r)$ is closed if $\Omega$ is.

Generalized Pythagorean theorem: Fix $v \in \Omega$. Take some $w \in W$, then let $r = D_F(w, v)$. The set $B_F(v, r) \cap W$ contains $w$ and is closed and bounded, thus compact. Since $y \mapsto D_F(y, v)$ is continuous and strictly convex on it, and bounded below by $0$, it achieves a unique minimum on it; this minimizer is the Bregman projection $P_W(v)$. By the law of cosines,
$$D_F(w, v) - D_F(w, P_W(v)) - D_F(P_W(v), v) = \langle \nabla F(P_W(v)) - \nabla F(v),\, w - P_W(v) \rangle,$$
which must be $\ge 0$: the right-hand side is the directional derivative of $y \mapsto D_F(y, v)$ at $P_W(v)$ in the direction $w - P_W(v)$, and it cannot be negative since $P_W(v)$ minimizes $D_F(\cdot, v)$ in $W$ and $W$ is convex. This proves the inequality. If moreover $P_W(v)$ lies in the relative interior of $W$ and the right-hand side were $> 0$ for some $w$, we could move from $P_W(v)$ a small step in the direction opposite of $w$ while remaining in $W$, which would decrease $D_F(y, v)$, a contradiction. Thus the right-hand side is $0$ and equality holds.
Classification theorems
The following two characterizations are for divergences on $\Gamma_n$, the set of all probability measures on $\{1, 2, \ldots, n\}$, with $n \ge 2$. Define a divergence on $\Gamma_n$ as any function of type $D : \Gamma_n \times \Gamma_n \to [0, \infty]$ such that $D(x, x) = 0$ for all $x \in \Gamma_n$. Then: the only divergence on $\Gamma_n$ that is both a Bregman divergence and an f-divergence is the Kullback–Leibler divergence; and, for $n \ge 3$, any Bregman divergence on $\Gamma_n$ that satisfies the data processing inequality must be the Kullback–Leibler divergence (counterexamples exist when $n = 2$). Given a Bregman divergence $D_F$, its "opposite", defined by $D_F^*(p, q) = D_F(q, p)$, is generally not a Bregman divergence. For example, the Kullback–Leibler divergence is both a Bregman divergence and an f-divergence. Its reverse is also an f-divergence, but by the above characterization, the reverse KL divergence cannot be a Bregman divergence.
Examples
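Standard examples include the squared Euclidean distance (generator $F(x) = \lVert x \rVert^2$), the generalized Kullback–Leibler divergence (generator the negative entropy $F(x) = \sum_i x_i \log x_i$), and the Itakura–Saito distance (generator $F(x) = -\sum_i \log x_i$). The sketch below evaluates each pair against its closed form; the helper function and test vectors are illustrative.

```python
import numpy as np

def bregman(F, grad_F, p, q):
    """Generic Bregman divergence D_F(p, q) for a generator F."""
    return F(p) - F(q) - np.dot(grad_F(q), p - q)

p = np.array([0.2, 0.5, 0.3])   # illustrative positive vectors
q = np.array([0.4, 0.4, 0.2])

# Squared Euclidean distance: F(x) = ||x||^2.
sq_euclid = bregman(lambda x: np.dot(x, x), lambda x: 2 * x, p, q)
print(sq_euclid, np.sum((p - q) ** 2))                      # identical

# Generalized Kullback-Leibler divergence: F(x) = sum_i x_i log x_i.
kl = bregman(lambda x: np.sum(x * np.log(x)), lambda x: np.log(x) + 1, p, q)
print(kl, np.sum(p * np.log(p / q) - p + q))                # identical

# Itakura-Saito distance: F(x) = -sum_i log x_i.
is_dist = bregman(lambda x: -np.sum(np.log(x)), lambda x: -1.0 / x, p, q)
print(is_dist, np.sum(p / q - np.log(p / q) - 1))           # identical
```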
Generalizing projective duality
A key tool in computational geometry is the idea of projective duality, which maps points to hyperplanes and vice versa, while preserving incidence and above-below relationships. There are numerous analytical forms of the projective dual: one common form maps the point $p = (p_1, \ldots, p_d)$ to the hyperplane $x_{d+1} = \sum_{i=1}^d 2 p_i x_i - \sum_{i=1}^d p_i^2$. This mapping can be interpreted (identifying the hyperplane with its normal) as the convex conjugate mapping that takes the point $p$ to its dual point $p^* = \nabla F(p)$, where $F$ defines the $d$-dimensional paraboloid $x_{d+1} = \sum_i x_i^2$. If we now replace the paraboloid by an arbitrary convex function, we obtain a different dual mapping that retains the incidence and above-below properties of the standard projective dual. This implies that natural dual concepts in computational geometry like Voronoi diagrams and Delaunay triangulations retain their meaning in distance spaces defined by an arbitrary Bregman divergence. Thus, algorithms from "normal" geometry extend directly to these spaces (Boissonnat, Nielsen and Nock, 2010).
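A small numeric illustration of the above-below property for the classical case $F(x) = \sum_i x_i^2$: the paraboloid lies on or above each dual hyperplane, touching it exactly at the point that generated it, and the vertical gap equals the squared Euclidean Bregman divergence. The points used below are arbitrary illustrative choices.

```python
import numpy as np

# Classical duality: for F(x) = ||x||^2, the point p maps to the hyperplane
#   x_{d+1} = 2 <p, x> - ||p||^2,
# which is tangent to the paraboloid x_{d+1} = ||x||^2 at x = p.
paraboloid = lambda x: np.dot(x, x)
dual_hyperplane = lambda p, x: 2 * np.dot(p, x) - np.dot(p, p)

p = np.array([1.0, -2.0])
rng = np.random.default_rng(1)
for x in rng.normal(size=(5, 2)):
    gap = paraboloid(x) - dual_hyperplane(p, x)
    print(gap, np.dot(x - p, x - p))   # gap = ||x - p||^2 >= 0; zero only at x = p
```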
Generalization of Bregman divergences
Bregman divergences can be interpreted as limit cases of skewed Jensen divergences (see Nielsen and Boltz, 2011). Jensen divergences can be generalized using comparative convexity, and limit cases of these generalized skewed Jensen divergences yield generalized Bregman divergences (see Nielsen and Nock, 2017). The Bregman chord divergence is obtained by taking a chord instead of a tangent line.
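The limit relationship can be checked numerically. Under one common convention, the skewed Jensen divergence is $J_\alpha(p, q) = \alpha F(p) + (1-\alpha) F(q) - F(\alpha p + (1-\alpha) q)$, and $J_\alpha(p, q)/\alpha \to D_F(p, q)$ as $\alpha \to 0^+$. The sketch below uses the negative-entropy generator and illustrative test vectors; conventions for the skew parameter vary across the literature.

```python
import numpy as np

F = lambda x: np.sum(x * np.log(x))          # negative entropy, an illustrative generator
grad_F = lambda x: np.log(x) + 1
bregman = lambda p, q: F(p) - F(q) - np.dot(grad_F(q), p - q)

p = np.array([0.2, 0.5, 0.3])
q = np.array([0.4, 0.4, 0.2])
for a in [0.5, 0.1, 0.01, 0.001]:
    # Skewed Jensen divergence, rescaled by the skew parameter a.
    J = a * F(p) + (1 - a) * F(q) - F(a * p + (1 - a) * q)
    print(a, J / a)                          # approaches the Bregman divergence below
print(bregman(p, q))
```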
Bregman divergence on other objects
Bregman divergences can also be defined between matrices, between functions, and between measures (distributions). Bregman divergences between matrices include Stein's loss and the von Neumann entropy. Bregman divergences between functions include total squared error, relative entropy, and squared bias; see the references by Frigyik et al. for definitions and properties. Similarly, Bregman divergences have also been defined over sets, through a submodular set function, which is known as the discrete analog of a convex function. The submodular Bregman divergences subsume a number of discrete distance measures, like the Hamming distance, precision and recall, mutual information, and some other set-based distance measures (see Iyer & Bilmes, 2012 for more details and properties of the submodular Bregman). For a list of common matrix Bregman divergences, see Table 15.1 in.
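As an illustration of a matrix Bregman divergence, the generator $F(X) = -\log\det X$ on symmetric positive definite matrices (with gradient $-X^{-1}$) yields Stein's loss, also known as the LogDet divergence. The sketch below checks this identity numerically; the random test matrices are illustrative.

```python
import numpy as np

def bregman_matrix(F, grad_F, X, Y):
    """Matrix Bregman divergence D_F(X, Y) = F(X) - F(Y) - <grad F(Y), X - Y>,
    using the trace inner product <A, B> = tr(A^T B)."""
    return F(X) - F(Y) - np.trace(grad_F(Y).T @ (X - Y))

# Generator F(X) = -log det X on SPD matrices; its gradient is -X^{-1}.
F = lambda X: -np.linalg.slogdet(X)[1]
grad_F = lambda X: -np.linalg.inv(X)

rng = np.random.default_rng(0)
A, B = rng.normal(size=(3, 3)), rng.normal(size=(3, 3))
X, Y = A @ A.T + np.eye(3), B @ B.T + np.eye(3)   # two SPD matrices for illustration

# Stein's loss: tr(X Y^{-1}) - log det(X Y^{-1}) - n.
n = X.shape[0]
stein = np.trace(X @ np.linalg.inv(Y)) - np.linalg.slogdet(X @ np.linalg.inv(Y))[1] - n
print(bregman_matrix(F, grad_F, X, Y), stein)      # the two values agree
```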
Applications
In machine learning, Bregman divergences are used to calculate the bi-tempered logistic loss, which performs better than the softmax function on noisy datasets. Bregman divergences are also used in the formulation of mirror descent, which includes optimization algorithms used in machine learning such as gradient descent and the hedge algorithm.
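The following is a minimal sketch of mirror descent with the negative-entropy mirror map, whose associated Bregman divergence is the Kullback–Leibler divergence; the resulting update is the exponentiated-gradient step, which keeps iterates on the probability simplex. The objective, step size, and iteration count are illustrative assumptions, not taken from the original article.

```python
import numpy as np

def mirror_descent_simplex(grad, x0, step=0.1, iters=200):
    """Mirror descent with the negative-entropy mirror map (Bregman divergence = KL).
    Each iteration is a multiplicative exponentiated-gradient update followed by
    renormalization, so every iterate stays on the probability simplex."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x * np.exp(-step * grad(x))   # multiplicative update
        x /= x.sum()                      # renormalize onto the simplex
    return x

# Illustrative objective: f(x) = ||x - t||^2 with a target t already on the simplex,
# so the constrained minimizer is t itself.
t = np.array([0.7, 0.2, 0.1])
grad = lambda x: 2 * (x - t)

x_star = mirror_descent_simplex(grad, np.ones(3) / 3)
print(x_star)   # approaches t
```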
This article is derived from Wikipedia and licensed under CC BY-SA 4.0.