Cross-covariance matrix

In probability theory and statistics, a cross-covariance matrix is a matrix whose element in the i, j position is the covariance between the i-th element of a random vector and the j-th element of another random vector. A random vector is a random variable with multiple dimensions. Each element of the vector is a scalar random variable. Each element has either a finite number of observed empirical values or a finite or infinite number of potential values. The potential values are specified by a theoretical joint probability distribution. Intuitively, the cross-covariance matrix generalizes the notion of covariance to multiple dimensions. The cross-covariance matrix of two random vectors \mathbf{X} and \mathbf{Y} is typically denoted by \operatorname{K}_{\mathbf{X}\mathbf{Y}} or \operatorname{cov}(\mathbf{X}, \mathbf{Y}).

Definition

For random vectors \mathbf{X} and \mathbf{Y}, each containing random elements whose expected value and variance exist, the cross-covariance matrix of \mathbf{X} and \mathbf{Y} is defined by

\operatorname{K}_{\mathbf{X}\mathbf{Y}} = \operatorname{cov}(\mathbf{X}, \mathbf{Y}) = \operatorname{E}[(\mathbf{X} - \operatorname{E}[\mathbf{X}])(\mathbf{Y} - \operatorname{E}[\mathbf{Y}])^{\rm T}]

where \operatorname{E}[\mathbf{X}] and \operatorname{E}[\mathbf{Y}] are vectors containing the expected values of \mathbf{X} and \mathbf{Y}. The vectors \mathbf{X} and \mathbf{Y} need not have the same dimension, and either might be a scalar value. The cross-covariance matrix is the matrix whose (i,j) entry is the covariance between the i-th element of \mathbf{X} and the j-th element of \mathbf{Y}. This gives the following component-wise definition of the cross-covariance matrix:

\operatorname{K}_{X_i Y_j} = \operatorname{cov}[X_i, Y_j] = \operatorname{E}[(X_i - \operatorname{E}[X_i])(Y_j - \operatorname{E}[Y_j])]
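
To make the definition concrete, the following NumPy sketch estimates \operatorname{K}_{\mathbf{X}\mathbf{Y}} from joint samples and checks the result against the cross block of np.cov. The dimensions (3 and 2) and the simulated data are illustrative assumptions, not part of the definition.

```python
import numpy as np

# Minimal sketch: estimate K_XY = E[(X - E[X])(Y - E[Y])^T] from n joint samples.
rng = np.random.default_rng(0)
n = 10_000
X = rng.normal(size=(n, 3))              # n samples of a 3-dimensional X
Y = X[:, :2] + rng.normal(size=(n, 2))   # 2-dimensional Y, correlated with X

Xc = X - X.mean(axis=0)                  # subtract the sample mean of X
Yc = Y - Y.mean(axis=0)                  # subtract the sample mean of Y
K_XY = Xc.T @ Yc / (n - 1)               # 3 x 2 sample cross-covariance matrix

# Cross-check: the off-diagonal block of np.cov on the stacked data is K_XY.
full = np.cov(np.hstack([X, Y]), rowvar=False)
assert np.allclose(K_XY, full[:3, 3:])
print(K_XY)
```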

Example

For example, if \mathbf{X} = \left(X_1, X_2, X_3\right)^{\rm T} and \mathbf{Y} = \left(Y_1, Y_2\right)^{\rm T} are random vectors, then \operatorname{cov}(\mathbf{X}, \mathbf{Y}) is a 3 \times 2 matrix whose (i,j)-th entry is \operatorname{cov}(X_i, Y_j).
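
Written out entry by entry, following the component-wise definition above, this matrix is

\operatorname{cov}(\mathbf{X}, \mathbf{Y}) = \begin{pmatrix} \operatorname{cov}(X_1, Y_1) & \operatorname{cov}(X_1, Y_2) \\ \operatorname{cov}(X_2, Y_1) & \operatorname{cov}(X_2, Y_2) \\ \operatorname{cov}(X_3, Y_1) & \operatorname{cov}(X_3, Y_2) \end{pmatrix}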

Properties

For the cross-covariance matrix, the following basic properties apply:

\operatorname{cov}(\mathbf{X}, \mathbf{Y}) = \operatorname{cov}(\mathbf{Y}, \mathbf{X})^{\rm T}

\operatorname{cov}(\mathbf{X}_1 + \mathbf{X}_2, \mathbf{Y}) = \operatorname{cov}(\mathbf{X}_1, \mathbf{Y}) + \operatorname{cov}(\mathbf{X}_2, \mathbf{Y})

\operatorname{cov}(A\mathbf{X} + \mathbf{a}, B^{\rm T}\mathbf{Y} + \mathbf{b}) = A \operatorname{cov}(\mathbf{X}, \mathbf{Y}) B

If \mathbf{X} and \mathbf{Y} are independent (or, less restrictively, if every element of \mathbf{X} is uncorrelated with every element of \mathbf{Y}), then \operatorname{cov}(\mathbf{X}, \mathbf{Y}) = 0_{p \times q}

where \mathbf{X}, \mathbf{X}_1 and \mathbf{X}_2 are random p \times 1 vectors, \mathbf{Y} is a random q \times 1 vector, \mathbf{a} is a q \times 1 vector, \mathbf{b} is a p \times 1 vector, A and B are q \times p matrices of constants, and 0_{p \times q} is a p \times q matrix of zeroes.
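
These identities can be checked numerically. The sketch below verifies the transpose and affine-transformation properties for sample cross-covariance matrices; the dimensions p = 3, q = 2, the matrices A and B, and the simulated data are arbitrary choices made only for illustration.

```python
import numpy as np

# Numerical check of the transpose and affine-transformation properties above.
rng = np.random.default_rng(1)
n, p, q = 100_000, 3, 2
X = rng.normal(size=(n, p))
Y = X[:, :q] + rng.normal(size=(n, q))   # q-dimensional Y, correlated with X

def cov_xy(U, V):
    """Sample estimate of cov(U, V) = E[(U - E[U])(V - E[V])^T]."""
    Uc, Vc = U - U.mean(axis=0), V - V.mean(axis=0)
    return Uc.T @ Vc / (len(U) - 1)

A = rng.normal(size=(q, p))              # q x p constant matrices
B = rng.normal(size=(q, p))
a = rng.normal(size=q)                   # q x 1 vector (stored as a length-q array)
b = rng.normal(size=p)                   # p x 1 vector (stored as a length-p array)

# cov(X, Y) = cov(Y, X)^T
assert np.allclose(cov_xy(X, Y), cov_xy(Y, X).T)

# cov(AX + a, B^T Y + b) = A cov(X, Y) B   (rows are samples, hence X @ A.T and Y @ B)
lhs = cov_xy(X @ A.T + a, Y @ B + b)
rhs = A @ cov_xy(X, Y) @ B
assert np.allclose(lhs, rhs)
```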

Definition for complex random vectors

If \mathbf{Z} and \mathbf{W} are complex random vectors, the definition of the cross-covariance matrix is slightly changed. Transposition is replaced by Hermitian transposition:

\operatorname{K}_{\mathbf{Z}\mathbf{W}} = \operatorname{cov}(\mathbf{Z}, \mathbf{W}) = \operatorname{E}[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{W} - \operatorname{E}[\mathbf{W}])^{\rm H}]

For complex random vectors, another matrix called the pseudo-cross-covariance matrix is defined as follows:

\operatorname{J}_{\mathbf{Z}\mathbf{W}} = \operatorname{cov}(\mathbf{Z}, \overline{\mathbf{W}}) = \operatorname{E}[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{W} - \operatorname{E}[\mathbf{W}])^{\rm T}]
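
A minimal NumPy sketch of both matrices for sampled complex vectors might look as follows; the simulated data are illustrative assumptions, and the only difference between the two estimates is whether the second factor is conjugated.

```python
import numpy as np

# Sample estimates of the cross-covariance and pseudo-cross-covariance matrices.
rng = np.random.default_rng(2)
n = 50_000
Z = rng.normal(size=(n, 3)) + 1j * rng.normal(size=(n, 3))
W = Z[:, :2] + rng.normal(size=(n, 2)) + 1j * rng.normal(size=(n, 2))

Zc = Z - Z.mean(axis=0)
Wc = W - W.mean(axis=0)

K_ZW = Zc.T @ Wc.conj() / (n - 1)   # cross-covariance: Hermitian transpose (conjugated second factor)
J_ZW = Zc.T @ Wc / (n - 1)          # pseudo-cross-covariance: plain transpose, no conjugation
```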

Uncorrelatedness

Two random vectors \mathbf{X} and \mathbf{Y} are called uncorrelated if their cross-covariance matrix \operatorname{K}_{\mathbf{X}\mathbf{Y}} is a zero matrix. Complex random vectors \mathbf{Z} and \mathbf{W} are called uncorrelated if their cross-covariance matrix and pseudo-cross-covariance matrix are both zero, i.e. if \operatorname{K}_{\mathbf{Z}\mathbf{W}} = \operatorname{J}_{\mathbf{Z}\mathbf{W}} = 0.
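
As a quick illustrative check (the simulated Gaussian data are an assumption of this sketch, not part of the definition), independently generated vectors have a sample cross-covariance matrix close to the zero matrix:

```python
import numpy as np

# Independently drawn X and Y: the sample cross-covariance is near zero
# (it equals the zero matrix exactly only in the limit of infinitely many samples).
rng = np.random.default_rng(3)
n = 200_000
X = rng.normal(size=(n, 3))
Y = rng.normal(size=(n, 2))              # drawn independently of X
Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
K_XY = Xc.T @ Yc / (n - 1)
print(np.abs(K_XY).max())                # small, shrinking roughly like 1/sqrt(n)
```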
