Cross-covariance
{{refimprove|date=December 2016}}
In [[probability]] and [[statistics]], given two [[stochastic processes]] <math>\left\{X_t\right\}</math> and <math>\left\{Y_t\right\}</math>, the '''cross-covariance''' is a function that gives the [[covariance]] of one process with the other at pairs of time points. With the usual notation <math>\operatorname E</math> for the [[expected value|expectation]] [[Operator (mathematics)|operator]], if the processes have the [[mean]] functions <math>\mu_X(t) = \operatorname E[X_t]</math> and <math>\mu_Y(t) = \operatorname E[Y_t]</math>, then the cross-covariance is given by
:<math>\operatorname{K}_{XY}(t_1,t_2) = \operatorname{cov} (X_{t_1}, Y_{t_2}) = \operatorname{E}[(X_{t_1} - \mu_X(t_1))(Y_{t_2} - \mu_Y(t_2))] = \operatorname{E}[X_{t_1} Y_{t_2}] - \mu_X(t_1) \mu_Y(t_2).</math>
Cross-covariance is related to the more commonly used [[cross-correlation]] of the processes in question.

In the case of two [[random vector]]s <math>\mathbf{X} = (X_1, X_2, \ldots, X_p)^{\rm T}</math> and <math>\mathbf{Y} = (Y_1, Y_2, \ldots, Y_q)^{\rm T}</math>, the cross-covariance would be a <math>p \times q</math> [[Matrix (mathematics)|matrix]] <math>\operatorname{K}_{XY}</math> (often denoted <math>\operatorname{cov}(X,Y)</math>) with entries <math>\operatorname{K}_{XY}(j,k) = \operatorname{cov}(X_j, Y_k).</math> Thus the term ''cross-covariance'' is used in order to distinguish this concept from the covariance of a random vector <math>\mathbf{X}</math>, which is understood to be the matrix of covariances between the scalar components of <math>\mathbf{X}</math> itself.

In [[signal processing]], the cross-covariance is often called '''cross-correlation''' and is a measure of similarity of two [[Signal|signals]], commonly used to find features in an unknown signal by comparing it to a known one. It is a function of the relative time between the signals, is sometimes called the ''sliding dot product'', and has applications in [[pattern recognition]] and [[cryptanalysis]].
==Cross-covariance of stochastic processes==
The definition of cross-covariance of random vectors may be generalized to [[stochastic process|stochastic processes]] as follows:
===Definition===
Let <math>\{ X(t) \}</math> and <math>\{ Y(t) \}</math> denote stochastic processes. Then the cross-covariance function of the processes <math>K_{XY}</math> is defined by:<ref name=KunIlPark>Kun Il Park, ''Fundamentals of Probability and Stochastic Processes with Applications to Communications'', Springer, 2018, ISBN 978-3-319-68074-3</ref>{{rp|p.172}}
{{Equation box 1
|indent = :
|title=
|equation = {{NumBlk||<math>\operatorname{K}_{XY}(t_1,t_2) \stackrel{\mathrm{def}}{=}\ \operatorname{cov} (X_{t_1}, Y_{t_2}) = \operatorname{E} \left[ \left( X(t_1)- \mu_X(t_1) \right) \left( Y(t_2)- \mu_Y(t_2) \right) \right]</math>|{{EquationRef|Eq.1}}}}
|cellpadding= 6
|border
|border colour = #0073CF
|background colour=#F5FFFA}}
where <math>\mu_X(t) = \operatorname{E}\left[X(t)\right]</math> and <math>\mu_Y(t) = \operatorname{E}\left[Y(t)\right]</math>.
If the processes are [[complex-valued]] stochastic processes, the second factor needs to be [[complex conjugate]]d:
:<math>\operatorname{K}_{XY}(t_1,t_2) \stackrel{\mathrm{def}}{=}\ \operatorname{cov} (X_{t_1}, Y_{t_2}) = \operatorname{E} \left[ \left( X(t_1)- \mu_X(t_1) \right) \overline{\left( Y(t_2)- \mu_Y(t_2) \right)} \right]</math>
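As a quick numerical illustration of this definition, the cross-covariance at a fixed pair of time points can be approximated by averaging over many independent realizations of the two processes. The following sketch is an illustrative assumption, not from the article: it uses NumPy, a random walk <code>X</code>, and a noisy copy <code>Y</code>, for which the exact value <math>\operatorname{K}_{XY}(t_1,t_2)</math> is easy to compute by hand.

```python
import numpy as np

rng = np.random.default_rng(42)
R, T = 200_000, 10  # number of realizations, time steps per realization

# X: random walk with unit-variance steps; Y: X plus independent noise,
# so K_XY(t1, t2) = cov(X_{t1}, X_{t2}) = (number of shared steps)
X = np.cumsum(rng.standard_normal((R, T)), axis=1)
Y = X + rng.standard_normal((R, T))

t1, t2 = 4, 7  # 0-based indices, i.e. 5 and 8 accumulated steps

# ensemble estimate of E[(X_{t1} - mu_X(t1)) (Y_{t2} - mu_Y(t2))]
est = np.mean((X[:, t1] - X[:, t1].mean()) * (Y[:, t2] - Y[:, t2].mean()))
# theoretical value here: min(5, 8) = 5
```

With 200,000 realizations the estimate lands well within a few hundredths of the exact value 5.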
===Definition for jointly WSS processes===
If <math>\left\{X_t\right\}</math> and <math>\left\{Y_t\right\}</math> are jointly [[wide-sense stationary]], then the following are true:

:<math>\mu_X(t_1) = \mu_X(t_2) \triangleq \mu_X</math> for all <math>t_1,t_2</math>,

:<math>\mu_Y(t_1) = \mu_Y(t_2) \triangleq \mu_Y</math> for all <math>t_1,t_2</math>

and

:<math>\operatorname{K}_{XY}(t_1,t_2) = \operatorname{K}_{XY}(t_2 - t_1, 0)</math> for all <math>t_1,t_2</math>.

By setting <math>\tau = t_2 - t_1</math> (the time lag, or the amount of time by which the signal has been shifted), we may define
:<math>\operatorname{K}_{XY}(\tau) = \operatorname{K}_{XY}(t_2 - t_1) \triangleq \operatorname{K}_{XY}(t_1,t_2)</math>.
The cross-covariance function of two jointly WSS processes is therefore given by:
{{Equation box 1
|indent = :
|title=
|equation = {{NumBlk||<math>\operatorname{K}_{XY}(\tau) = \operatorname{cov} (X_{t}, Y_{t-\tau}) = \operatorname{E}[(X_t - \mu_X)(Y_{t- \tau} - \mu_Y)] = \operatorname{E}[X_t Y_{t-\tau}] - \mu_X \mu_Y</math>|{{EquationRef|Eq.2}}}}
|cellpadding= 6
|border
|border colour = #0073CF
|background colour=#F5FFFA}}
which is equivalent to
:<math>\operatorname{K}_{XY}(\tau) = \operatorname{cov} (X_{t+\tau}, Y_{t}) = \operatorname{E}[(X_{t+ \tau} - \mu_X)(Y_{t} - \mu_Y)] = \operatorname{E}[X_{t+\tau} Y_t] - \mu_X \mu_Y</math>.
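For jointly WSS processes, the ensemble expectation above can be estimated by a time average over a single long realization. The sketch below is an illustrative assumption (the delayed-and-attenuated signal <code>y</code> and all names are invented for the example): using the <math>\operatorname{K}_{XY}(\tau) = \operatorname{E}[X_{t+\tau} Y_t] - \mu_X \mu_Y</math> form, a copy of <code>x</code> delayed by 3 samples produces a peak at lag <math>\tau = -3</math>.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.standard_normal(n)            # zero-mean white noise
y = np.zeros(n)
y[3:] = 0.8 * x[:-3]                  # y[t] = 0.8 * x[t - 3] ...
y += 0.5 * rng.standard_normal(n)     # ... plus independent noise

def cross_cov(x, y, tau):
    """Time-average estimate of K_XY(tau) = E[X_{t+tau} Y_t] - mu_X mu_Y."""
    mx, my = x.mean(), y.mean()
    if tau >= 0:
        xs, ys = x[tau:], y[:len(y) - tau]
    else:
        xs, ys = x[:len(x) + tau], y[-tau:]
    return float(np.mean((xs - mx) * (ys - my)))

lags = list(range(-6, 7))
k = [cross_cov(x, y, tau) for tau in lags]
best = lags[int(np.argmax(np.abs(k)))]   # expect -3: y lags x by 3 samples
```

The estimated function is near zero everywhere except at the lag matching the built-in delay, where it approaches the true value 0.8.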
===Uncorrelatedness===

Two stochastic processes <math>\left\{X_t\right\}</math> and <math>\left\{Y_t\right\}</math> are called '''uncorrelated''' if their covariance <math>\operatorname{K}_{XY}(t_1,t_2)</math> is zero for all times.<ref name=KunIlPark/>{{rp|p.142}} Formally:

:<math>\left\{X_t\right\},\left\{Y_t\right\} \text{ uncorrelated} \quad \iff \quad \operatorname{K}_{XY}(t_1,t_2) = 0 \quad \forall t_1,t_2</math>.

==Cross-covariance of deterministic signals==
The cross-covariance is also relevant in [[signal processing]] where the cross-covariance between two [[wide-sense stationary]] [[random processes]] can be estimated by averaging the product of samples measured from one process and samples measured from the other (and its time shifts). The samples included in the average can be an arbitrary subset of all the samples in the signal (e.g., samples within a finite time window or a [[sampling (statistics)|sub-sampling]] of one of the signals). For a large number of samples, the average converges to the true covariance.
Cross-covariance may also refer to a '''"deterministic" cross-covariance''' between two signals. This consists of summing over ''all'' time indices. For example, for [[discrete-time]] signals <math>f[k]</math> and <math>g[k]</math> the cross-covariance is defined as
:<math>(f\star g)[n] \ \stackrel{\mathrm{def}}{=}\ \sum_{k\in \mathbb{Z}} \overline{f[k]} g[n+k] = \sum_{k\in \mathbb{Z}} \overline{f[k-n]} g[k]</math>
where the line indicates that the [[complex conjugate]] is taken when the signals are [[complex-valued]].
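The two sums in the definition index the same products in different ways. A direct NumPy sketch (the function name and test arrays are illustrative assumptions) computes both forms for finite-length signals, treating samples outside each array as zero:

```python
import numpy as np

def cross_cov_det(f, g):
    """(f star g)[n] = sum_k conj(f[k]) g[n+k], for n = -(len(f)-1) .. len(g)-1,
    with both signals taken as zero outside their sampled range."""
    M, N = len(f), len(g)
    return np.array([sum(np.conj(f[k]) * g[n + k]
                         for k in range(M) if 0 <= n + k < N)
                     for n in range(-(M - 1), N)])

f = np.array([1 + 1j, 2.0, 0.5j])
g = np.array([0.5, -1j, 3.0, 1.0])
c = cross_cov_det(f, g)

# the equivalent shifted form sum_k conj(f[k-n]) g[k] gives the same values
c2 = np.array([sum(np.conj(f[k - n]) * g[k]
                   for k in range(len(g)) if 0 <= k - n < len(f))
               for n in range(-(len(f) - 1), len(g))])
assert np.allclose(c, c2)
```

For finite signals this is the same quantity NumPy's `np.correlate(g, f, mode='full')` computes, since that routine conjugates its second argument.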
For [[Continuous function|continuous functions]] <math>f(x)</math> and <math>g(x)</math> the (deterministic) cross-covariance is defined as

:<math>(f\star g)(x) \ \stackrel{\mathrm{def}}{=}\ \int \overline{f(t)} g(x+t)\,dt = \int \overline{f(t-x)} g(t)\,dt</math>.
==Properties==

The (deterministic) cross-covariance of two continuous signals is related to the [[convolution]] by

:<math>(f\star g)(t) = \left(\overline{f(-\tau)} * g(\tau)\right)(t)</math>

and the (deterministic) cross-covariance of two discrete-time signals is related to the [[Convolution#Discrete convolution|discrete convolution]] by

:<math>(f\star g)[n] = \left(\overline{f[-k]} * g[k]\right)[n]</math>.
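This relation is straightforward to verify numerically. In the sketch below (a hypothetical check with illustrative arrays, using NumPy), conjugating and time-reversing <code>f</code> and then convolving with <code>g</code> reproduces the deterministic cross-covariance computed directly from its defining sum:

```python
import numpy as np

def cross_cov_det(f, g):
    """Deterministic cross-covariance (f star g)[n] = sum_k conj(f[k]) g[n+k]."""
    M, N = len(f), len(g)
    return np.array([sum(np.conj(f[k]) * g[n + k]
                         for k in range(M) if 0 <= n + k < N)
                     for n in range(-(M - 1), N)])

f = np.array([1.0, -2.0, 0.5j])
g = np.array([2.0, 1j, -1.0, 3.0])

# conj(f(-t)) * g: conjugate and time-reverse f, then convolve with g
via_conv = np.convolve(np.conj(f[::-1]), g)
direct = cross_cov_det(f, g)
assert np.allclose(via_conv, direct)
```

The index shift works out because reversing <code>f</code> turns the sum over <math>k</math> in the cross-covariance into exactly the sum over lags in the convolution.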