Cross-correlation

In signal processing, cross-correlation is a measure of similarity of two series as a function of the displacement of one relative to the other. This is also known as a sliding dot product or sliding inner product. It is commonly used for searching a long signal for a shorter, known feature. It has applications in pattern recognition, single particle analysis, electron tomography, averaging, cryptanalysis, and neurophysiology. The cross-correlation is similar in nature to the convolution of two functions. In an autocorrelation, which is the cross-correlation of a signal with itself, there will always be a peak at a lag of zero, and its size will be the signal energy.

For deterministic continuous-time signals f and g, the cross-correlation is defined as

(f \star g)(\tau) \triangleq \int_{-\infty}^{\infty} \overline{f(t)}\, g(t+\tau)\, dt   (Eq.1)

and for discrete sequences as

(f \star g)[n] \triangleq \sum_{m=-\infty}^{\infty} \overline{f[m]}\, g[m+n]   (Eq.2)

For random vectors \mathbf{X} and \mathbf{Y}, the cross-correlation matrix is

\operatorname{R}_{\mathbf{X}\mathbf{Y}} \triangleq \operatorname{E}\left[\mathbf{X}\mathbf{Y}^{\mathsf{T}}\right]   (Eq.3)

For stochastic processes X_t and Y_t, the cross-correlation and cross-covariance functions are

\operatorname{R}_{XY}(t_1, t_2) = \operatorname{E}\left[X_{t_1} \overline{Y_{t_2}}\right]   (Eq.4)

\operatorname{K}_{XY}(t_1, t_2) = \operatorname{E}\left[\left(X_{t_1} - \mu_X(t_1)\right)\overline{\left(Y_{t_2} - \mu_Y(t_2)\right)}\right]   (Eq.5)

and, for jointly wide-sense-stationary processes, they depend only on the lag \tau:

\operatorname{R}_{XY}(\tau) = \operatorname{E}\left[X_t \overline{Y_{t+\tau}}\right]   (Eq.6)

\operatorname{K}_{XY}(\tau) = \operatorname{E}\left[\left(X_t - \mu_X\right)\overline{\left(Y_{t+\tau} - \mu_Y\right)}\right]   (Eq.7)
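As an illustration of the discrete definition (Eq.2), here is a minimal sketch of cross-correlation as a sliding inner product for finite sequences; the function name and the choice to enumerate only lags where the sum has at least one term are my own:

```python
import numpy as np

def cross_correlation(f, g):
    """(f ⋆ g)[n] = sum_m conj(f[m]) * g[m + n]  (Eq.2),
    evaluated for finite sequences at lags n = -(len(f)-1) .. len(g)-1."""
    f, g = np.asarray(f), np.asarray(g)
    lags = range(-(len(f) - 1), len(g))
    return np.array([
        sum(np.conj(f[m]) * g[m + n]
            for m in range(len(f)) if 0 <= m + n < len(g))
        for n in lags
    ])

f = [1, 2, 3]
g = [0, 1, 0.5]
print(cross_correlation(f, g))  # values at lags n = -2 .. 2
```

Note that the autocorrelation `cross_correlation(f, f)` peaks at zero lag, where its value is the signal energy `sum(|f[m]|^2)` (here 1 + 4 + 9 = 14), as stated above.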
In probability and statistics, the term cross-correlations refers to the correlations between the entries of two random vectors X and Y, while the correlations of a random vector X are the correlations between the entries of X itself, those forming the correlation matrix of X. If each of X and Y is a scalar random variable which is realized repeatedly in a time series, then the correlations of the various temporal instances of X are known as autocorrelations of X, and the cross-correlations of X with Y across time are temporal cross-correlations. In probability and statistics, the definition of correlation always includes a standardising factor, so that correlations take values between −1 and +1.

If X and Y are two independent random variables with probability density functions f and g, respectively, then the probability density of the difference Y − X is formally given by the cross-correlation (in the signal-processing sense) f ⋆ g; however, this terminology is not used in probability and statistics. In contrast, the convolution f ∗ g (equivalent to the cross-correlation of f(t) and g(−t)) gives the probability density function of the sum X + Y. For continuous functions f and g, this cross-correlation is as defined in Eq.1 above.
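A discrete analogue of this fact can be sketched with NumPy (variable names are illustrative): for independent discrete random variables, cross-correlating the two probability mass functions gives the distribution of the difference, while convolving them gives the distribution of the sum.

```python
import numpy as np

# PMFs of two independent fair dice X and Y over faces 1..6
px = np.full(6, 1 / 6)
py = np.full(6, 1 / 6)

# Cross-correlation (px ⋆ py)[d] = sum_m px[m] * py[m + d]
# gives P(Y - X = d) for lags d = -5 .. 5
diff_pmf = np.correlate(py, px, mode="full")

# Convolution gives P(X + Y = s) for s = 2 .. 12
sum_pmf = np.convolve(px, py)

print(diff_pmf[5])  # P(Y - X = 0) = 6/36
print(sum_pmf[5])   # P(X + Y = 7) = 6/36
```

Both outputs are triangular distributions, consistent with P(Y − X = d) = (6 − |d|)/36 for two fair dice.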
