Cross-correlation normalization

Hello! I'm trying to write a program that finds a pattern in a signal. The signal is intermittent and quasi-stationary. The task is to obtain an adequate correlation coefficient (CC). Before the main calculations, I remove the mean from the samples:

x = x - mean(x)

To calculate the CC, I use the FFT-based approach:

corr(x, y) = F'( F(x) * conj(F(y)) ),

where F() is the forward FFT, F'() is the inverse FFT, and conj() is the complex conjugate. To make the FFT applicable, I zero-pad the source arrays x and y up to a power of 2, like this:

[000000xxxxxxx000]
[yyyyyyy000000000]

After executing the corr() procedure, I get a cross-correlation function whose maximum appears to be the desired CC, and the position of the maximum corresponds to the shift of the pattern within the signal. However, to get the CC, the resulting maximum must be normalized (to obtain a value in the range from -1 to 1). The question is: how do I properly normalize the resulting maximum?
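For reference, a minimal sketch of the pipeline described above in Python/NumPy (my own illustration, not the actual source code; `fft_xcorr` and the test data are hypothetical):

```python
import numpy as np

def fft_xcorr(x, y):
    """Unnormalized cross-correlation of signal x with pattern y via FFT,
    following corr(x, y) = F'( F(x) * conj(F(y)) )."""
    x = np.asarray(x, dtype=float) - np.mean(x)   # remove the mean, as above
    y = np.asarray(y, dtype=float) - np.mean(y)
    # Zero-pad to the next power of 2 large enough for the linear correlation.
    n = int(2 ** np.ceil(np.log2(len(x) + len(y) - 1)))
    X = np.fft.rfft(x, n)                          # rfft assumes real inputs
    Y = np.fft.rfft(y, n)
    return np.fft.irfft(X * np.conj(Y), n)

# Usage: the argmax gives the candidate shift of the pattern in the signal.
rng = np.random.default_rng(0)
x = rng.standard_normal(200)                       # test signal
y = x[50:90]                                       # pattern hidden at offset 50
c = fft_xcorr(x, y)
lag = int(np.argmax(c))                            # expected lag: 50
peak = c[lag]                                      # unnormalized maximum
```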

P.S. If necessary, I can provide the source code.

Author: progzdeveloper, 2013-10-26

1 answer

Hmm, IronVbif, you seem to be right: the system is discrete. You can either apply the Pearson correlation criterion to the original problem directly (and get the result right away), or you can get a normalized correlation by dividing the resulting function by the square root of its variance: normcov = f(x) / sqrt( sum( (f(x) - mean(f(x)))^2 ) / n ). That seems to be the answer.
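For illustration, both suggestions might be sketched like this, continuing the Python/NumPy sketch from the question (hypothetical helper names; note that the second formula standardizes the function rather than strictly bounding it to [-1, 1]):

```python
import numpy as np

# Option 1, Pearson-style: divide the correlation function by the product of
# the norms of the mean-removed inputs. By the Cauchy-Schwarz inequality,
# every value, including the peak, then lies in [-1, 1] (the peak reaches 1
# only when the pattern exactly matches a segment of the signal).
def pearson_normalize(c, x, y):
    x = x - np.mean(x)
    y = y - np.mean(y)
    return c / np.sqrt(np.sum(x ** 2) * np.sum(y ** 2))

# Option 2, taking the answer's formula literally: divide the correlation
# function by the square root of its own variance. This is a z-score; the
# values are not guaranteed to stay within [-1, 1].
def variance_normalize(c):
    return c / np.sqrt(np.mean((c - np.mean(c)) ** 2))
```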

Author: bear11, 2013-10-28 13:59:20