#1
atferrari
Hello, this is my first time in this forum.
Two microphones at different distances from an audio source are receiving the same steady signal (frequency and amplitude unchanged).
A microprocessor takes successive samples of both channels in batches of N samples. The time between samples is ts microseconds.
Once a batch is ready, I want the micro to calculate the time difference (lag) between the two incoming signals.
I know I have to use autocorrelation here (that is cross-correlation of a signal with itself, right?).
Despite reading a lot, I could not work out which formula has to be solved, and therefore how to program the micro to calculate the lag between the two channels.
Can anyone tell me:
What is the actual formula to be implemented?
What are the successive steps required to process the data set just acquired?
Is normalization needed? If so, how do you actually do it, and when?
I am not into high-level maths, so bear with me; math notation is sometimes quite hard for me to follow.
Thanks for any help.
Agustín Tomás
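For concreteness, below is a minimal brute-force sketch in C of the cross-correlation idea being asked about. The batch size N, the sampling period TS_US, the search window MAX_LAG and the helper name find_lag are illustrative assumptions, not values from the post: the two batches are compared at every candidate lag, the lag with the largest correlation sum is taken as the delay in samples, and multiplying by ts converts that to time.

```c
#include <stdio.h>
#include <stdint.h>

/* Illustrative values only -- use the real batch size and sampling period. */
#define N        256      /* samples per batch */
#define TS_US    50.0f    /* time between samples, ts, in microseconds */
#define MAX_LAG  64       /* widest lag (in samples) worth searching */

/* Brute-force cross-correlation: returns the lag (in samples) at which
 * channel b best matches channel a.  A positive result means b lags a. */
static int find_lag(const int16_t a[N], const int16_t b[N])
{
    /* Remove each channel's DC offset (mean) so a constant bias
     * does not dominate the correlation sums. */
    float mean_a = 0.0f, mean_b = 0.0f;
    for (int n = 0; n < N; n++) {
        mean_a += a[n];
        mean_b += b[n];
    }
    mean_a /= N;
    mean_b /= N;

    int   best_lag = 0;
    float best_r   = -1e30f;

    for (int lag = -MAX_LAG; lag <= MAX_LAG; lag++) {
        float r = 0.0f;
        for (int n = 0; n < N; n++) {
            int m = n + lag;
            if (m >= 0 && m < N)               /* only overlapping samples */
                r += (a[n] - mean_a) * (b[m] - mean_b);
        }
        if (r > best_r) {
            best_r   = r;
            best_lag = lag;
        }
    }
    return best_lag;
}

int main(void)
{
    int16_t cha[N] = {0}, chb[N] = {0};   /* filled by the ADC in a real system */

    int   lag_samples = find_lag(cha, chb);
    float lag_us      = lag_samples * TS_US;   /* convert sample lag to time */

    printf("lag = %d samples = %.1f us\n", lag_samples, lag_us);
    return 0;
}
```

On the normalization question: since only the location of the correlation peak matters here, subtracting each channel's mean (as above) usually matters more than full normalization; dividing by the signal energies mainly helps when the two channels have very different amplitudes or when peak values are compared across batches.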