05-31-2017 10:43 AM
Hi,
I am trying to compute the autocorrelation function of a set of data from scattered photons in real time, but only at selected delay times, over a time period of 100 ms. The formula I am using is:

g2(Δn = τ·fs) = <n(i)·n(i+Δn)> / (<n(i)>·<n(i)>), where

n(i) is the instantaneous photon count in one bin, i.e. n(i) = N(i+1) − N(i),
N(i) is the total number of counts that have been stored up to the ith sample,
τ is the delay time,
< > is the time average over n_avg = n_int − Δn samples,
n_int = fs · t_int
(fs is the sampling frequency, and t_int is the integration time, i.e. 100 ms).
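To make the indexing concrete, here is how I would write the formula above in Python/NumPy as a reference; this is only a sketch of my formula, and what I am after is the LabVIEW equivalent:

import numpy as np

def g2_at_delays(n, delays):
    """g2(Dn) = <n(i) n(i+Dn)> / (<n(i)> <n(i)>), with each time
    average taken over n_avg = n_int - Dn samples, as defined above.
    Every delay Dn (in samples) must be smaller than n_int."""
    n = np.asarray(n, dtype=float)
    n_int = n.size
    g2 = np.empty(len(delays))
    for k, dn in enumerate(delays):
        n_avg = n_int - dn
        a = n[:n_avg]            # n(i)
        b = n[dn:dn + n_avg]     # n(i + Dn)
        g2[k] = np.mean(a * b) / (np.mean(a) ** 2)
    return g2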
My goal is to compute the autocorrelation function at these selected delay times (tau values) over each integration period. For example, I want to calculate the correlation at 40 delay points, which should give me 40 autocorrelation values averaged over the integration time; I then want to save those points, display the curve in real time, and keep doing this continuously for successive integration periods (see the sketch below).
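The continuous part, again only as a Python sketch of what I want the VI to do, using g2_at_delays from above: read_counts() here is a hypothetical stand-in for my actual counter read (the part I already have working), and the fs value is just an example:

import numpy as np

fs = 400_000.0              # sampling frequency in Hz (example value)
t_int = 0.1                 # integration time, 100 ms
n_int = int(fs * t_int)     # number of bins per integration period

# 40 quasi-log-spaced delays in samples (duplicates are removed,
# so the actual count can come out slightly under 40)
delays = np.unique(np.logspace(0, np.log10(n_int // 2), 40).astype(int))

def read_counts(num_bins):
    """Hypothetical stand-in: return the next num_bins photon
    counts from the hardware counter."""
    raise NotImplementedError

while True:
    n = read_counts(n_int)           # one 100 ms block of bin counts
    g2 = g2_at_delays(n, delays)     # ~40 correlation values
    taus = delays / fs               # delay times in seconds
    # ...save (taus, g2) and update the real-time plot here...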
Is it possible to design a VI in LabVIEW for this? If so, how can I achieve this? Can somebody provide me with a template?
This is my first time using LabVIEW, so I am new to all of this, and I would be grateful for any help. So far I have been able to acquire the photon counts.
Thanks
05-31-2017 01:34 PM
Hey, I don't know if this will be helpful or not, but have you tried out the autocorrelation VI in LabVIEW?
https://zone.ni.com/reference/en-XX/help/371361G-01/lvanls/autocorrelation/
Also, it sounds like you need to keep a running average of the last 40 points or so; check out this example:
http://www.ni.com/example/30229/en/
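In text form (not LabVIEW, just to show the idea), a running boxcar average over the most recent correlation curves could look like this sketch; the buffer depth of 10 is an arbitrary choice:

from collections import deque
import numpy as np

history = deque(maxlen=10)   # keep the most recent 10 curves

def running_average(new_curve):
    """Append the latest g2 curve and return the boxcar average
    of the curves currently in the buffer, for a smoother
    real-time display."""
    history.append(np.asarray(new_curve, dtype=float))
    return np.mean(history, axis=0)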