time sampling rate

I have a question about data recorded with an NI cDAQ 9234 (signals from accelerometer transducers). I am recording with a rate of 2048 Hz, 1024 samples to read, and the continuous-samples acquisition mode.
 
As I understand it, the time between two samples should be 1/2048 = 0.00048828125 s, but in the file produced by "Write to Measurement File" the time step is 0.000488 or 0.000489, and it changes in a seemingly random way. (I think this is normal, because the software has to make an approximation since the sample interval is not a round number.)
 
My questions are whether there is a way to correct this apparently random variation and, more importantly, whether the software actually works with the correct value (0.00048828125 s) and the problem is only in the time stamps written to the file.
Finally, is there a guide where I can study these issues in depth?
 
I am attaching a JPEG that shows the problem.
 
Thanks a lot for your time and help.
 

1. For the record, it won't be truly "random" whether the delta time shows as 0.000488 or 0.000489.  There will be a regular and repeatable pattern based on the specifics of the quantization error inherent in the sample clock rate *and* the precision and rounding involved in "Write to Measurement File".
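You can see that pattern without any hardware at all.  Here's a minimal Python sketch, assuming the measurement file stores its time column with six decimal digits (which is what your 0.000488 / 0.000489 values suggest):

```python
fs = 2048.0                      # requested (and actual) sample rate, Hz
dt = 1.0 / fs                    # true sample interval = 0.00048828125 s

# Time stamps as a file would store them when limited to 6 decimal places.
stored = [float("%.6f" % (k * dt)) for k in range(65)]

# Differences between consecutive stored time stamps.
deltas = ["%.6f" % (stored[k + 1] - stored[k]) for k in range(64)]
print(deltas)
```

The printed sequence is a fixed, repeatable mix of 0.000488 and 0.000489, and over the block printed here the steps average out to exactly the true interval of 0.00048828125 s.  Nothing random about it -- it's just the rounding of a cumulative time.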

 

2. Your device is constrained to a discrete set of available *actual* sample rates, regardless of what you request.  When you request something that doesn't exactly match an available rate, DAQmx will coerce to an adjacent available rate without returning an error.  I expect it will generally go to the next higher available frequency because that's usually a safer choice than going to a lower rate.
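If you want to see exactly what rate DAQmx settled on, read it back from the task after you configure timing.  In LabVIEW that's the SampClk.Rate property on a DAQmx Timing property node; below is the same idea sketched with the nidaqmx Python package (the channel name "cDAQ1Mod1/ai0" is just a placeholder for your accelerometer channel):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    # Placeholder channel name -- substitute your own module/channel.
    task.ai_channels.add_ai_accel_chan("cDAQ1Mod1/ai0")
    task.timing.cfg_samp_clk_timing(
        rate=2048.0,
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=1024,
    )
    # DAQmx coerces the requested rate to one the module can really generate;
    # this property returns the coerced (actual) value.
    print("Actual sample clock rate:", task.timing.samp_clk_rate)
```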

 

Page 6 of your device's spec sheet shows that you can only get sample rates equal to 51.2 kHz divided by an integer.  And, oh hey!  51200 / 2048 = 25, so 2048 Hz is in fact one of those rates and you're already on the right track there after all.
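To make that concrete, here's a tiny sketch listing the rates the module can actually produce near your request (assuming, per my reading of the spec, integer divisors in roughly the 1 to 31 range -- check the sheet for the exact limits):

```python
timebase_hz = 51200.0                              # 51.2 kHz internal timebase
rates = [timebase_hz / n for n in range(1, 32)]    # assumed divisor range 1..31

print(2048.0 in rates)                             # True: 51200 / 25 = 2048 exactly
print([round(r, 1) for r in rates if 1900 < r < 2200])
# Neighbouring choices are ~2133.3 Hz (n = 24) and ~1969.2 Hz (n = 26), so a
# request of, say, 2000 Hz would get coerced to one of those instead.
```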

 

So I would say that yes, your device is sampling at the 2048 Hz you requested and the actual time between samples is in fact 0.00048828125 sec (give or take the accuracy of the oscillator).  It's probably just the "Write to Measurement File" function that's rounding a cumulative time total.  I've never used that function, but I suspect there's a way to specify your need for greater precision in the stored data.
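And whether or not such a precision option exists, you can always rebuild exact time stamps afterwards from the sample index and the actual rate instead of trusting the rounded time column.  A minimal sketch (plain Python; assumes you just want the elapsed time of each sample within a block):

```python
fs = 2048.0          # actual sample-clock rate, Hz
n_samples = 1024     # samples in the block

# Exact per-sample times, independent of whatever precision the file
# writer used for its own time column.
t = [k / fs for k in range(n_samples)]   # 0.0, 0.00048828125, 0.0009765625, ...
```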

-Kevin P
