03-16-2010 12:49 PM
Hello,
Is there a limit to how fast a timed loop can run with an accurate timestamp? I would like to sample at *exactly* every 200 msec; however, my output file shows intervals of roughly 200 +/- 10 msec. Is there a better way to create my timestamp? BTW, Init.vi records the start time in seconds since 1/1/2000, and all subsequent timestamps are relative to that start time. This output format is required and I cannot change it. Thanks.
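For illustration only, here is a minimal Python stand-in for the kind of software-timed loop described above (not the attached VIs; the 200 msec target and the start-time bookkeeping mirror the post, while the placeholder read and loop count are assumptions). It shows where the ~200 +/- several msec comes from: the loop wakes up whenever the sleep call returns and the OS scheduler allows.

    import time

    PERIOD_S = 0.200                  # desired 200 msec sample interval
    start = time.time()               # analogous to Init.vi noting the start time

    samples = []
    for i in range(50):
        t_rel = time.time() - start   # timestamp relative to the start time, in seconds
        value = 0.0                   # placeholder for the hardware read
        samples.append((t_rel, value))
        # Sleep until the next multiple of the period. The wake-up still depends
        # on sleep() granularity and OS scheduling, so consecutive timestamps
        # come out roughly 200 msec apart rather than exactly 200 msec.
        next_deadline = start + (i + 1) * PERIOD_S
        time.sleep(max(0.0, next_deadline - time.time()))

    for t_rel, value in samples[:5]:
        print(f"{t_rel:.3f} s  {value}")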
03-16-2010 01:01 PM
Why are you using a timed loop?
In order to sample at precisely 200msec intervals, I suggest you use hardware timing and NOT software timing. The onboard clocks on the DAQ boards are much, much more accurate than anything you can do in software, timing-wise.
What I would do is get rid of the timed loop, use hardware timing for the DAQ task, and collect the data as a waveform. The waveform datatype contains timing information. Look at some of the examples that ship with LabVIEW. "Find Examples"..."Hardware input and output"..."DAQmx"..."Analog Measurements".
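As a hedged sketch of the same idea outside of LabVIEW, the snippet below uses NI's nidaqmx Python package (the device name "Dev1/ai0", the voltage channel, and the finite acquisition length are assumptions, not anything from this thread, and it applies only to NI hardware). The point is that the board's onboard sample clock paces the conversions, so timestamps fall out of the sample index and rate instead of the Windows clock.

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    RATE_HZ = 5.0                     # 5 samples/s = one sample every 200 msec
    N_SAMPLES = 50

    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0")   # assumed device/channel
        # The onboard sample clock paces the acquisition at exactly RATE_HZ.
        task.timing.cfg_samp_clk_timing(rate=RATE_HZ,
                                        sample_mode=AcquisitionType.FINITE,
                                        samps_per_chan=N_SAMPLES)
        data = task.read(number_of_samples_per_channel=N_SAMPLES)

    # Timestamps derived from the sample clock: exactly i / rate seconds after
    # the start, independent of when Windows serviced the software loop.
    timestamps = [i / RATE_HZ for i in range(N_SAMPLES)]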
Your timestamp is only accurate to ~16 msec because it obtains the time from the Windows clock, and that is roughly the resolution of the Windows clock tick. That's the best it's ever going to do, so if you need better than that, use hardware timing for your DAQ.
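If you want to see that clock granularity for yourself, a quick generic probe (plain Python, not from the thread) is to poll the system clock in a tight loop and report the smallest nonzero step; on older Windows machines this typically lands around 15-16 msec.

    import time

    steps = []
    prev = time.time()
    deadline = prev + 1.0             # sample the clock for about one second
    while time.time() < deadline:
        now = time.time()
        if now != prev:               # record each tick of the system clock
            steps.append(now - prev)
            prev = now

    print(f"smallest observed clock step: {min(steps) * 1000:.3f} ms")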
Hope that helps a bit...
d
03-16-2010 01:29 PM
The OP is not using NI DAQ hardware (Measurement Computing?).
nousername,
DianeS is correct about the jitter in the Windows clock. You really need to check with the vendor about hardware-timed scanning. If the board does not support it, you are going to have to live with that jitter (or get a different DAQ card).
03-16-2010 06:30 PM
Thanks everyone,
Yes, it is for a D/A using Measurement Computing VIs. I suppose I can live with the jitter; I was just making sure that my implementation of their VIs made sense. I've not used LabVIEW in a long time, so any criticism of the VIs I attached above is welcome. Thanks!
03-16-2010 06:42 PM