As with so many others, I'm fighting a losing battle with timer
issues. I created a small test VI to see if I could use the
method Lynn described; a GIF of the block diagram is attached as
labview3.gif.
After running this for about 16 minutes, I find that the timestamps
given by "Get Date/Time in Seconds" are about 6 ms behind the hybrid
method of taking one Date/Time stamp before the loop starts and using
the millisecond "Tick Count" to measure the elapsed time. This is with
LabVIEW 5.0 on a Windows 98 machine with several other processes running.
A copy of the front panel is attached as labview2.gif.
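
In text-language terms, the comparison boils down to something like the
sketch below (Python, since I can't paste a block diagram inline). The
functions get_date_time_in_seconds and tick_count_ms are stand-ins I made
up for the LabVIEW primitives "Get Date/Time in Seconds" and
"Tick Count (ms)", not real calls, and rollover is ignored here:

    import time

    # Stand-ins for the LabVIEW primitives (assumed behavior):
    #   get_date_time_in_seconds() ~ "Get Date/Time in Seconds" (system clock)
    #   tick_count_ms()            ~ "Tick Count (ms)" (free-running ms counter)
    def get_date_time_in_seconds():
        return time.time()

    def tick_count_ms():
        return int(time.monotonic() * 1000) & 0xFFFFFFFF

    base_time = get_date_time_in_seconds()  # one system-clock read, before the loop
    base_tick = tick_count_ms()

    for _ in range(16 * 60):  # roughly a 16-minute run, one reading per second
        # Hybrid timestamp: base system time plus elapsed milliseconds
        hybrid = base_time + (tick_count_ms() - base_tick) / 1000.0
        system = get_date_time_in_seconds()
        print("system minus hybrid: %+.1f ms" % ((system - hybrid) * 1000.0))
        time.sleep(1.0)

The print shows the same discrepancy I'm describing: the two numbers
start out equal and slowly diverge as the two clock sources drift apart.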
I created a similar VI using LabVIEW 7.1 on a Windows XP machine with few
processes running and found a discrepancy of about 7 milliseconds after a
3-hour run. My question is: which clock is correct? Is the system clock
losing time, is the millisecond timer gaining time, or is it a little bit
of both?
(I'm still working on a way to use this combined clock for time stamping even after the millisecond clock rolls over.)
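
For what it's worth, the usual trick for surviving one rollover is to do
the tick subtraction modulo 2**32, which is what unsigned 32-bit
subtraction gives you for free in C. A small sketch, assuming Tick Count
(ms) really is a free-running unsigned 32-bit millisecond counter:

    MS_MODULUS = 1 << 32  # the counter wraps at 2**32 ms, about 49.7 days

    def elapsed_ms(start_tick, now_tick):
        # Correct across a single rollover: the wrap cancels out
        # under mod-2**32 subtraction
        return (now_tick - start_tick) % MS_MODULUS

    # Example: the counter wrapped between the two reads
    start = 0xFFFFFF00   # 256 ms before the wrap
    now   = 0x00000200   # 512 ms after the wrap
    assert elapsed_ms(start, now) == 768

This only holds for intervals shorter than one full wrap, so for runs
longer than ~49 days the base timestamp and base tick would need to be
re-taken (or a wrap counter kept) before the counter comes back around.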