LabVIEW


Accuracy of the Get Timestamp function: +/- 16 ms??

  I'm using the "Get Timestamp Function" in my VI to mark time in my file roughly every 30 seconds.  The problem I am seeing is that it sometimes is 16 ms off, sooner rather than later.  Do I need to mix this with the Get Time In Seconds function?  I would like consistent values in my file.

 

Thanks,

Chris

 

Message 1 of 8

Hi Chris,

 

this function is based on the Windows timing functions, and those only update about every 16 ms. So you cannot use this function to measure time with a resolution finer than 16 ms!
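
To see that quantization directly, here is a minimal C sketch against the Win32 API (that a GetTickCount-style clock sits behind the timestamp is an assumption for illustration; the exact step size varies by machine, typically ~15.6 ms):

/* Sketch: show that the default Windows tick count only advances in
 * ~15-16 ms steps, which is why timestamps derived from it look
 * "16 ms off". Assumes Windows and the Win32 API. */
#include <stdio.h>
#include <windows.h>

int main(void)
{
    DWORD last = GetTickCount();
    int changes = 0;

    /* Poll the tick count as fast as possible and print only when it
     * changes; the printed deltas are typically about 15-16 ms. */
    while (changes < 10) {
        DWORD now = GetTickCount();
        if (now != last) {
            printf("tick advanced by %lu ms\n", (unsigned long)(now - last));
            last = now;
            changes++;
        }
    }
    return 0;
}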

 

Some more thoughts:

- You have a deviation of 16 ms after your mentioned 30 s interval: that's an error of only ~0.05% (0.016 s / 30 s ≈ 0.00053). How accurate is the clock of your PC? How accurate (in terms of exact timing) are the other processing steps of your program?

- Maybe Wiebe's VI can help you...

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 8

Thanks for the quick response and insight. I didn't even think of the fact that I am working off the accuracy of the machine's system clock.  What gets me, though, is that it isn't always off.  Take an example where I am recording at 6000 Hz with a samples-to-read size of 3000 samples: for the first 90 seconds I'll get good timestamps, then the last one will be the 16 ms off.

   How accurate is the time in relation to the data in the next line, I guess, is what I'm really asking here.  How accurate do I want or need to be?  I guess my boss would say as accurate as you can get it, lol...

 

Thanks,

Chris

 

Message 3 of 8

Hi Chris,

 

when you know the sampling frequency, you can calculate the timestamps yourself instead of asking Windows to provide a stamp. The same is done with the waveform datatype in LabVIEW: it stores the start time (t0), the time between samples (dt), and an array of data samples.
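
As a text-language illustration of that calculation (a minimal C sketch; the 6000 Hz and 3000-sample numbers are taken from this thread, and t0 = 0 is just a placeholder for the real acquisition start time):

/* Sketch: derive per-sample and per-block timestamps from one start time
 * t0 and the known sample interval dt = 1/fs, instead of asking the OS
 * clock for a new stamp on every block. */
#include <stdio.h>

int main(void)
{
    const double fs = 6000.0;          /* sample rate from the thread [Hz]   */
    const int samples_per_read = 3000; /* "samples to read" from the thread  */
    const double dt = 1.0 / fs;        /* time between samples [s]           */
    const double t0 = 0.0;             /* placeholder acquisition start [s]  */

    /* Time of sample i, counted from the very first sample:
     *   t(i) = t0 + i * dt                                   */
    for (int block = 0; block < 4; block++) {
        long first_sample = (long)block * samples_per_read;
        double block_start = t0 + first_sample * dt;
        printf("block %d starts at t = %.3f s (sample %ld)\n",
               block, block_start, first_sample);
    }
    return 0;
}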

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 4 of 8
Another option is to query the performance timer of the processor.  On Windows, do this with calls to the kernel DLL (see attached code, LV8.5.1).  On Linux or Mac OS X, I believe a standard call to time will do the trick.
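
For reference, a minimal C sketch of what such a call boils down to; it assumes QueryPerformanceCounter/QueryPerformanceFrequency on Windows, and on Linux/macOS it uses clock_gettime rather than a plain time call (the attached LabVIEW code itself is not reproduced here):

/* Sketch: read a high-resolution monotonic time in seconds.
 * Windows: QueryPerformanceCounter / QueryPerformanceFrequency (kernel32).
 * Linux/macOS: clock_gettime(CLOCK_MONOTONIC). */
#include <stdio.h>

#ifdef _WIN32
#include <windows.h>
static double hires_seconds(void)
{
    LARGE_INTEGER freq, count;
    QueryPerformanceFrequency(&freq);   /* counts per second */
    QueryPerformanceCounter(&count);    /* current count     */
    return (double)count.QuadPart / (double)freq.QuadPart;
}
#else
#include <time.h>
static double hires_seconds(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (double)ts.tv_sec + (double)ts.tv_nsec / 1e9;
}
#endif

int main(void)
{
    double t1 = hires_seconds();
    double t2 = hires_seconds();
    printf("two back-to-back reads: %.9f s and %.9f s\n", t1, t2);
    return 0;
}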
Message 5 of 8

  I have to admit that in the past I would use the "Write LabVIEW Measurement File" VI, until I realized how much overhead that routine takes up.  It would always give me a t0.  The whole reason I am trying to do this is to get an idea in real time of when the first data point happens, and like you said, I can just calculate the rest because I know the frequency.  Is there any way to pull that t(0) time?

 

 

 

Message 6 of 8

Right now I am only running LabVIEW 8.0 and 7.1; I haven't installed 8.6 yet.  I would like to take a look at this routine, though.

 

Message 7 of 8

So here is a newer version of what I am trying to do.  I figured out that I could actually extract the timestamp from the waveform.  Now take the case where I am running 6000 Hz with samples to read set to 3000.  The first time I am dumping to the txt file, on loop iteration 0, is that the time of the first sample?  Or is it the time of the end of that iteration, which would be sample 3000?
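
To make the two readings of that timestamp concrete, here is a minimal C sketch with this thread's numbers; whether t0 really marks the first sample of each read is exactly the open question, so it is only illustrative:

/* Sketch: how far apart the two interpretations of the block timestamp are.
 * fs and N come from the thread; that t0 belongs to the first sample of the
 * read is an assumption, not something confirmed in this thread. */
#include <stdio.h>

int main(void)
{
    const double fs = 6000.0;  /* sample rate [Hz]  */
    const int    n  = 3000;    /* samples per read  */
    const double dt = 1.0 / fs;

    double t0 = 0.0;                        /* timestamp of iteration 0        */
    double t_last = t0 + (n - 1) * dt;      /* last sample, if t0 = sample 0   */

    printf("if t0 is the first sample: data spans %.6f .. %.6f s\n", t0, t_last);
    printf("if t0 is the end of the read: data spans %.6f .. %.6f s\n",
           t0 - (n - 1) * dt, t0);
    return 0;
}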

 

Thanks,
Chris

 

Message 8 of 8