sample interval variability

The attached VI is used to sample eight analogue channels at an interval of 31,250 microseconds (32 Hz) with a cRIO-9074. When I look at the data file there is a sampling-interval variation of up to approximately ±150 microseconds, but when I monitor the output from DIO7, which gives a 12 ms pulse for every acquisition, with a counter/timer, the variation is ~1 microsecond.

 

Could somebody suggest why there is such a difference and what measures could be taken to reduce the sampling-rate variation?

 

Best regards,

nos

Message 1 of 9

Hi nos,

 

I assume that when looking at the data file, you're using the timestamp to work out the interval? The problem with this is that the 'Get Date/Time In Seconds' function uses your system clock, which is not that accurate. In fact, the detailed help reads: "Timer resolution is system dependent and might be less accurate than one millisecond, depending on your platform. When you perform timing, use the Tick Count (ms) function to improve resolution."

I would therefore expect a relatively large variation, especially if you are looking at the microsecond scale (I also assume you are running the code on Windows rather than on a real-time OS).

 

Do you need an absolute time? If you are only after interval timing, use the Tick Count (ms) function; this will give you a more accurate reading.
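To illustrate the mechanism (a rough Python sketch of the effect, not of your VI): if perfectly periodic 31,250 microsecond samples are stamped by a clock that only resolves 1 ms, the apparent interval jitter is already large even though the loop itself is steady.

    # Sketch: apparent jitter caused purely by 1 ms timestamp resolution.
    # Assumes an ideally periodic 31,250 us loop; the numbers are illustrative only.
    import numpy as np

    dt_us = 31_250                               # desired sample interval (us)
    true_t = np.arange(1000) * dt_us             # ideal timestamps (us)
    stamped = np.floor(true_t / 1000) * 1000     # timestamps quantised to 1 ms
    deviation = np.diff(stamped) - dt_us         # apparent interval error (us)
    print(deviation.min(), deviation.max())      # about -250 to +750 us here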

 

Please let me know if you have any further questions.

 

Best regards,

Eden S
Applications Engineer
National Instruments UK & Ireland
Message 2 of 9

Thank you for your response. With regard to your questions, see below.

 

I assume that when looking at the data file, you're using the timestamp to work out the interval?

That is correct; see the attached Excel file. The included plot works out the variability by subtracting the actual sampling interval from the desired interval and plotting the results against sample number.
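For what it's worth, the same analysis can also be scripted; here is a rough Python sketch (the file name, column name, and units are assumptions about how the data are logged):

    # Sketch of the spreadsheet analysis: desired interval minus the actual
    # interval for each sample, plotted against sample number.
    # The file name and "timestamp_s" column are assumptions about the log layout.
    import pandas as pd
    import matplotlib.pyplot as plt

    DESIRED_US = 31_250
    t_us = pd.read_excel("timing_log.xlsx")["timestamp_s"].to_numpy() * 1e6
    deviation = DESIRED_US - (t_us[1:] - t_us[:-1])   # us
    plt.plot(deviation)
    plt.xlabel("sample number")
    plt.ylabel("interval deviation (us)")
    plt.show()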

 

 

I would therefore expect a relatively large variation, especially if you are looking at the microsecond scale (and I also assume you are running the code in Windows, not RTOS).

The code is running on the cRIO-9074 and uses a timed loop that is synchronised with the Scan Engine.

 

Do you need an absolute time? If you are only after interval timing, use the Tick Count (ms), this will give you a more accurate reading.

Absolute time is not required; as shown in the attached file the first data sample should have a time stamp of zero, with each successive sample incrementing by the sampling interval.

 

Best regards,

nos

Message 3 of 9

Hi nos,

 

Thanks for the reply and the extra information. I think the issue may still be in the method of timestamping. Have you tried using the Tick Count function instead of Get Date/Time In Seconds? Could you please give this a go and let me know the results?

 

Thanks,

Eden S
Applications Engineer
National Instruments UK & Ireland
Message 4 of 9

Thank you for your response.

 

I will do as you suggest and post the results within the next few days.

 

Best regards,

nos

Message 5 of 9

The VI has been modified by replacing the Get Date/Time In Seconds function with, firstly, the Tick Count (ms) function from the same palette and, secondly, the tick count function from the real-time palette.

 

The first change gave a time stamp with a resolution of only 1 ms, which is insufficient, whereas the second gives a time stamp in microseconds; a data file from the latter option is attached.
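One thing to watch when post-processing a microsecond tick count: if it is logged as a 32-bit counter it will eventually roll over, and a plain subtraction then produces one huge spurious interval. A small Python sketch of a wrap-safe difference (the unsigned 32-bit width is an assumption and may not match how the VI actually logs the value):

    # Sketch: wrap-safe intervals from a 32-bit microsecond tick counter.
    # The u32 width is an assumption; adjust if the tick is logged differently.
    import numpy as np

    ticks = np.array([4294960000, 23954, 55204], dtype=np.uint32)  # rolls over past 2**32 - 1
    intervals = np.diff(ticks)   # unsigned subtraction wraps correctly: [31250 31250]
    print(intervals)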

 

The maximum sampling-interval variation with the real-time tick count is not very different from that obtained when using Get Date/Time, but more of the interval variations are concentrated around zero.

 

Any comments or suggestions appreciated.

 

Best regards,

cpgos

Message 6 of 9

Hi nos,

 

Sorry for the late response - I was out of the office at the end of last week.

 

A thought: are you running the code in Scan Mode? Scan Mode places some limitations on the rate and only provides accuracy at the millisecond level, not microseconds.

 

Thanks,

Eden S
Applications Engineer
National Instruments UK & Ireland
Message 7 of 9

Thank you for your message and my apologies for not responding before now.

 

Scan Mode is being used. Do you know if there are any tutorials etc. available on the degree of sampling-interval repeatability that is possible with Scan Mode?

 

The attached Excel files show the sampling-interval variability both with and without charts being used in the VI; charts were displayed when the data shown in the file "timing test 4.xlsx" was logged, and no charts were used when the file "timing test 3.xlsx" was logged. The latter has a maximum variability of ~100 microseconds and the former ~250 microseconds.
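If it is useful, the comparison of the two runs can be scripted rather than done by hand; a rough Python sketch (the "timestamp_us" column name is an assumption about the log layout):

    # Sketch: maximum interval variability per log file, to compare the runs
    # with and without front-panel charts updating.
    import pandas as pd

    DESIRED_US = 31_250
    for name in ("timing test 3.xlsx", "timing test 4.xlsx"):
        t_us = pd.read_excel(name)["timestamp_us"].to_numpy()
        dev = (t_us[1:] - t_us[:-1]) - DESIRED_US
        print(name, "max variability:", abs(dev).max(), "us")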

 

Best regards,

nos

Message 8 of 9

A further, final variation that I have made to the VI is to remove all of the analogue channels and replace them with constants; the resulting data file is attached as an Excel sheet.

 

This shows that the minimum degree of sampling-rate variability is ~±50 microseconds; also, when the RT tick count function is replaced with the Get Date/Time In Seconds function, the amount of variation increases to ~±75 microseconds.

 

The foregoing discussion has not answered the original question, but it has given a reasonable estimate of the maximum sampling-rate variability that can be expected with the VI that is being used, i.e. ~±200 microseconds.

 

nos

Message 9 of 9