Real-Time Measurement and Control

Real-Time Time Stamp is Slow

The output of the "Get Date/Time In Seconds" in my Real-Time application is slow (I'm using a PXI-8186, LV7.1).

It appears to take 4m 50s for the real-time time to go only 4m. Unfortunately, this only happens within my big application - if I try to write a little VI to replicate the problem I don't see it. My big application is not carrying out any hardware I/O, but it is communicating with a PC via TCP, carrying out a number of control loop calculations, and logging to disk. It also uses a number of timed while loops and a lot of "Call By Reference Node" calls to dynamically launched, reentrant VIs (I'm using OpenGOOP classes). The problem occurs even if I exit LV without closing the RT Engine VIs (if I log to disk at 500 ms, the time stamps show up as only 415 ms apart).
Message 1 of 11
Hi,
I guess I'm not really sure what you mean by "It appears to take 4m 50s for the real-time time to go only 4m."

How are you making sure that your code is supposed to run for only 4 minutes? How are you measuring the duration of your code? Can you please attach a small code snippet that still replicates this problem and shows me exactly how you are doing the measurement? Perhaps there is something inconsistent in the way the measurement is being done.

If you send me the above info, I can try to help you out more with this issue.

Cheers,
Anu Saha
Academic Product Marketing Engineer
National Instruments
Message 2 of 11
Sorry I wasn't clear - here is what happens:

I display the output from the "Get Date/Time In Seconds" function on a front panel on a VI running on a Real-Time target (PXI-8186). I start the stop watch on my wrist watch when this value displays "12:00:00.000 PM". I stop the stop watch when the value displays "12:04:00.000 PM". The time on my wrist watch displays "00:04:50.00". It appears that the system time on the PXI-8186 is running about 83% as fast as it should be.

I've also confirmed that the problem exists by logging the "Get Date/Time In Seconds" output to a file every 500 ms. I use the "Tick Count (ms)" output to time the logging period (i.e. I log to disk whenever the tick count increments by 500 ms). If I open the file, the time stamps are only ~415 ms apart (i.e. "12:00:00.000 PM" followed by "12:00:00.415 PM" etc.).
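Both measurements imply the same slowdown. As a quick sanity check, the arithmetic can be sketched in plain Python (the numbers are just the ones reported above):

```python
# Stopwatch check: the system clock advanced 4 min while 4 min 50 s of
# wall time elapsed on the stopwatch.
system_elapsed_s = 4 * 60        # 240 s shown by Get Date/Time In Seconds
wall_elapsed_s = 4 * 60 + 50     # 290 s of real time

# Logging check: timestamps ~415 ms apart for a 500 ms tick-count period.
ratio_stopwatch = system_elapsed_s / wall_elapsed_s  # ~0.828
ratio_logging = 415 / 500                            # 0.83
```

Both ratios come out to roughly 0.83, consistent with the system clock running at about 83% of real time.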

As I said in my previous note, I can't seem to replicate this problem in a small VI. No matter what I try, the clock appears to be running at the correct speed. I will keep trying ...

Jaegen
Message 3 of 11
Hi,
Sorry about the delayed response. You could be seeing that behavior for a few reasons. Is your VI running at Time-Critical priority? You should make sure that your higher-priority threads have enough sleep time to allow lower-priority threads to execute and not starve. Also, ensure that you don't have Synchronous Display enabled on any of the front panel controls or indicators. You can check this by right-clicking on the control/indicator and going to Advanced.

How are you making sure your loop runs every 500 ms? Are you using the Wait or the Wait Until Next ms Multiple function? Try using the RT-specific wait functions available in Functions Palette » Real-Time VIs » Real-Time Timing. I have attached a piece of code that shows how I would test this. Can you try implementing this in your application to see if you get the same results? It may also help if you could post your VI here so I can take a look at it.

Hope this helps.
Anu Saha
Academic Product Marketing Engineer
National Instruments
Message 4 of 11
I've managed to recreate the problem.

I've attached a version of your VI, modified to include 2 simple timed while loops.
Try running this VI with "Loop 1" and "Loop 2" set to TRUE.

Here is a summary of my results:





# of Timed Loops Running | Loop Iteration | Loop Duration (s) | Tick Count Duration (ms)
-------------------------|----------------|-------------------|-------------------------
           0             |     10285      |      10.292       |          10285
           1             |     10120      |      10.202       |          11208
           2             |     10143      |      10.151       |          12682


It appears that the more timed loops running, the slower the "Get Date/Time In Seconds" gets.

It's interesting to note that the "Loop Iteration" value roughly matches the "Loop Duration", and not the "Tick Count Duration". The Tick Count Duration matches the time on my stop watch, so this seems to indicate that the loop period is more than 1 ms. But neither of the timed while loops appear to ever finish late, and the CPU load according to the Real-Time System Manager is only 28.3%, so the system should have time to complete such a simple loop.
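The ratio of "Loop Duration" (elapsed system-clock seconds) to "Tick Count Duration" (elapsed hardware milliseconds) gives the effective rate of the system clock in each case. That arithmetic can be checked in a few lines of Python (the values are just the ones from the table above):

```python
# Rows from the table: (timed loops running, "Loop Duration" in s,
# "Tick Count Duration" in ms).
rows = [(0, 10.292, 10285), (1, 10.202, 11208), (2, 10.151, 12682)]

# Effective clock rate = system-clock elapsed time / hardware elapsed time.
rates = [loop_dur_s / (tick_dur_ms / 1000.0) for _, loop_dur_s, tick_dur_ms in rows]

for (loops, _, _), rate in zip(rows, rates):
    print(f"{loops} timed loop(s): system clock at {rate:.1%} of real time")
```

This comes out to roughly 100% with no timed loops, 91% with one, and 80% with two, in line with the ~83% estimate from the earlier stopwatch test.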

Hopefully you can use this to look into this problem further. Just to remind you, I'm using a PXI-8186 and Windows 2000.

Thank you,

Jaegen
Message 5 of 11
Hi,
I was able to investigate this further. As it turns out, LabVIEW RT's "Get Date/Time in Seconds" function does not keep accurate time on the RTOS - it falls behind. Format Date/Time String and Get Date/Time String are also affected. The problem is that the real-time OS gives higher priority to user code and prioritizes its own bookkeeping routines lower. One of the bookkeeping tasks of the real-time OS (or any OS, for that matter) is keeping the system time. In LabVIEW RT, this is done with interrupt service requests, which get ignored while the real-time OS is busy running real-time code. Note, however, that when the system is rebooted, the system time is reset to be correct again.

The Tick Count function, however, does not display this behavior, because it gets its value directly from a hardware register rather than via a service request. So one solution in your case is to use the Get Date/Time function to get an initial timestamp, then use the Tick Count function for subsequent timestamps, adding the initial time as an offset.
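LabVIEW diagrams can't be pasted as text, so here is the offset arithmetic sketched in Python with hypothetical stand-in values; in the actual VI, the initial timestamp would come from Get Date/Time In Seconds (read once at startup) and the tick values from Tick Count (ms):

```python
# Read once, at application start (stand-in values for illustration only):
start_epoch_s = 1_100_000_000.0  # initial timestamp (seconds since the epoch)
start_tick_ms = 123_456          # hardware tick count at the same moment

def wall_clock_now(tick_ms):
    """Derive the current wall-clock time from the hardware tick counter
    instead of the lagging system clock: initial time + elapsed ticks."""
    return start_epoch_s + (tick_ms - start_tick_ms) / 1000.0

# 500 ms of ticks later, the derived time has advanced exactly 0.5 s:
print(wall_clock_now(start_tick_ms + 500) - start_epoch_s)  # 0.5
```

Because the tick counter is read from hardware, the derived time stays honest no matter how badly the OS clock lags.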

Another solution is to use the timestamp VIs that, like the Tick Count function, read their values directly from the hardware register. These VIs use low-level register functions to ensure the most accurate timing. In LabVIEW, go to File » New... » VI from Template » RT » NI Timestamp Code Timer.vit and open that template. It uses the timestamp VIs I'm talking about. They are located in C:\Program Files\National Instruments\LabVIEW 7.1\vi.lib\addons\rt\_RTUtility.llb if you want to access them directly, but the template will show you a good use case.

So you can either use the Tick Count function as I described above, or you can use the timestamp VIs. Both will provide you with a valid workaround. Please let me know if you need more help with this.

Have a great day!
Anu Saha
Academic Product Marketing Engineer
National Instruments
Message 6 of 11
Hi,
To clarify a bit more on why the RT system clock (accessed by the Get Date/Time function) lags when higher-priority code is running: you may notice this lag even if your VI runs at Normal priority. This is because you are using Timed Loops, which, on an absolute scale, fall between Above Normal and Time-Critical priority, and therefore preempt the system clock update, which runs at Normal priority. If you replace the Timed Loops with While Loops, you will not see a lag.
The lag will of course be more prominent if a VI (with or without Timed Loops) is set to run at Above Normal or Time-Critical priority.

Hope this helps clarify this issue a bit more.

Cheers!
Anu Saha
Academic Product Marketing Engineer
National Instruments
Message 7 of 11
Thank you for the answers.

I still have one question, however:

You mentioned "note that when a system is rebooted, the system time is reset to be correct again". What happens when just the application is stopped and restarted? Will the system time upon restart be correct? I'm sure I could test this, but was wondering what the expected behaviour is. If it does take an actual reboot, is there a VI or function (or DLL call) that forces a system time reset? We really don't want to have to reboot the system every time we want to stop and start the program.

Also, while I understand how the priority of timed while loops works, I'm still confused about why the clock would be lagging when the Real-Time System Manager reports that the CPU load is only ~30%. Doesn't this mean the OS has 70% of the time to process interrupts? Is it that these interrupts aren't queued, they're just ignored? (i.e. only 70% of them are getting processed?)

Thank you,

Jaegen
Message 8 of 11
Hi,
The system clock only gets reset on reboot: the OS loads the system time from the hardware register only at bootup. However, if you use the other two methods I mentioned in my earlier response, you will not need to reboot.

Even though the CPU load is only 30%, it is during that 30% that the RT timer thread gets preempted. You cannot conclude that 70% of the timer interrupts therefore get processed; the RT task scheduler is not trivial in its operation, so a linear correlation is not predictable. The fact remains that any VI running at Above Normal, High, or Time-Critical priority, or any VI containing Timed Loops, preempts Normal-priority threads such as the system timer.

So, I would suggest using the Tick Count or the other Timestamp functions I mentioned before. Hope this helps you understand the situation further.

Have a nice day!
Anu Saha
Academic Product Marketing Engineer
National Instruments
Message 9 of 11
(Sorry to keep dragging this thread on ...)

Just to clarify ... I understand that these other methods use counters/timers which are updated properly, and therefore I can reliably use them to measure time differences, but I still don't see any way to get the actual date and time from the RT system. Is this correct? It seems that the only time I can get a reliable date and time is immediately after rebooting the system. It is completely impractical for me to reboot the system every time I want to just stop and restart my program (it's very large, and downloading the VIs alone takes about a minute).

Even if I write a little program that I run every time I reboot which reads the date/time value and the tick count value and writes them both to a file (so I know where the tick count is relative to the date/time), I can't tell whether the tick count has rolled over. Does the value derived from the Timestamp functions (the "GetHighResTimestamp" DLL function) roll over? If not, I suppose I could use this, but it is certainly less than ideal.
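For what it's worth, a 32-bit millisecond tick counter wraps after 2**32 ms, which is about 49.7 days. I don't know the width of the counter behind GetHighResTimestamp, so treat this as a sketch under that assumption: if the counter is 32-bit and you sample it more often than once per wrap period, the rollover can be unwrapped in software (hypothetical helper, not an NI API):

```python
WRAP = 2**32  # a 32-bit ms counter wraps every 2**32 ms (~49.7 days)

class TickUnwrapper:
    """Turn a wrapping 32-bit tick count into a monotonically increasing
    value. Correct as long as consecutive samples are taken less than one
    wrap period apart, so at most one wrap occurs between samples."""

    def __init__(self):
        self.last_raw = None  # previous raw sample, if any
        self.offset = 0       # accumulated wrap corrections

    def unwrap(self, raw_tick):
        # A raw value smaller than the previous one means the counter wrapped.
        if self.last_raw is not None and raw_tick < self.last_raw:
            self.offset += WRAP
        self.last_raw = raw_tick
        return raw_tick + self.offset
```

The same bookkeeping would work for any counter width; the only requirement is sampling faster than the wrap period.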

On another note ... isn't this a bug? There are 3 functions which just plain do not behave properly - is there a plan for you guys to fix this in a later rev? Is there a more direct workaround, such as a DLL function you could tell me about (or write for me) that calls the real-time clock chip directly? If there really is no fix for this (or plan to fix it), then it would be better for all of us if you just disabled the "Get Date/Time In Seconds" function in LV-RT (especially since timed while loops are promoted as a great tool for RT programming).

Thanks for all your help,

Jaegen
Message 10 of 11