12-01-2011 05:15 PM
Hi everyone.
I have discovered a strange issue with my DAQmx application. If I let it run for a while, it causes my Windows system clock to "lag": the system clock slows down. It is the exact issue mentioned here:
http://digital.ni.com/public.nsf/allkb/F2AB7F9633718199862568B1007D5E3D
However, I am using a USB X Series device, so I cannot change to DMA mode; I am forced to use the USB bulk-transfer mechanism.
I was wondering if anyone out there has seen this issue and perhaps found a resolution to it. I am also curious whether the lag is "permanent" or whether the clock recovers after the application stops executing (I am trying to determine this through testing, but it takes a long time for the lag to accumulate).
By the way: I am running on a Core i5 Win7 x64 machine using LV2011.
12-05-2011 08:19 AM
I'm just gonna bump this to the top once to see if anyone has any ideas.
12-05-2011 05:19 PM
Hi josborne,
As the document you found explains: "What actually happens is that the processor is not able to service the internal service requests for system clock updates as frequently as it needs to, so the system clock appears to slow down." So the effect is not permanent; the clock only appears to slow down because the processor cannot refresh it as often as usual while it is being used by the DAQ device.
If the accuracy of the system clock is important to you, you can avoid this "lag" by adding a wait function to your code. That gives the processor time to perform its other tasks, and the system clock will be able to catch up.
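For example (a rough sketch using the DAQmx ANSI C API as a stand-in, since I can't paste a block diagram here; in LabVIEW this would simply be a Wait (ms) primitive inside your acquisition loop):

    /* Sketch: an acquisition loop with an explicit wait. The Sleep()
     * yields the CPU so Windows can service its clock-update interrupts.
     * Error checking omitted for brevity. */
    #include <windows.h>
    #include <NIDAQmx.h>

    void acquisition_loop(TaskHandle task, float64 *data, uInt32 bufSize)
    {
        int32 read = 0;
        for (;;) {
            DAQmxReadAnalogF64(task, 1000, 10.0, DAQmx_Val_GroupByChannel,
                               data, bufSize, &read, NULL);
            /* ...process data... */
            Sleep(10);   /* the "wait function" */
        }
    }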
Regards,
12-06-2011 03:14 PM - edited 12-06-2011 03:17 PM
Hi josborne,
The word "appears" was used a bit liberally in that KnowledgeBase. The truth is, the clock appears to slow down because it is actually slowing down--while the OS is servicing interrupts from the DAQ device, it might possibly miss an interrupt to update the system clock. Over time, as clock interrupts are missed, the system clock will fall behind what you would expect it to report. After the task stops, so do the interrupts that block the system clock from being updated; however, there will still be an offset in the system clock due to the previously missed interrupts.
While Interrupt data transfer mode is the worst offender, DMA and USB Bulk still require interrupts and so could be susceptible to the same beahvior but at a slower rate.
Now, to answer your questions--I have actually never encountered this behavior myself (the KnowledgeBase article is from 2000). By default, the Windows Time service will synchronize your system clock to your network. As I do 99% of my data acquisition on a corporate network, any clock slowdown that I might encounter would be corrected by Windows.
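(As an aside: on a machine that has drifted, you can also force an immediate resynchronization once it is back on the network using the standard Windows Time tool, from an elevated command prompt, assuming the Windows Time service is running:

    w32tm /resync

)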
I'd like to see if I can reproduce the behavior with clock synchronization disabled. Can you elaborate on how you are using your DAQ hardware (with code if possible)? Depending on what you need to do, there might be a workaround. Also, can you elaborate on the degree of clock slowdown you are seeing?
Best Regards,
12-06-2011 06:55 PM
Thanks, John P. Very helpful.
Unfortunately, it is as I feared: the clock lag is "permanent" until the next time Windows syncs its clock over the network. My challenge is that our system will be running for long periods of time, so the lag will build up considerably. Also, it is very likely the PC will NOT be connected to the network, so Windows will be unable to correct the lag. The reason the PC is not on the network is that it will be deployed in a hospital and used to gather physiological data from patients. Connecting to the network in a hospital is a big IT headache, not to mention the patient confidentiality and security issues.
My quick guess is that we are losing about 0.5 seconds per day.
I will see if I can isolate the code and upload a simple example that demonstrates the lag.
Get back to you tomorrow.
12-08-2011 08:50 AM - edited 12-08-2011 08:50 AM
Attached to this post is a VI that demonstrates the clock lag phenomenon.
It runs a simple DAQ task which acquires ten AI signals at 1 kHz apiece. I am using an X Series USB-6351 for this example. It compares three clocks:
Tatomic: the atomic clock (retrieved via an internet NTP query)
Tdaq: the clock on the DAQ card
Tpc: the system clock
It then calculates the differences between these 3 clocks over time. The differences should vary (i.e., they are noisy), but they should be "steady" over time. You should NOT see a drifting pattern.
If you let it run for a few hours, the results show that the PC clock is slowing down... while the DAQ clock and the ATOMIC clock remain perfectly in sync. My test shows a lag of close to 0.06 seconds per hour, which works out to about 1.4 seconds per day. Below is a quick screenshot of the results:
Btw: I have turned off the Windows Time Service.
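For anyone who wants to follow the logic without opening the VI, here is a rough equivalent sketched with the DAQmx ANSI C API. It derives Tdaq from the running sample count divided by the sample rate, which is how I read the VI's logic; the NTP (Tatomic) leg and error checking are left out for brevity:

    /* Sketch of the test: acquire 10 AI channels at 1 kHz and watch
     * Tpc - Tdaq over time. Stop with Ctrl+C; cleanup omitted. */
    #include <stdio.h>
    #include <time.h>
    #include <NIDAQmx.h>

    #define RATE_HZ  1000.0
    #define CHUNK    1000                    /* one second per read */

    int main(void)
    {
        TaskHandle task = 0;
        float64    data[10 * CHUNK];
        int32      read = 0;
        uInt64     totalSamps = 0;

        DAQmxCreateTask("", &task);
        DAQmxCreateAIVoltageChan(task, "Dev1/ai0:9", "", DAQmx_Val_Cfg_Default,
                                 -10.0, 10.0, DAQmx_Val_Volts, NULL);
        DAQmxCfgSampClkTiming(task, "", RATE_HZ, DAQmx_Val_Rising,
                              DAQmx_Val_ContSamps, CHUNK);
        DAQmxStartTask(task);

        double t0 = (double)time(NULL);      /* PC clock at start */
        for (;;) {
            /* Blocks inside DAQmx Read until CHUNK samples arrive --
             * the same pattern as DAQmx Read.vi in the attached VI. */
            DAQmxReadAnalogF64(task, CHUNK, 10.0, DAQm x_Val_GroupByChannel,
                               data, 10 * CHUNK, &read, NULL);
            totalSamps += (uInt64)read;

            double tdaq = (double)totalSamps / RATE_HZ;  /* Tdaq from sample count */
            double tpc  = (double)time(NULL) - t0;       /* Tpc elapsed */

            /* If Tpc - Tdaq drifts steadily negative, the PC clock is
             * falling behind the DAQ card's sample clock. */
            printf("Tpc - Tdaq = %+.3f s\n", tpc - tdaq);
        }
    }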
12-08-2011 09:18 AM
Wow! I am finding some interesting artifacts here:
1. Clock lag still occurs if we use a SIMULATED DAQmx device. I set up a simulated USB-6351 and used the same code. Same result.
2. If I remove the DAQ task completely, the clock lag goes away. So the lag is definitely caused by DAQmx.
And even more interesting: if I subtly change my code, I can resolve the issue. See below; the code circled in red resolves it.
CONCLUSION: Apparently, having your code block inside DAQmx Read.vi while it waits for the desired number of points is the cause of the problem. The underlying DAQmx driver probably monopolizes the CPU (maybe it sits in a very tight while loop) and blocks the interrupts that Windows needs to update the system clock.
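Since the red circle doesn't survive in text form, here is my guess at the equivalent pattern in the DAQmx ANSI C API (this is an assumption about what the circled change does; in LabVIEW the polling would use the AvailSampPerChan property of a DAQmx Read property node): instead of letting DAQmx Read block until the desired number of samples arrives, sleep-poll the number of available samples yourself and read only once the data is already buffered:

    /* Workaround sketch: wait outside of DAQmx Read by polling the
     * available-samples count with a Sleep(), then read data that is
     * already in the buffer. Error checking omitted for brevity. */
    #include <windows.h>
    #include <NIDAQmx.h>

    #define CHUNK 1000

    void read_one_chunk(TaskHandle task, float64 *data, uInt32 bufSize)
    {
        uInt32 avail = 0;
        int32  read  = 0;

        /* Sleep-poll until a full chunk is buffered; Sleep() yields the
         * CPU so Windows can service its clock-update interrupts. */
        do {
            Sleep(10);
            DAQmxGetReadAvailSampPerChan(task, &avail);
        } while (avail < CHUNK);

        /* The data is already there, so this read returns immediately
         * (timeout 0) instead of busy-waiting inside the driver. */
        DAQmxReadAnalogF64(task, CHUNK, 0.0, DAQmx_Val_GroupByChannel,
                           data, bufSize, &read, NULL);
    }

With the wait outside the driver, the CPU is idle between polls, which should leave Windows free to keep its clock interrupts serviced.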