How does LabView time its loops? If LabView assumes "0" run time for the loop body and sleeps a fixed 98 ms before running it again (as opposed to sleeping 98 minus [actual loop time] ms), then each iteration takes close to, but under, 100 ms. A polling read from the 6009 takes about 1.5 ms to execute, and 98 + ~1.5 = ~99.5 ms makes your loop very slightly faster than the 100 ms sampling period. Eventually it gets far enough ahead that two reads land between a single pair of samples, and the second read gives you all 0's. With a 100 ms loop, your loop could be very slightly too slow, and eventually the device would sample twice without any read in between.
Of course this is speculation and makes assumptions about how LabView might implement timed loops, but I've seen similar errors in timing code in C++ that produced "weird" clock skew over long time spans. I'm sure about the 1.5-2 ms polling read time, though, because I fought it for a few weeks with my 6009 (I even rewrote my entire DAQ library in C++ instead of C#) before realizing it's a limitation of the USB interface. On that note, my C++ and C# code ended up running at the same speed, since the real delays were not in my code but in the USB requests.
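For what it's worth, the way I ended up avoiding that class of skew bug in my own C++ timing code was to pace the loop against an absolute deadline rather than a fixed relative sleep, so the body's run time is subtracted automatically. A minimal sketch (`poll_device` is a hypothetical stand-in for the ~1.5 ms USB read):

```cpp
#include <chrono>
#include <thread>

// Pace the loop against an absolute deadline with sleep_until, so
// whatever time the read takes is automatically subtracted and the
// period stays fixed instead of drifting by (body time) per iteration.
void read_loop(int iterations, std::chrono::milliseconds period) {
    using clock = std::chrono::steady_clock;
    auto next = clock::now();
    for (int i = 0; i < iterations; ++i) {
        next += period;                        // absolute deadline, no drift
        // poll_device();                      // hypothetical ~1.5 ms read
        std::this_thread::sleep_until(next);   // sleeps period minus body time
    }
}
```

The fixed-relative-sleep version (`sleep_for(98ms)` at the end of the body) is the one that accumulates skew; `sleep_until` against a running deadline is drift-free as long as the body never exceeds the period.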
Message Edited by 280Z28 on 05-23-2006 11:37 PM