01-24-2014 10:28 AM
I have a system with three cDAQ modules. I initialize each to acquire data at 2 Hz, then enter a Timed Loop that runs once per second. In the loop I read each cDAQ with the number of samples to read left at the default of -1 (i.e., read every sample currently in the buffer). Obviously, I expect two samples on each loop iteration.
Instead, on some cycles I get 4 samples, and on some no samples at all.
The timed loop executes all code in ~400 msec, so the one second loop time should be comfortable.
Further, on each loop I write data to a spreadsheet file. On most cycles the time value increments by 1 second, as expected; on the cycles with missing data it increments by a much smaller amount, ~0.2 seconds. Since I have the Timed Loop configured to skip late cycles, I do not understand what is happening, and any suggestions would be appreciated.
01-27-2014 04:30 PM
Hello Hoard,
Would it be possible for you to post your code? Being able to see what you're describing makes things much simpler. Also, which cDAQ modules are you using?
01-27-2014 05:37 PM
01-28-2014 05:09 PM
From memory, the cDAQs are 9172.
I can post the code, but it is very large. Any suggestion on how to bundle the whole program with its subVIs, etc.?
I was using a While Loop with a Wait Until Next ms Multiple, but that of course does not give exact timing. Last week I changed to a Timed Loop, which has greatly reduced the incidence of missing data, but not to zero.
Thanks.
01-29-2014 06:23 PM
Instead of setting the samples to read to -1, set it to 2. That way it will pull exactly 2 samples off the buffer when they are available. Just be sure to set the timeout on your DAQmx Read VI to something large (~4 s) to account for your OS being non-deterministic. Also, just FYI: a Timed Loop does NOT guarantee that it executes every second.
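To see why -1 behaves that way, here's a pure-Python simulation (no DAQmx calls; the wake-up times are made up to match the symptoms described above): when the loop period jitters, a read-all (-1) read can return 0 or 4 samples, while a fixed 2-sample read with a generous timeout always returns 2.

```python
# Pure simulation of the setup above: hardware sampling at 2 Hz, a loop
# meant to run once per second. Wake-up times are hypothetical: the third
# cycle fires only 0.2 s after the second (the small timestamp increment
# described earlier), and the t = 3 s cycle is skipped.

RATE = 2.0  # samples per second

def produced(t):
    """Samples the hardware has put in the buffer by time t."""
    return int(t * RATE)

wakeups = [1.0, 2.0, 2.2, 4.0]

# Strategy A: read everything in the buffer (samples-to-read = -1).
ptr = 0
read_all = []
for t in wakeups:
    avail = produced(t) - ptr
    read_all.append(avail)
    ptr += avail

# Strategy B: read exactly 2 samples, blocking until they arrive
# (with a generous timeout so a late cycle waits instead of erroring out).
fixed = [2] * len(wakeups)

print(read_all)  # [2, 2, 0, 4] -- 0 on the early cycle, 4 on the catch-up
print(fixed)     # [2, 2, 2, 2] -- always the expected two samples
```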
01-30-2014 10:47 AM
I used to do that, but it introduces another issue. If you don't read every second (for instance, if a dialog box opens and you take 10 seconds to answer while the hardware keeps sampling), then you leave samples in the buffer and subsequent reads return old data. Over a week of running you fall significantly behind, and eventually get a buffer overflow error.
Is there a command that reports how much data is currently in the buffer? If so, we could read exactly that many samples. Maybe that is a solution.
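To make the backlog concrete, here is a rough Python simulation (no DAQmx calls; the 10-second dialog pause and all timings are hypothetical): fixed-count reads leave samples behind after a long pause, while reading everything available drains the buffer each cycle.

```python
# Rough simulation of the backlog problem: hardware samples at 2 Hz, the
# loop normally reads once per second, but a dialog blocks it for ~10 s
# after the t = 5 s read. All timings are hypothetical.

RATE = 2.0  # samples per second

def produced(t):
    """Samples the hardware has put in the buffer by time t."""
    return int(t * RATE)

# Read times: once per second, then a 10-second gap while a dialog is open.
times = [1, 2, 3, 4, 5, 15, 16, 17]

def backlog_after(times, fixed_count=None):
    """Samples left unread after each cycle.

    fixed_count=N models reading exactly N samples per cycle;
    fixed_count=None models reading everything available
    (what the -1 "read all" mode effectively does).
    """
    ptr = 0  # samples consumed so far
    backlog = []
    for t in times:
        avail = produced(t) - ptr
        n = min(fixed_count, avail) if fixed_count is not None else avail
        ptr += n
        backlog.append(produced(t) - ptr)
    return backlog

print(backlog_after(times, fixed_count=2))  # [0, 0, 0, 0, 0, 18, 18, 18]
print(backlog_after(times))                 # [0, 0, 0, 0, 0, 0, 0, 0]
```

With fixed-count reads the 18-sample backlog from the dialog pause never clears, which is exactly the "old data" drift described above.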
01-31-2014 02:38 PM
If you're looking to run your program for that long with deterministic timing, I would strongly suggest using a real-time system. That said, could you possibly use the "DAQmx Wait for Next Sample Clock.vi" to read samples only after the sample clock has pulsed?