Hi all,
I'm trying to monitor two signals at a 100 kSample/s rate (10 us/sample). I am using Visual C++ and NI-DAQ, and have set up the card for double-buffered digital input. I am reading the buffer directly, i.e., using DIG_Block_Check to get the number of samples remaining and transferring data myself from the buffer passed to DIG_Block_In, rather than using DIG_Block_Transfer. The data stream is written to the hard drive. When I run this, however, every buffer contains several glitches in the data. For example, if I monitor a pulse that is 30 us long, the data I collect make it look roughly twice as long. Please help!
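
In case it helps, here is roughly what my read loop does. This is a simplified sketch, not my actual code: the buffer size, device and group numbers, and the file handling are placeholders, and I'm writing the NI-DAQ calls from memory, so the exact signatures may be slightly off. The group/clock/double-buffer configuration calls are omitted.

/* Simplified sketch of my read loop -- not my actual code.  Buffer size,
 * device/group numbers, and file handling are placeholders, and the
 * NI-DAQ signatures are from memory.  Configuration calls omitted. */
#include <stdio.h>
#include "nidaq.h"                      /* Traditional NI-DAQ header */

static const u32 BUF_SIZE = 20000;      /* circular buffer length, in samples */
static i16 acqBuffer[BUF_SIZE];         /* buffer handed to DIG_Block_In      */

void AcquireToDisk(i16 device, i16 group, FILE *outFile)
{
    u32 lastRemaining = BUF_SIZE;       /* DIG_Block_Check counts down from here */
    u32 readPos = 0;                    /* my own read index into acqBuffer      */

    /* Start the (already configured) double-buffered acquisition. */
    DIG_Block_In(device, group, acqBuffer, BUF_SIZE);

    for (;;)                            /* stop condition omitted for brevity */
    {
        u32 remaining = 0;
        DIG_Block_Check(device, group, &remaining);

        /* How many new samples has the board written since the last check?
         * (Handle the case where the count wraps back up to BUF_SIZE.) */
        u32 newSamples = (lastRemaining >= remaining)
                             ? (lastRemaining - remaining)
                             : (lastRemaining + BUF_SIZE - remaining);
        lastRemaining = remaining;

        /* Copy the new samples out of the circular buffer (in up to two
         * pieces if they wrap past the end) and stream them to disk. */
        while (newSamples > 0)
        {
            u32 chunk = newSamples;
            if (readPos + chunk > BUF_SIZE)
                chunk = BUF_SIZE - readPos;
            fwrite(&acqBuffer[readPos], sizeof(i16), chunk, outFile);
            readPos = (readPos + chunk) % BUF_SIZE;
            newSamples -= chunk;
        }
    }
}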