I have an application that basically needs to run as fast as possible given an input signal from a sensor on a vehicle. The frequency I get can range from 300 Hz to 2.2 kHz. I am also sensing three other analog inputs. This is how the VI should work (a rough code sketch of the same steps follows the list):
1. The input signal is passed to the DAQ board's counter (GPCTR0_GATE). The counter is configured to count on each rising edge.
2. When a rising edge is detected, sample the three analog input channels and dump these values to a buffer (array).
3. Capture the period of the input signal at the same time I'm getting the three analog channels, and dump this to the buffer as well.
4. Continue this process until a stop button is pressed.
5. When the test has completed, copy the buffer to a database.
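To make the intent concrete, here is a rough text equivalent of the analog-input side, sketched with the DAQmx C API rather than the actual block diagram. The device name "Dev1", the ai0:2 channels, and the PFI9 terminal are only placeholders for my wiring, and the buffer/database steps are just comments:

/* Sketch of the analog side in the NI-DAQmx C API.  Assumptions, not from
 * the VI itself: device "Dev1", channels ai0:2, and the sensor signal
 * wired to PFI9 (the default GPCTR0_GATE pin on E-series boards). */
#include <stdio.h>
#include <NIDAQmx.h>

#define CHECK(call) do { if ((call) < 0) {                      \
    char msg[2048];                                             \
    DAQmxGetExtendedErrorInfo(msg, sizeof(msg));                \
    fprintf(stderr, "DAQmx error: %s\n", msg);                  \
    return -1; } } while (0)

int main(void)
{
    TaskHandle aiTask = 0;
    float64    data[3 * 100];      /* 100 scans of 3 channels per read   */
    int32      scansRead = 0;
    int        stopRequested = 0;  /* would be set by the stop button    */

    CHECK(DAQmxCreateTask("", &aiTask));
    CHECK(DAQmxCreateAIVoltageChan(aiTask, "Dev1/ai0:2", "",
                                   DAQmx_Val_RSE, -10.0, 10.0,
                                   DAQmx_Val_Volts, ""));

    /* Use the external signal on PFI9 as the AI sample clock: one scan of
     * all three channels per rising edge.  The rate is only a hint for
     * buffer sizing, so it is set to the 2.2 kHz maximum. */
    CHECK(DAQmxCfgSampClkTiming(aiTask, "/Dev1/PFI9", 2200.0,
                                DAQx_Val_Rising == DAQmx_Val_Rising ? DAQmx_Val_Rising : DAQmx_Val_Rising,
                                DAQmx_Val_ContSamps, 10000));

    CHECK(DAQmxStartTask(aiTask));

    /* In the real VI this loop runs until the stop button; here it just
     * does a fixed number of reads as a stand-in. */
    for (int i = 0; i < 10 && !stopRequested; i++) {
        /* Pull 100 scans at a time; the real code appends them to the
         * buffer that gets written to the database after the test. */
        CHECK(DAQmxReadAnalogF64(aiTask, 100, 10.0,
                                 DAQmx_Val_GroupByScanNumber,
                                 data, 3 * 100, &scansRead, NULL));
        /* ... append scansRead scans, update stopRequested from the UI ... */
    }

    CHECK(DAQmxStopTask(aiTask));
    DAQmxClearTask(aiTask);
    /* ... copy the accumulated buffer to the database here ... */
    return 0;
}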
Basically, what we're trying to do is capture readings based on an external clock that can vary with speed. While testing the application I wrote, I've noticed some serious problems, and this is where I need some expert advice on how to handle them. If the input frequency were constant at 1 kHz and the test ran for exactly 10 seconds, I would expect to see 10,000 readings copied to the database (in a perfect world, at least). What I actually get depends on whether or not I'm requesting the period of the input signal: if I only request the three analog input channels, I get around 10,000 data points. If I request the period as well as the three analog channels, I may only get about 20% of that (roughly 2,000 data points), and the period values are wrong at the lower frequencies (300 Hz to 900 Hz).
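For reference, the period measurement in my DAQmx version is set up as a buffered period channel, roughly like the following C-API sketch. The "Dev1/ctr0" name is again a placeholder, and in the real VI this read sits in the same loop as the analog read; error checking is trimmed to keep the sketch short:

/* Period side of the DAQmx version, sketched with the C API. */
#include <stdio.h>
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle ctrTask = 0;
    float64    periods[100];
    int32      numRead = 0;

    DAQmxCreateTask("", &ctrTask);

    /* Buffered period measurement on counter 0, rising edge to rising
     * edge, in seconds.  1/2200 s .. 1/300 s covers the 300 Hz - 2.2 kHz
     * range of the input signal. */
    DAQmxCreateCIPeriodChan(ctrTask, "Dev1/ctr0", "",
                            1.0 / 2200.0, 1.0 / 300.0, DAQmx_Val_Seconds,
                            DAQmx_Val_Rising, DAQmx_Val_LowFreq1Ctr,
                            0.0, 4, "");

    /* Implicit timing: the input signal itself paces the measurement, so
     * one period value lands in the buffer per cycle of the input. */
    DAQmxCfgImplicitTiming(ctrTask, DAQmx_Val_ContSamps, 10000);

    DAQmxStartTask(ctrTask);

    /* In the real VI this runs continuously until the stop button. */
    if (DAQmxReadCounterF64(ctrTask, 100, 10.0, periods, 100,
                            &numRead, NULL) >= 0 && numRead > 0)
        printf("last period: %g s (%.1f Hz)\n",
               periods[numRead - 1], 1.0 / periods[numRead - 1]);

    DAQmxStopTask(ctrTask);
    DAQmxClearTask(ctrTask);
    return 0;
}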
I'm at a loss here. I've written this using traditional NI-DAQ as well as DAQmx. Neither one gives me the correct input period, and the number of data points captured decreases significantly when I try to get the period. I've looked at the examples and used a lot of them, but still no luck. Any help would be greatly appreciated.
My configuration:
LabVIEW 7.1
PCI-6014 on the PC and a DAQCard-6062E on the laptop
Windows XP