I have an app which performs a few functions but basically takes in data from 3 AI inputs and outputs to one. The problem I am having is in trying to correctly adjust my sampling rate. The way the app is set now, there is a large delay between what shows up on my graphs and what is happening in real time. This delay is somehow linked to my continuous sampling rate and number of points to get. It is as if the data is being buffered somewhere because my loop is running too slow to retrieve it all. I am attaching my app. I have the cheapest M-series card and am only trying to sample at 1000 Hz and 100 samples per sec. Whatever works really. Thanks. Tim LeDoux Westmont College
I can't open the VI because I "only" have LV 6.1, but here's what I know about the continuous scan VI.
The continuous scan VI buffers the data acquisition in memory. It does this using DMA transfers to reduce the burden on the CPU. When initialising the continuous scan VI (remember to wire the i terminal), you set the max size of the buffer. Your program can read the data out at any speed, and even pause, as long as the buffer doesn't overflow. But if your program isn't reading the data quickly enough, it falls further and further behind the "real-time" data arriving in the buffer.
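This isn't LabVIEW, but here's a little Python sketch of the arithmetic behind the lag (all numbers are assumptions based on the original post: 1000 Hz rate, 100 points per read, and a loop that takes 0.5 s per iteration because of graphing etc.):

```python
# Toy model of a buffered continuous acquisition falling behind.
# The DMA keeps writing sample_rate samples/s into the buffer,
# while the loop only drains samples_per_read each iteration.

sample_rate = 1000        # samples/s written into the buffer (assumed)
samples_per_read = 100    # samples read out per loop iteration (assumed)
loop_period = 0.5         # seconds per iteration, e.g. slow graphing (assumed)

backlog = 0               # samples sitting in the buffer, unread
for i in range(10):
    backlog += int(sample_rate * loop_period)   # acquisition keeps filling
    backlog -= samples_per_read                 # loop drains only this much
    lag_seconds = backlog / sample_rate
    print(f"iteration {i}: {backlog} samples waiting = {lag_seconds:.1f} s behind")
```

Every iteration the backlog grows by 400 samples, so after a few seconds the graph is showing data that is several seconds old. The fix is to read at least as many samples per loop as arrive per loop (or read everything available each time).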
If you're using continuous acquisition, your loop doesn't really need a wait function, because the continuous-acquisition read VI will basically control the loop timing for you: if there isn't enough data in the buffer yet, it waits.
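Again not LabVIEW, but a Python sketch of that behaviour (the `blocking_read` function is my stand-in for the buffered AI read; the rates match the original post):

```python
import time

SAMPLE_RATE = 1000        # Hz (assumed, matching the poster's setup)
SAMPLES_PER_READ = 100    # points requested per read (assumed)

next_ready = time.monotonic()   # when the next batch of samples is complete

def blocking_read(n, rate):
    """Mimic a buffered read: block until n samples have accumulated."""
    global next_ready
    next_ready += n / rate
    delay = next_ready - time.monotonic()
    if delay > 0:
        time.sleep(delay)        # the read itself paces the loop
    return [0.0] * n             # placeholder for the acquired data

start = time.monotonic()
for _ in range(5):
    data = blocking_read(SAMPLES_PER_READ, SAMPLE_RATE)
elapsed = time.monotonic() - start
print(f"5 reads took {elapsed:.2f} s")
```

Five reads of 100 samples at 1000 Hz take about 0.5 s: the read call blocks until the data exists, so the loop automatically runs at (samples per read) / (sample rate) seconds per iteration with no extra wait needed.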
Can you post a pic of your diagram? Maybe I can spot something then (for all who don't have LV 7).