Multifunction DAQ


triggered acquisition for a determined time

I am using a PCI-6371 to trigger a continuous AI acquisition on 8 channels of an NI-4472 at 15 kS/s, while displaying the data and writing it to a file. I am currently using the Get Date/Time In Seconds function to decide when to exit the loop, but that timestamp is not accurate enough. I need to acquire data for a pre-determined amount of time with a high degree of accuracy. Is there a way to set up a timed AI acquisition, as opposed to a continuous acquisition in a loop, while keeping the display and file-writing aspects?
0 Kudos
Message 1 of 4
(2,937 Views)
Couldn't you get your precise timing by requesting the appropriate # of AI samples to read? For example, to break out of the loop once every 100 ms at 15 kS/s, request 1500 samples in your "Read" call. The call will return with your data at the desired time, and under DAQmx it also won't lock up your CPU while waiting.
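To illustrate the arithmetic behind that suggestion, here's a small Python sketch (just the numbers, no DAQ calls; the function name and parameters are my own, not from any NI API):

```python
# Derive the per-read sample count and total read count so that the
# blocking "Read" call itself paces the loop for a fixed-duration run.
def read_plan(sample_rate_hz, loop_period_s, total_time_s):
    samples_per_read = int(sample_rate_hz * loop_period_s)
    n_reads = int(round(total_time_s / loop_period_s))
    return samples_per_read, n_reads

# 15 kS/s, read every 100 ms, run for 180 s:
samples, reads = read_plan(15000, 0.1, 180.0)
print(samples, reads)  # 1500 samples per read, 1800 reads total
```

Because the Read call won't return until exactly 1500 samples have arrived, loop timing is governed by the sample clock on the board rather than by software timestamps.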

If your display/file update rate is low, you can probably dump all the data to both places and still make it back into the next iteration of "Read" in time. If you're updating in the tens or hundreds of Hz, you may need to get a little fancier.
Some ideas are to decimate the display data (no point sending 1000 data points to a graph that's 300 pixels wide), and to push data into a queue that a low-priority file-writing subVI can read at a slower rate, writing bigger data chunks each time.
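Here's a rough Python sketch of both ideas — decimation for display and a queue feeding a slower file writer. In LabVIEW you'd use a queue and a low-priority subVI; the function names and the 300-point limit below are my own illustrations:

```python
import queue
import threading

def decimate(data, max_points=300):
    """Keep at most ~max_points samples for display; a graph about
    300 pixels wide can't usefully show more anyway."""
    step = max(1, len(data) // max_points)
    return data[::step]

file_q = queue.Queue()

def file_writer(path):
    """Low-priority consumer: block on the queue and write big chunks,
    so disk formatting never stalls the acquisition loop."""
    with open(path, "w") as f:
        while True:
            chunk = file_q.get()
            if chunk is None:          # sentinel: acquisition finished
                break
            f.write("\n".join(str(x) for x in chunk) + "\n")

# Inside the acquisition loop you would do something like:
#   display(decimate(block))   # small, fast update for the graph
#   file_q.put(block)          # full-rate data, written later by the consumer
```

The key point is that the producer (acquisition loop) only ever does cheap work; all slow string formatting happens in the consumer at its own pace.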

Reply if you need more info.

-Kevin P.
ALERT! LabVIEW's subscription-only policy came to an end (finally!). Unfortunately, pricing favors the captured and committed over new adopters -- so tread carefully.
Message 2 of 4
(2,937 Views)
I am monitoring position sensors on an object that I move to 12 different positions, one every 15 seconds. I must catch all of the data, including the motion from one position to the next, for a total test time of 180 seconds. I am using the Cont Acq to Spreadsheet File.vi shipping example. I haven't been able to get all of the data any other way. Do you have an example? Thanks.

Dave
0 Kudos
Message 3 of 4
(2,937 Views)
Dave,

I don't have a particular example here, but looking at the shipping example you mentioned, here are a couple of quick things to try:

1. Increase your buffer size from the 2500-scan default. Give yourself a few seconds' worth of buffer; at 15 kS/s, 4 seconds' worth is a size of 60000.

2. Increase the # of scans to read (& display & file) at a time. The minimum I'd bother trying is 500, which at 15 kS/s corresponds to updating your chart 30 times a second. You may find you need to increase it from there, however.
Operating on larger chunks of data less frequently reduces the fraction of time spent on per-call overhead, such as formatting to a spreadsheet string and updating the chart.
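A toy Python model makes that overhead argument concrete. The fixed 2 ms per-call cost and 1 µs per-sample cost below are made-up illustrative numbers, not measurements:

```python
def overhead_fraction(overhead_per_call_s, per_sample_s, chunk_size):
    """Fraction of each loop iteration spent on fixed per-call overhead
    (chart update, format setup, etc.) for a given chunk size."""
    call_cost = overhead_per_call_s + per_sample_s * chunk_size
    return overhead_per_call_s / call_cost

# Illustrative costs: 2 ms fixed per call, 1 us of real work per sample.
for chunk in (100, 500, 5000):
    print(chunk, round(overhead_fraction(0.002, 1e-6, chunk), 3))
```

Whatever the real numbers are on a given machine, the shape is the same: the fixed cost is paid once per call, so bigger chunks amortize it over more samples.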

3. Consider writing to a binary file so you don't have to convert all that data
into a string format before writing to file.
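As a sketch of why binary wins, here's a Python comparison using the standard `array` module (the sample values and counts are made up for illustration):

```python
import array
import os
import tempfile

samples = [0.001 * i for i in range(15000)]   # one second of data at 15 kS/s

with tempfile.TemporaryDirectory() as d:
    # Spreadsheet-style text write: every sample is formatted as a string.
    txt_path = os.path.join(d, "data.txt")
    with open(txt_path, "w") as f:
        f.write("\n".join("%.6f" % s for s in samples))

    # Binary write: 8 bytes per float64 sample, no string conversion at all.
    bin_path = os.path.join(d, "data.bin")
    with open(bin_path, "wb") as f:
        array.array("d", samples).tofile(f)

    txt_size = os.path.getsize(txt_path)
    bin_size = os.path.getsize(bin_path)

print(bin_size, txt_size)
```

The binary file is a fixed 8 bytes per sample and skips the formatting pass entirely, which is usually the expensive part in the loop; you can always convert to spreadsheet format offline after the test.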

Hope these ideas are enough to get a trial version running for you.

-Kevin P.
0 Kudos
Message 4 of 4
(2,937 Views)