Timing of DAQmx Acquisition VI and data write

I am acquiring data from a DAQ rack using a PCI-6251 card. After acquiring a data set, I write it to a binary file. Then I do a small calculation on one channel value and apply a condition to it; the outcome is wired to the while loop surrounding the acquisition. So I (1) grab 100 points, (2) write, calculate, and apply the condition, and (3) start over. I stuck millisecond timers between steps 1, 2, and 3 to determine whether I was incurring much delay. If I have no delay and I'm sampling at 100 Hz, the differences between the millisecond timer values (collected into an array) should be

1000
0
0
1000
0
0
1000
0
0
etc.

This is usually the case. However, sometimes I get something like this:

900
100
0
900
100
0

or

980
20
0
980
20
0

So if I measure a delay as a result of the writing/calculating/conditioning, that time is subtracted from the acquisition time interval. I don't understand this at all; why would a delay after the acquisition speed up the acquisition? My only guess is that the writing/calculating/conditioning starts before the acquisition has finished, but why would it do that? I've attached my basic setup. Any help would be great.
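
For anyone who can't open the attachment, the loop is roughly this (a sketch in Python with the nidaqmx package rather than LabVIEW; the channel name, file name, and stop condition below are placeholders, not what my VI actually uses):

    import struct
    import time

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    RATE = 100    # Hz
    CHUNK = 100   # samples per read -> one read per second

    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0")  # placeholder channel
        task.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.CONTINUOUS)

        stamps = []
        with open("run.bin", "ab") as f:                  # placeholder file name
            while True:
                stamps.append(time.monotonic())           # timer before step 1
                data = task.read(number_of_samples_per_channel=CHUNK)      # step 1: acquire
                stamps.append(time.monotonic())           # timer between steps 1 and 2
                f.write(struct.pack(f"{len(data)}d", *data))               # step 2: write
                stamps.append(time.monotonic())           # timer between steps 2 and 3
                if max(data) > 5.0:                       # placeholder condition (step 3)
                    break

    print([round((b - a) * 1000) for a, b in zip(stamps, stamps[1:])])
    # Ideally: 1000, 0, 0, 1000, 0, 0, ...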
Message 1 of 6
I don't have LV on my network PC, so I can't look at your attachment. However, the behavior you report doesn't surprise me. The key to understanding it is realizing that when there's a delay caused by processing/writing, it's the *next* data acq time that is reduced, not the previous one.

The reason is that the data acq buffer is being filled by hw even while you do your processing/writing. So when you next ask to read a set of data, the hw already has a "head start," and the additional waiting time is reduced by the size of that head start. The hw read keeps your software synchronized by completing at exact 1000 ms intervals.

I'd predict that the very first measured data acq time will be independent of your processing/writing time, and that intervals 2 + 3 + (next) 1 will always sum to a constant. You can test this by adding small artificial delays in the processing/writing step.
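
You don't even need the hardware to see the mechanism. Here's a little simulation (plain Python, standing in for the LabVIEW loop) where a fake buffer fills at 100 Hz in the background while the loop sleeps to imitate your processing/writing delays:

    import time

    RATE, CHUNK = 100, 100                # 100 Hz, 100 samples per read -> 1 s per chunk
    t0 = time.monotonic()

    def produced():
        # Samples the simulated "hardware" has streamed into the buffer so far
        return int((time.monotonic() - t0) * RATE)

    consumed = 0
    marks = [time.monotonic()]

    for delay in (0.0, 0.1, 0.02, 0.1):   # artificial processing/writing delays
        # "Read" blocks only until the buffer holds CHUNK unread samples
        while produced() - consumed < CHUNK:
            time.sleep(0.001)
        consumed += CHUNK
        marks.append(time.monotonic())    # end of acquisition interval
        time.sleep(delay)                 # pretend to write/calculate/condition
        marks.append(time.monotonic())    # end of processing interval

    print([round((b - a) * 1000) for a, b in zip(marks, marks[1:])])
    # Prints something like [1000, 0, 1000, 100, 900, 20, 980, 100]:
    # each acquisition + the processing before it sums to ~1000 ms.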

Hope this helps,

-Kevin P.
Message 2 of 6
What is the "hw"? Does this mean that I don't have to worry about delay?
Message 3 of 6
You're right, actually: the first interval is always 1000 ms. What is the "hw"? Does this mean that I don't have to worry about delay?
Message 4 of 6
...what is the "hw?"

That's short for "hardware." Specifically, in a buffered acquisition, data is moved into your memory buffer at regular intervals governed by a hardware clock on your data acq board. Your software interacts with this memory buffer via 'Read' calls.
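
You can even watch the buffer fill while your program does something else. A small sketch using the nidaqmx Python package (channel name is a placeholder; avail_samp_per_chan reports how many unread samples the hw has already banked):

    import time

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0")  # placeholder channel
        task.timing.cfg_samp_clk_timing(100, sample_mode=AcquisitionType.CONTINUOUS)
        task.start()
        for _ in range(3):
            time.sleep(0.25)  # pretend to be busy processing/writing
            # The hardware kept streaming into the buffer while we slept:
            print(task.in_stream.avail_samp_per_chan)  # roughly 25, 50, 75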

...worry about delay?

That'll depend on your app. As you've described it, you read 1 second's worth of data all at once, then do some processing and storage. As long as you can do that processing and get back around the loop in less than one second, the processing is not a bottleneck. The loop rate is regulated by the amount of data you request and the time it takes to acquire it.

If this behavior is sufficient for you, then no, you don't have to worry about the delay.

Other apps may need to read smaller amounts of data and make decisions more often than once per second. Or they may need to read the most recent data immediately, without waiting for new samples, and then make a quick decision. Those sorts of apps might make you worry about the processing time.
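
For that last case, DAQmx lets you move the read position relative to the most recent sample so a Read returns right away with the newest data. A minimal sketch with the nidaqmx Python package (channel name assumed; property names per that package, untested here):

    import time

    import nidaqmx
    from nidaqmx.constants import AcquisitionType, ReadRelativeTo

    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0")  # placeholder channel
        task.timing.cfg_samp_clk_timing(100, sample_mode=AcquisitionType.CONTINUOUS)
        task.start()
        time.sleep(0.5)  # let the buffer accumulate some samples first

        # Point the read position at the most recent sample, backed up 10 samples,
        # so Read returns immediately with the newest data instead of waiting.
        task.in_stream.relative_to = ReadRelativeTo.MOST_RECENT_SAMPLE
        task.in_stream.offset = -10
        latest = task.read(number_of_samples_per_channel=10)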

-Kevin P.
Message 5 of 6
Thanks!
Message 6 of 6