
Data acquisition and postprocessing with DAQmx and Scope simultaneously

Hello dear community,

 

I'm programming a data acquisition tool that uses an NI PXIe-4492 (8-channel acquisition with DAQmx) and an NI PXI-5922 (2 channels with NI-SCOPE).

 

The settings and start/stop of each channel must be changeable while acquiring, so I implemented a producer/(multiple-)consumer structure. Since saving the acquired data to the hard drive is the most important job, I separated the acquiring & saving part (within one consumer loop) from the postprocessing part (FFT etc.).
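
To make this concrete, here is roughly what the structure does, sketched in Python with NI's nidaqmx package (I only show the DAQmx side here; the channel string, rate, chunk size, and file path are placeholders, not my real settings):

```python
# Rough sketch of the acquire&save / postprocess split; all names and
# numbers are placeholders, not the actual configuration.
import queue
import threading

import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

proc_q: queue.Queue = queue.Queue()  # acquiring loop -> postprocessing loop
stop = threading.Event()

def acquire_and_save():
    """Consumer 1: read from hardware and stream straight to disk."""
    with nidaqmx.Task() as task, open("data.bin", "ab") as f:
        task.ai_channels.add_ai_voltage_chan("PXI1Slot2/ai0:7")  # placeholder
        task.timing.cfg_samp_clk_timing(
            rate=102400.0, sample_mode=AcquisitionType.CONTINUOUS
        )
        task.start()
        while not stop.is_set():
            chunk = np.asarray(task.read(number_of_samples_per_channel=10240))
            chunk.tofile(f)    # saving has priority: write before anything else
            proc_q.put(chunk)  # hand a copy to the postprocessing loop

def postprocess():
    """Consumer 2: FFT etc., decoupled from the acquisition timing."""
    while not stop.is_set():
        try:
            chunk = proc_q.get(timeout=0.5)
        except queue.Empty:
            continue
        spectrum = np.fft.rfft(chunk, axis=-1)  # example postprocessing
        # ... update display, further analysis ...

threading.Thread(target=acquire_and_save, daemon=True).start()
threading.Thread(target=postprocess, daemon=True).start()
```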

 

Now my question:

- Is it wise to split the acquiring & saving part from the postprocessing part?

- I transfer the data from the acquiring loop to the postprocessing loop via queues. Is there a better / faster solution?

 

Thanks in advance!

Message 1 of 3

 


@WalterBaum wrote:

 

- Is it wise to split the acquiring & saving part from the postprocessing part?

- I transfer the data from the acquiring loop to the postprocessing loop via queues. Is there a better / faster solution?

 

Yes. But make sure you are writing as fast as you acquire. And a queue is the correct approach; you are on the right path.
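
One cheap way to check that (just a sketch against the Python outline above; the queue name and threshold are made up): watch the backlog of the save queue. If it grows steadily, the disk writer is falling behind the acquisition.

```python
# Hypothetical watchdog: if the backlog of chunks waiting to be written
# keeps growing, the writer is not keeping pace with the acquisition.
import time
import queue

def watch_backlog(q: queue.Queue, period_s: float = 1.0, limit: int = 100) -> None:
    while True:
        depth = q.qsize()  # approximate size, but good enough to see a trend
        if depth > limit:
            print(f"WARNING: {depth} chunks queued -- writer falling behind")
        time.sleep(period_s)
```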

-----

The best solution is the one you find by yourself
Message 2 of 3

I agree with P@Anand, you definitely want to use Producer/Consumer to leverage the parallelism that LabVIEW provides.  Unless your data rates are insanely high, there shouldn't be a problem streaming them to disk (whose disk, by the way -- the disk on the PXI's controller, or your PC's hard drive?).  You might well consider two P/C systems -- a "high-speed" one, perhaps based around an RT-FIFO, from DAQmx to Disk Spooler, and a (potentially) slower one that copies the data to your FFT/Processing loop.  If you think that you might overwhelm the latter loop, but could afford to "miss" some points (for example, because you're just updating a display), you could use a fixed-length Queue and do a Lossy Enqueue (or, better, process every other batch of data).  All depends on Data Rates and Data Needs.
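
In text form, a Lossy Enqueue on a fixed-length queue behaves roughly like this Python sketch (the queue size and names are just for illustration):

```python
# Sketch of a "lossy enqueue": a fixed-size queue where the oldest chunk
# is discarded when the consumer can't keep up, mimicking LabVIEW's
# Lossy Enqueue Element on a fixed-length queue.
import queue

display_q: queue.Queue = queue.Queue(maxsize=8)  # fixed-length queue

def lossy_put(q: queue.Queue, item) -> None:
    """Enqueue item; if the queue is full, drop the oldest element."""
    while True:
        try:
            q.put_nowait(item)
            return
        except queue.Full:
            try:
                q.get_nowait()  # drop oldest; the display can afford to miss it
            except queue.Empty:
                pass            # another thread emptied it; retry the put
```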

 

Bob Schor

Message 3 of 3