
DAQ Assistant vs. Producer-Consumer architecture

Hello all. I know my subject line is generic, but bear with me. I'm a grad student looking for perspective on best programming practices based on what I've observed. Basically, I think the way my research group does things is totally bogus, and I'm looking for some experts to back me up, or to tell me I'm in way over my head... haha.

 

So my group places the DAQ Assistant and the Write to Measurement File Express VIs in a while loop for all of our acquisitions. From what I've read, this architecture is very inefficient because the channels and the output file are reconfigured on every single loop iteration. Somehow, my group has managed to get data this way for a variety of applications.

 

Recently I used this style for a 1 Hz acquisition with two 9219 boards, measuring 4 channels on each. We don't have a nice CompactDAQ chassis, so each board is on a separate USB-9162 carrier. With this setup, the acquisition gives Error -200279 after about 1 hour, stating that the application is not able to keep up with the hardware acquisition. My advisor said he has had success running the DAQ Assistant for each 9219 board in two separate VIs to get around this problem, which made me cringe. I also believe that with two different USB-9162 carriers, the boards would be running on two separate clocks. Is that correct? That would mean they are not properly synchronized.
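To put a rough number on why the separate clocks worry me: two free-running oscillators drift apart over a long test. The 50 ppm figure below is just a generic crystal-oscillator ballpark I'm assuming for illustration, not the 9219's actual timebase spec (the datasheet would have the real number):

```python
# Back-of-the-envelope skew between two unsynchronized DAQ boards.
# ASSUMPTION: +/-50 ppm is a generic crystal accuracy, not the 9219's spec.
ppm_offset = 50e-6        # assumed worst-case relative clock error
test_duration_s = 3600    # one hour of acquisition
drift_s = ppm_offset * test_duration_s
print(f"worst-case skew after 1 h: {drift_s * 1e3:.0f} ms")  # 180 ms
```

So even at 1 Hz sampling, after an hour the two boards' samples could be misaligned by a noticeable fraction of a sample period, and the error only grows with test length.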

 

Instead, I wrote a program that uses a producer-consumer architecture. I make as many measurements as I can on a 6211 multifunction board (which has more channels than a single 9219) using hardware timing, with all signals put into the queue. Then, in the consumer loop, I call a Read VI on a separate task set for on-demand timing on the 9219. I observe that if I try to gather more samples with on-demand timing, or even add additional display charts in the consumer loop, the queue with the 6211 signals backs up. This means the data gathered on the 6211 slowly lags behind the "real-time" events happening in our test. However, it runs very nicely with no backlog if I reduce the samples requested through on-demand timing and keep the display charts to a minimum. Am I on the right track in assuming these two factors could cause the queue to back up?
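To make the pattern concrete (sketched in Python, since LabVIEW diagrams don't paste into a post): the producer stands in for the hardware-timed DAQmx Read, which fills the buffer at a fixed rate whether or not anyone is listening, and the consumer's `extra_work_s` stands in for the slow parts, like the on-demand 9219 read and the chart updates. All names here are made up for illustration:

```python
import queue
import threading
import time

data_q: "queue.Queue[list[float]]" = queue.Queue()

def producer(n_blocks: int, block: list[float]) -> None:
    # Stand-in for the hardware-timed DAQmx Read on the 6211: blocks
    # arrive at a fixed pace regardless of how busy the consumer is.
    for _ in range(n_blocks):
        data_q.put(block)
        time.sleep(0.001)           # hardware-timed pacing (scaled down)

def consumer(results: list[float], extra_work_s: float) -> None:
    # Stand-in for the consumer loop: dequeue a block, then do the slow
    # work (on-demand read, charts, file write). If extra_work_s exceeds
    # the producer's period, the queue backs up -- exactly the symptom.
    while True:
        try:
            block = data_q.get(timeout=0.5)
        except queue.Empty:
            return                  # producer finished and queue drained
        results.append(sum(block) / len(block))
        time.sleep(extra_work_s)

results: list[float] = []
p = threading.Thread(target=producer, args=(50, [1.0, 2.0, 3.0]))
c = threading.Thread(target=consumer, args=(results, 0.0))
p.start(); c.start()
p.join(); c.join()
print(len(results), data_q.qsize())  # 50 blocks consumed, queue empty
```

With `extra_work_s = 0.0` the consumer keeps up and the queue stays empty; raise it above the producer's period and `data_q.qsize()` climbs steadily, which is the same lag I see on the 6211 signals.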

 

I guess there were two questions in there. Would you do anything differently? Is this method (hardware timing on the 6211 plus on-demand timing on the 9219) any better than running unsynchronized hardware timing separately on the two boards? At first I thought I would do everything on the 6211 board, but we are measuring two thermocouples, and the cold-junction compensation built into the 9219 is needed to get accurate temperature readings.

 

I'm attaching my files here too; the main program is in "main.vi". I hope everything looks OK, especially moving the on-demand timing task into the consumer loop. Does it make sense to put it there?

 

The last catch-all I put in is a Flush Queue on the 6211 signals if the queue ever reaches a length of 5 elements. At my 1 Hz hardware-timed settings, that is 5 s of data. For our application it is OK to lose this data; it is better to prevent the backlog from getting too large, because there are relatively sudden events that happen after a long time.
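In Python terms, the catch-all is just this (function and threshold names are mine, and `queue.Queue` stands in for the LabVIEW queue):

```python
import queue

def flush_if_backlogged(q: "queue.Queue", max_len: int = 5) -> int:
    """Discard everything in the queue once it reaches max_len elements.

    Returns how many elements were thrown away. At a 1 Hz block rate,
    max_len = 5 means at most ~5 s of data is sacrificed to keep the
    consumer near real time.
    """
    dropped = 0
    if q.qsize() >= max_len:
        while True:
            try:
                q.get_nowait()
                dropped += 1
            except queue.Empty:
                break
    return dropped

q: "queue.Queue[int]" = queue.Queue()
for i in range(6):          # simulate a 6-element backlog
    q.put(i)
print(flush_if_backlogged(q), q.qsize())  # prints "6 0"
```

One caveat I'm aware of: this trades data loss for latency, so it only makes sense because, as I said, losing those 5 s is acceptable for our application.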

 

Thanks all for reading my long post 🙂

 

Jeff

Message 1 of 16

P.S. I'm also realizing that all this would be way easier if we just used the two 9219 boards in a CompactDAQ chassis with multiple slots, i.e. the way they are intended to be used!

 

But we have to face the fact that money is tight, and our group is always Frankensteining these things together... heh.

Message 2 of 16

I am unable to open your main.vi completely.  In particular, I cannot shift the diagram to the right so that I can see the beginning of the DAQmx code (LabVIEW puts a big red X in the upper-right corner and Microsoft says "LabVIEW 20.0.1 Development System is not responding").  I can see all of the Front Panel, but when I try to view the Block Diagram to see what comes before the DAQmx Create Channel function, LabVIEW stops responding.

 

Your code is puzzling.  It looks like you are sampling something (I can't tell how many channels) at 10 kHz, taking 1000 samples at a time.  You bundle each sample with Samples/Channel (why?) and put it into a Queue.  So far, so good.

 

I didn't carefully examine your Consumer loop.  Assuming you have a 10 kHz sampling rate, you need to be able to process 10,000 samples each second.  Have you timed your Consumer loop to see if it can handle that rate?
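A quick way to answer that question yourself: time the consumer's work on one block and compare it against the time that block represents. In Python pseudocode (the processing here is a made-up stand-in for whatever your consumer actually does):

```python
import statistics
import time

SAMPLE_RATE = 10_000    # 10 kHz, from your acquisition settings
BLOCK_SIZE = 1_000      # samples per Read, i.e. 100 ms of data per block
BUDGET_S = BLOCK_SIZE / SAMPLE_RATE   # consumer must finish within 0.1 s

def process_block(block: list[float]) -> tuple[float, float]:
    # Stand-in for the real consumer work (your mean/median, displays).
    return statistics.mean(block), statistics.median(block)

block = [float(i % 7) for i in range(BLOCK_SIZE)]
t0 = time.perf_counter()
for _ in range(100):                  # average over many runs
    process_block(block)
per_iter = (time.perf_counter() - t0) / 100

print(f"consumer: {per_iter * 1e3:.2f} ms/block, budget {BUDGET_S * 1e3:.0f} ms")
print("keeping up" if per_iter < BUDGET_S else "will back up")
```

If the per-block time is anywhere near the 100 ms budget, the queue will eventually back up. In LabVIEW you can get the same measurement by wrapping the consumer loop's contents with two Tick Count readings.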

 

Has a competent LabVIEW developer ever seen this code in action?  It looks very strange to me.  Are you writing out a very small subset of your data?  How many channels are you acquiring?  Why are you acquiring data at such a high rate, only (apparently) to average it and compute Mean and Median (and why both)?

 

Bob Schor  

Message 3 of 16

Producer-consumer should never be used for DAQmx acquisition unless you have specific needs. DAQmx has built-in TDMS logging that is highly efficient and easy to implement in code. If you need a different file format, post-process the data, use the free Excel plugin for TDMS, etc.

 

mcduff

Message 4 of 16

Oh, sorry you couldn't open the file properly, Bob. I believe I uploaded a 2016 version, although I could save a 2020 version if that helps.

 

I attended a 3-day LabVIEW bootcamp 7 years ago, but I have never managed to work with anyone truly well versed in standard programming techniques!

 

I thought producer-consumer was a recommended architecture? I am logging to a TDMS file in the consumer loop.

 

I'm measuring 6 channels on the 6211 at 10 kHz. The averaging was added to cut down on noise in the readings. Interestingly the median was even better when we had some bizarre short-burst power spikes in our field van, where the DAQ was set up on a construction site.
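A tiny example of the effect we saw (in Python, with made-up numbers rather than our actual data): a single short spike in a block barely moves the median but drags the mean well off the true level.

```python
import statistics

# One block of a steady 2 V signal with one short power spike in it.
block = [2.0] * 99 + [120.0]     # one outlier out of 100 samples

print(statistics.mean(block))    # 3.18 -- the spike drags the mean up
print(statistics.median(block))  # 2.0  -- the median ignores it
```

That robustness to outliers is why the median held up better during the power spikes in the van.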

 

I am monitoring the number of elements in the queue as it dumps from the DAQmx Read in the producer loop into the consumer loop. The elements accumulate especially when I add more display charts and/or ask for more on-demand samples from the 9219. However the main.vi I uploaded does not accumulate elements, which I think means the consumer loop is keeping up fine now.

 

I hope that clarifies things. I was a bit concerned that I do things unusually, but are there any specific things I should change? Should I just do all the tasks and logging in one loop?

Message 5 of 16

Look in the Example Finder for TDMS logging with DAQmx.

 

mcduff

Message 6 of 16

Here's my main.vi saved as a 2020 file, in case anyone's interested.

 

I added a control, "Log to File Every N Loop Iterations", so the user can control how frequently data is logged to the TDMS file. My advisor says the tests will run for months!
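The logic behind that control is just a decimation counter. Sketched in Python (names are mine, not from main.vi):

```python
def should_log(iteration: int, n: int) -> bool:
    # Log on iterations 0, n, 2n, ...; with 1 s loop timing and n = 60,
    # this writes one record per minute over a months-long test.
    return iteration % n == 0

# 600 loop iterations (~10 min at 1 Hz) with n = 60:
logged = [i for i in range(600) if should_log(i, 60)]
print(len(logged))   # 10 records instead of 600
```

The data that isn't logged is still acquired and displayed; it just never reaches the file.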

 

I decided I'm happy with how the on-demand timing is working for the thermocouple measurements on the 9219. I ran this program over the weekend, and there was no backlog in the queue with the 6211 signals.

 

Really, the on-demand timing performance was what I was most curious about -- that, and whether it is worth doing all this to get away from using the DAQ Assistant + Write to Spreadsheet VI.

Message 7 of 16

@mcduff wrote:

Producer Consumer should never be used for DAQmx acquisition unless you have specific needs. DAQmx has built in TDMS logging that is highly efficient and easy to implement into code. If you need a different file format, post-process the data, use the free Excel plugin for TDMS, etc.

 

mcduff


I'd add the caveat "... if you're logging your data to disk." I rarely, if ever, need to log during acquisition. I generally need to sample data and display it on the screen, and use producer-consumer all the time. Generally I need to run a test, do some math on the data, and display it to the user. If they like what they see, they can choose to save a test run to disk. Otherwise they might tweak a setting here or there and run the test again to see if the part is behaving the way it should. Generally my users save maybe 10% of the data they acquire since they're using the software to view their tuning results after each tuning attempt.

 

Now I also want to be clear that my tests are considered "long" if they run for more than 3 minutes, not 3 months. For a 3 month test I'd definitely look at TDMS logging.

Message 8 of 16

@BertMcMahan wrote:

@mcduff wrote:

Producer Consumer should never be used for DAQmx acquisition unless you have specific needs. DAQmx has built in TDMS logging that is highly efficient and easy to implement into code. If you need a different file format, post-process the data, use the free Excel plugin for TDMS, etc.

 

mcduff


I'd add the caveat "... if you're logging your data to disk." I rarely, if ever, need to log during acquisition. I generally need to sample data and display it on the screen, and use producer-consumer all the time. Generally I need to run a test, do some math on the data, and display it to the user. If they like what they see, they can choose to save a test run to disk. Otherwise they might tweak a setting here or there and run the test again to see if the part is behaving the way it should. Generally my users save maybe 10% of the data they acquire since they're using the software to view their tuning results after each tuning attempt.

 

Now I also want to be clear that my tests are considered "long" if they run for more than 3 minutes, not 3 months. For a 3 month test I'd definitely look at TDMS logging.


Once you set up TDMS logging, it is not an always-save operation; a simple DAQmx property node is all you need. The case structure below is what I use in my programs when the user hits the save button. (I also give the option of log only, that is, saving without seeing the data. This mode is the most efficient.)

 

Turn logging off, that is, don't save data. But you can still display it, no problem.

Save the data. Log Only saves it without displaying it; Log and Read will save and display.
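In Python-flavored pseudocode, the branching looks like this (the enum values mirror the three DAQmx Logging.Mode settings I described; everything else is made up to show the idea, and the lists stand in for the TDMS file and the front-panel chart):

```python
from enum import Enum

class LoggingMode(Enum):
    # Mirrors the three DAQmx Logging.Mode property values.
    OFF = "off"                     # acquire and display, write nothing
    LOG = "log"                     # stream to TDMS, skip display (fastest)
    LOG_AND_READ = "log_and_read"   # stream to TDMS and return data to read

def handle_block(mode: LoggingMode, block: list[float],
                 tdms: list[list[float]], chart: list[list[float]]) -> None:
    # Each acquired block goes to the file, the display, or both,
    # depending on the current mode.
    if mode in (LoggingMode.LOG, LoggingMode.LOG_AND_READ):
        tdms.append(block)
    if mode in (LoggingMode.OFF, LoggingMode.LOG_AND_READ):
        chart.append(block)

tdms: list[list[float]] = []
chart: list[list[float]] = []
handle_block(LoggingMode.OFF, [1.0], tdms, chart)
handle_block(LoggingMode.LOG, [2.0], tdms, chart)
handle_block(LoggingMode.LOG_AND_READ, [3.0], tdms, chart)
print(len(tdms), len(chart))   # prints "2 2"
```

In the real program you don't route the data yourself, of course; DAQmx streams to the TDMS file internally, and the property node just switches the mode at run time.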

 

@OP When you have time I highly recommend refactoring your code. Right now, it is hard to debug and add features when needed.

 

mcduff

Message 9 of 16

@mcduff wrote:

Producer Consumer should never be used for DAQmx acquisition unless you have specific needs. DAQmx has built in TDMS logging that is highly efficient and easy to implement into code. If you need a different file format, post-process the data, use the free Excel plugin for TDMS, etc.

 

mcduff


But this only applies when you're logging, right? Wouldn't you need to use a similar architecture if you wanted to display data to the user in "real time"?

 

In my case, some people didn't want to switch to RT or PLCs, so they wanted all control and acquisition to be done through DAQmx. Of course, it's not like that was ideal...

Message 10 of 16