Advanced producer consumer architecture

Dear all,

 

I am working on an application in which I have to process large amounts of data obtained from a data acquisition system (blocks of ~ 30MB coming in at a rate of 5 times per second). There's quite a bit of hardware involved (2 synced data acquisition boards, 5 DAQmx USB interfaces, etc.). The data processing involves a lot of FFTs and interpolations and is computationally intensive. 

 

I already have a first version running, but I would like to strip it from all the unnecessary stuff that was added for testing purposes and optimize it for performance.  

 

Even though I have a good idea of the possibilities (or like to think I have), I'm struggling with many questions and am a bit lost as to how to tackle this problem. I was hoping you could point me in the right direction.

 

First of all: The data packages come from 11 channels, 1.25 MS each, 2 bytes per sample, and are (currently) converted to DBLs for further processing. I have a powerful workstation with 2 CPUs (Xeon 5660, 6 cores each) and 24 GB of RAM, but I'm not sure the data processing can keep up with the data acquisition. I figured a producer/consumer architecture would be the best approach to this problem. Is this correct?

 

Secondly: What is the best way to handle these large data sets? I have used lossy queues so far (to prevent memory overflow), but I was thinking of using queues in combination with DVRs, so I can process the data in its own memory space, without creating copies.
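Since LabVIEW code is graphical, here is only a rough textual analogy of the lossy-queue idea, sketched in Python (all names invented for illustration): when the bounded queue is full, the oldest block is dropped before the new one is enqueued, so a slow consumer can never cause unbounded memory growth.

```python
import queue

def lossy_put(q: "queue.Queue", item) -> None:
    """Enqueue on a bounded queue; when it is full, drop the oldest
    element first. This mimics a lossy enqueue: the producer never
    blocks and memory use is capped by the queue's maxsize."""
    while True:
        try:
            q.put_nowait(item)
            return
        except queue.Full:
            try:
                q.get_nowait()  # discard the oldest block to make room
            except queue.Empty:
                pass  # another thread emptied it meanwhile; retry

# A bound of e.g. 8 blocks caps buffering at ~240 MB for 30 MB blocks.
acq_queue = queue.Queue(maxsize=8)
```

The consumer simply calls `get()` in its own loop; combining this with references to the data (rather than the data itself, as in the DVR idea above) keeps the queued items small.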

 

Thirdly: Currently I see large swings in CPU usage. Is it possible to somehow dynamically direct more processing power to the data processing if the system has idle threads?

 

Fourth: Each process (consumer & producer) needs configuration at the start of the program, but preferably changes to this configuration can be made at run-time. The producer requires settings for the data acquisition system and peripheral hardware used in the system. The consumer requires some files to be loaded and calculations to be performed on these files, before the actual data processing can begin.

 

At the moment, both the producer and consumer loop run a state machine with, in the Idle state, an Event Case. The Event Cases handle changes to settings or the loading of files if the appropriate buttons on the front panel are pressed. This results in a huge front panel with many different controls. I doubt that the huge front panel, and especially the Event Cases in both the producer and consumer loops, are best practice. Is it possible to handle all this in, for example, a separate GUI thread, which opens the appropriate configuration panel if requested from a menu bar and performs the necessary actions?

And if this is possible: how do I make sure the settings become available in the appropriate process? And how do I get access to handles to my data acquisition hardware in both the GUI and producer loop without wires going all over the place?

 

I think I will leave it at this for now; more questions might come up later. I'm not looking for complete solutions, but a couple of pointers in the right direction would be very welcome. Hope you can help.

 

Thanks in advance!

 

Milan

 

 

Message 1 of 5

I assume you use a tab control on the front panel? If you want to eliminate the "one time only" input panels, why not move them to subVIs, and have the subVI pop up as a separate window when you e.g. click the "edit DAQ settings" button? Put the DAQ task in a queue and pass it where it needs to be passed? 

Best regards,

Jarle Ekanger, MSc, PhD, CLD
Flow Design Bureau AS

- "The resistance of wires in LabVIEW is not dependent on their length."
Message 2 of 5

Yes, everything you asked about is possible. A producer/consumer architecture would work well. You may also want to look at using an actor framework. If you have LV 2012 there is a nice example of an application using the actor framework. If you don't have 2012 you can still use it. The LAVA forums have lots of good discussions on the actor framework.

 

Answers to your more specific questions: Specifying which threads to run things on can be accomplished via VI properties. Under the execution options there is an option to specify the preferred execution thread. The default is "Same as caller", which generally means things will run on the UI thread. You can move your critical code to other threads as well as specify the priority of the VI.

 

With the large data sets you can use DVRs and post those to the queue. You will have to manage obtaining and releasing them so you don't end up with a memory leak. But this is a good method for passing large data sets around.
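As a loose Python analogy of that obtain/release discipline (class and method names are invented for illustration, since DVRs themselves are LabVIEW constructs): only a small reference number travels through the queue, and the consumer must explicitly release it after processing.

```python
class RefRegistry:
    """Toy stand-in for passing DVRs through a queue: hands out
    integer 'refnums' to shared buffers. Only the refnum is queued;
    the ~30 MB buffer itself is never copied."""

    def __init__(self):
        self._live = {}   # refnum -> buffer; entries here hold memory
        self._next = 0

    def create(self, buffer) -> int:
        """Producer side: register a buffer, get a refnum to enqueue."""
        self._next += 1
        self._live[self._next] = buffer
        return self._next

    def read(self, refnum):
        """Consumer side: access the shared buffer by refnum."""
        return self._live[refnum]

    def release(self, refnum) -> None:
        """Consumer side, after processing. Forgetting this call is
        exactly the memory leak mentioned above."""
        del self._live[refnum]
```

The producer would `create()` each block and enqueue the refnum; the consumer `read()`s, processes, and then `release()`s it.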

 

I would create a dedicated task that does nothing but your data acquisition. Make it as lean and mean as possible. Just get the data and send it off to another task. You may also want to separate your processing from your UI as well.

 

Your UI can contain the stuff for configuring the system. If you want the user to be able to modify it at run-time, using tabs to organize your front panel is a good approach. You can also use a dialog box approach. You will need some type of messaging to your other tasks (queues, notifiers, user events) to pass configuration updates to them.
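A rough textual sketch of that messaging idea (Python stand-in, all names hypothetical): the acquisition task polls a command queue once per iteration and applies any settings update itself, so the UI never touches the hardware directly and the acquisition loop never blocks waiting on the UI.

```python
import queue

def acquisition_loop(cmd_q: "queue.Queue", n_iterations: int) -> list:
    """Hypothetical acquisition task: each iteration it first polls
    the command queue for configuration updates, then 'acquires'
    using the current settings. The UI task only enqueues messages."""
    settings = {"timebase": 1.0}            # invented default config
    readings = []
    for _ in range(n_iterations):
        try:
            cmd, payload = cmd_q.get_nowait()   # poll; never block DAQ
            if cmd == "update_settings":
                settings.update(payload)
        except queue.Empty:
            pass
        readings.append(settings["timebase"])   # stand-in for a read
    return readings
```

The UI side would do something like `cmd_q.put(("update_settings", {"timebase": 2.0}))`; because only the owning task applies the change, the hardware handles stay local to one loop.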

 

The key to getting the performance you want is to divide and conquer. Create dedicated tasks for the different types of processing you need. For example, a GUI task that does nothing but deal with the UI. Processing tasks to do stuff with the data you get and the acquisition task which does nothing but read data. This will allow more parallel processing to occur and will also let you give priority to specific parts of your system.



Mark Yedinak
Certified LabVIEW Architect
LabVIEW Champion

"Does anyone know where the love of God goes when the waves turn the minutes to hours?"
Wreck of the Edmund Fitzgerald - Gordon Lightfoot
Message 3 of 5

Hi Jarle, Mark,

 

Thanks for your input so far, I get the impression I'm not far off and will make a start setting up the new application.

 

However, I'm still wondering about the best practice for handling hardware settings.

 

You suggested a task whose only job is performing the acquisition. I believe this would be a better solution indeed.

 

But if I do this, where do I process changes to the hardware settings (e.g. if I change the timebase at run-time)? Currently, a GUI process handles front panel changes (detects changes to the time-base settings etc.) and then sends a command to the appropriate (in this case producer) loop to go to a special state to change the settings.

 

Wouldn't it be better to keep it simple and have the GUI process (loop) make the changes directly?

 

This would require the device handles to be available in both the GUI loop and the producer loop... and I don't know how to realize this.

 

Is this indeed the better solution? And how do I make the device handles available in both loops, and keep them up-to-date should they change?

 

Thanks,

 

Milan

Message 4 of 5

I would keep them separate and use messaging to indicate the configuration changes. Each of your loops should be focused on a particular task. Things can start to get messy when you spread functionality around. Also, keeping them separate makes it easier for you to change the UI, run headless, or change your processing tasks without impacting other parts of your system.



Mark Yedinak
Certified LabVIEW Architect
LabVIEW Champion

"Does anyone know where the love of God goes when the waves turn the minutes to hours?"
Wreck of the Edmund Fitzgerald - Gordon Lightfoot
Message 5 of 5