LabVIEW


Best Way to Handle Datalogging between Parallel Loops?

Solved!

So the producer/consumer loop would be within the subVI loop, producing (reading) the data and consuming (saving) it to an independent file? Then each subVI loop would have its own data-log file, and those files would be combined into a single file by a completely separate loop from everything just described? Do I have that framework correct?

Your subVIs would be producers. You would have another loop (or subVI) in your main VI which would be the consumer, receiving data from both (all) producers and logging it to file.
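The architecture described above (multiple producer subVIs feeding one logging consumer through a shared queue) can be sketched in Python for illustration; LabVIEW's named queues behave much like `queue.Queue` here, and all names (`producer`, `consumer`, the tuple layout) are illustrative, not from the original VIs.

```python
import queue
import threading

# One shared queue plays the role of a LabVIEW named queue:
# each producer subVI enqueues samples, a single consumer logs them.
log_queue = queue.Queue()
SENTINEL = None  # tells the consumer that all producers are finished

def producer(channel, n_samples):
    """Stands in for a data-acquisition subVI loop."""
    for i in range(n_samples):
        log_queue.put((channel, i, i * 0.5))  # (channel, index, value)

def consumer(path):
    """Stands in for the single logging loop in the main VI."""
    with open(path, "w") as f:
        while True:
            item = log_queue.get()
            if item is SENTINEL:
                break
            channel, idx, value = item
            f.write(f"{channel}\t{idx}\t{value}\n")

threads = [threading.Thread(target=producer, args=(ch, 100)) for ch in ("A", "B")]
for t in threads:
    t.start()
consumer_t = threading.Thread(target=consumer, args=("datalog.txt",))
consumer_t.start()
for t in threads:
    t.join()
log_queue.put(SENTINEL)  # all producers done; release the consumer
consumer_t.join()
```

The key point is that only the consumer ever touches the file, so the producers never block on disk I/O.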

Message 11 of 16

It's probably already mentioned, but I'd have a separate logging queue and simply queue up log messages from wherever I need.

/Y

Message 12 of 16

I think I understand it now, and I updated the code so that a single consumer in the main loop logs data from the (now) subVI. The issue I'm having is that my simulated data is produced so fast that the consumer seemingly can't keep up.

 

If I run a 10-second test, the controls/graph on the front panel will complete, but if I kill the VI and check the datalog, only about 2 seconds' worth of data has been logged. I examined the dequeue side of the queue and found that although the data finishes generating in 10 seconds, it seemingly takes minutes for all of it to dequeue into the main VI. So if I wait a long time before killing the main VI, it will log all the data appropriately.

 

I've read up on ways to "batch dequeue" the data into arrays that are then logged to a spreadsheet, but I can't seem to get that to work. I always run into conflicts between the array of batched data and what the Write To Spreadsheet function is expecting.
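The "batch dequeue" idea is to drain whatever has accumulated in the queue into one array and do a single formatted write per batch, instead of one write per element. A minimal sketch in Python, assuming tab-delimited rows (the `drain` helper and the 1000-element batch size are illustrative):

```python
import queue

# Simulate a backlog of 2500 (index, value) samples waiting in the queue.
q = queue.Queue()
for i in range(2500):
    q.put((i, i * 0.1))

def drain(q, max_items=1000):
    """Dequeue up to max_items without blocking: one batch per call."""
    batch = []
    while len(batch) < max_items:
        try:
            batch.append(q.get_nowait())
        except queue.Empty:
            break
    return batch

rows = []
while True:
    batch = drain(q)
    if not batch:
        break
    # Format the whole batch as one tab-delimited block, matching the
    # row-per-sample layout a spreadsheet-style write expects.
    rows.append("\n".join(f"{i}\t{v:.3f}" for i, v in batch))
text = "\n".join(rows)  # in the real consumer, write this in one call
```

Matching the batch's shape to the row-per-sample layout is exactly where the "array conflicts" tend to arise: the formatter wants a 2D array (rows of samples), not a 1D array of batches.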

Message 13 of 16

Here's a suggestion -- until you can get one channel working properly, set aside the two-channel mess and get a single-channel (one QMH) version working.  Did you notice that the Simulate Signal Express VI (oh, how I despise Express VIs, they lure gullible students into "letting others do the thinking for them") has a setting to "Run as fast as possible"?  This means that it will do its best to overwhelm the rest of your code, particularly if you use Wait functions to slow loops down.

 

I like your "wire design" -- particularly the Dynamic Data wire (another Pet Peeve that mainly serves to derail LabVIEW beginners; avoiding them is almost always a good idea) from your first Simulate Signal VI that goes underneath the Case structure of the QMH, up the right side, over the top, and down to a tunnel on the input side, making a nice almost-square around the Case. So artistic! Straight wires going from left to right are so boring -- though you might be able to glance at the code and see how the "data flows" if you kept your wires straight and parallel ...

 

I don't see a Producer/Consumer design here.  I tested your Simulate Signal function -- it generates 1000 sets of 100 points in 50 milliseconds, or 100,000/0.05 = 2 million points/second.  No wonder you can't keep up!
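The arithmetic behind that estimate is worth making explicit, since it explains why the consumer falls behind:

```python
# Measured by Bob Schor: 1000 sets of 100 points in 50 milliseconds.
sets = 1000
points_per_set = 100
elapsed_s = 0.05

total_points = sets * points_per_set   # 100,000 points per burst
rate = total_points / elapsed_s        # points per second the consumer must absorb
```

At that rate, any consumer doing a file open/write/close per sample has no chance of keeping pace.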

 

Learn to start small, and test often.

 

Bob Schor

Message 14 of 16

My apologies, I accidentally posted old code; no wonder you were so critical! The Station State Machine has a Wait function, which slows down production enough that the consumer can catch up, as you pointed out.

 

You're right, the Simulate Signal is overwhelming the code. It's my theoretical peak acquisition speed, so in reality I may never hit it, but I wanted to understand the best way to handle the consumer being overloaded, as opposed to avoiding the overload by slowing down production. As of now, I think I have a framework in which everything for the single channel works as intended, except for being able to consume fast enough to keep up with unrestricted production.

Message 15 of 16
Solution
Accepted by Glibby1234

@Glibby1234 wrote:  but I did want to understand the best way to handle the consuming overload as opposed to way to handle it (slowing down production).

The other method is to speed up your consumer.  In your case, you should stop using the Write To Delimited Text File.  It opens, writes, and closes the file each time it is called.  That constant opening and closing is SLOW.  Instead, open the file before your consumer loop and close it after the consumer loop.  Inside the loop, use Array To Spreadsheet String to format your data and Write To Text File to write it.
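The same open-once/write-many principle can be sketched in Python (file names and row shapes are illustrative). The slow pattern mirrors Write To Delimited Text File, which reopens the file on every call; the fast pattern mirrors opening before the consumer loop and closing after it:

```python
# Slow pattern, analogous to calling Write To Delimited Text File in a loop:
# every single call opens, appends, and closes the file.
def log_slow(path, row):
    with open(path, "a") as f:
        f.write("\t".join(map(str, row)) + "\n")

# Fast pattern: open once before the consumer loop, write inside it,
# close once after -- analogous to Array To Spreadsheet String followed
# by Write To Text File on an already-open file refnum.
def log_all(path, rows):
    with open(path, "w") as f:      # open once, before the loop
        for row in rows:
            f.write("\t".join(f"{x:.3f}" for x in row) + "\n")
    # file closes exactly once, after the loop

log_all("fast_log.txt", [(i * 0.1, i * 0.2) for i in range(5000)])
```

Moving the open/close outside the loop removes two filesystem operations per sample, which is usually the difference between a consumer that drowns and one that keeps up.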


Message 16 of 16