consumer loop not synced with producer? getting less data than I should

Attached is the VI, and a sample file of the data (albeit much, much smaller). Basically I have a bunch of DAQ channels, and occasionally, if something interesting happens, I want to make a note of it in the description box and place an event marker in the data file. This marker number will tick up and record a value in every cell of the event number column for that particular event. The event description column will only place an event description and time stamp in the cell at the start of the new event; the rest should be empty strings. My problem is that I am getting 1/10 the data for the event markers (pretty much anything from the producer loop that isn't my data does this) compared to the actual data. Why is this happening, and how can I correct it so they are in sync?

 

The part to look at in the data file is the last two columns in Excel. If you look at the time stamp of the events, you can see that they aren't correct for that data line. And in the collection summary on the first page I have far less data for the event writes than I do for the regular data coming in.

Message 1 of 15

You have the classic situation where your consumer loop is slower than your producer.  This is quite common when it comes to logging to disk.  The reason you are missing data is because you are killing your queue when there is still data in it.  When you destroy the queue, all of the data still in the queue is lost.

 

First thing I would do is redesign what you pass into your queue.  Make it a cluster with the data that you can just write directly into your Write TDMS.

 

Secondly, use just one Write TDMS.  This will make the writing to disk faster, therefore not slowing down your consumer loop so much.

 

Finally, come up with a way to tell the consumer loop that it is done logging.  In this situation, I would send an empty array.  So where you currently have the release queue, enqueue an empty array.  Then you can check for the empty array in the consumer loop.  If it is empty, you stop the loop.  Release the queue once the consumer loop is complete.
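
To make that concrete, here is a rough sketch of the same pattern in Python rather than LabVIEW (the names and the print placeholder are only illustrative): one queue carrying a single "cluster" of everything to log, with an empty record enqueued as the shutdown signal instead of destroying the queue while data may still be in it.

import queue
import threading
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogRecord:                      # plays the role of the LabVIEW cluster
    waveforms: List[float] = field(default_factory=list)
    events: List[str] = field(default_factory=list)

q = queue.Queue()

def producer():
    for i in range(10):
        q.put(LogRecord(waveforms=[float(i)], events=["event %d" % i]))
    q.put(LogRecord())                # empty record = "done logging", instead of releasing the queue here

def consumer():
    while True:
        rec = q.get()
        if not rec.waveforms:         # the "Empty Array?" style check
            break                     # stop only after the queue has been drained
        print("logging", rec)         # placeholder for the single write to disk
    # release/close the queue here, once the consumer loop is complete

p = threading.Thread(target=producer)
c = threading.Thread(target=consumer)
p.start(); c.start()
p.join(); c.join()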


Message 2 of 15

@crossrulz wrote:

You have the classic situation where your consumer loop is slower than your producer.  This is quite common when it comes to logging to disk.  The reason you are missing data is because you are killing your queue when there is still data in it.  When you destroy the queue, all of the data still in the queue is lost.

 

First thing I would do is redesign what you pass into your queue.  Make it a cluster with the data that you can just write directly into your Write TDMS.

 

Do you mean use the Bundle function to create a cluster of everything going into the Enqueue Element blocks? How can I mix the string entries with the waveform ones? Does a cluster allow this, or will I still have two separate queues?

 

Secondly, use just one Write TDMS.  This will make the writing to disk faster, therefore not slowing down your consumer loop so much.

 

Similar question: how can I have both string and waveform data writing to TDMS? Also, is it better to have the time component of the waveform functions in the producer or the consumer loop?

 

Finally, come up with a way to tell the consumer loop that it is done logging.  In this situation, I would send an empty array.  So where you currently have the release queue, enqueue an empty array.  Then you can check for the empty array in the consumer loop.  If it is empty, you stop the loop.  Release the queue once the consumer loop is complete.

 

I'm confused by this. Do you mean the Release Queue outside the producer while loop? I would replace that with another Enqueue Element and wire an empty waveform array to it. How do I tell the consumer loop to check for empty arrays and stop?

 


 

Message 3 of 15

1 & 2.  I would use a cluster with two elements: the first an array of timestamps and the second an array of strings.  OK, so you need to use 2 Write TDMS calls, but that is still a lot better than 8.

 

3. Yep, the release queue right after the producer.  Replace that with an enqueue so that at least one of the arrays in that cluster is empty.  There is a very simple function in the comparison palette called "Empty Array?".  Checking doesn't get much simpler than that.
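
Sketched again in Python with made-up names (this is just the shape of the data, not the LabVIEW API), the two-element cluster and the emptiness check would look something like this:

from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class EventBlock:                           # the "cluster with two elements"
    timestamps: List[datetime] = field(default_factory=list)
    descriptions: List[str] = field(default_factory=list)

def is_shutdown_marker(block):
    # equivalent of wiring one of the arrays into "Empty Array?"
    return len(block.timestamps) == 0

marker = EventBlock()                       # what the producer enqueues instead of calling Release Queue
assert is_shutdown_marker(marker)

normal = EventBlock(timestamps=[datetime.now()], descriptions=["valve opened"])
assert not is_shutdown_marker(normal)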


Message 4 of 15

Two separate TDMS writes, and two separate queues, correct?

 

I just tried to cluster the waveform data using the Bundle function and I can't wire it to the Enqueue Element block; I get an array mismatch error. I tried bundling all 5 DAQ waveform streams into a single output. What do you mean by a cluster with 2 elements? I think I am confused about what is getting clustered where, and what belongs in the producer loop versus the consumer. Right now the extraction happens in the consumer loop; is this wrong?

 

For the life of me I cannot figure out how to cluster all the waveform arrays into a single stream. What do I need to wire to the element data type input on the Obtain Queue for it to accept my newly bundled cluster? I keep getting data mismatches and I have tried every combination I can think of.

Message 5 of 15

I changed the VI based on what I understood from your recommendations, but it's still broken...

Message 6 of 15

You only want one queue.  The queue data type should be a cluster for everything you need logged.  In it should be an array of waveforms for ALL of your waveform data (use Build Array coming out of your DAQmx Reads) and an array of your string data.  So you only have 1 queue.

 

So on the consumer, you only have 1 Dequeue.  You just need to unbundle the waveforms and the strings and write those to the TDMS.  If either of those is empty, stop your consumer and release the queue.
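
In the same hedged Python shorthand (the log_file object and its methods are hypothetical stand-ins, not the TDMS API), the consumer side of that single-queue design looks roughly like this:

import queue

def consumer(q, log_file):
    # One Dequeue per iteration; unbundle, write, and stop on the empty marker.
    while True:
        waveforms, event_strings = q.get()      # a tuple plays the role of the cluster here
        if not waveforms or not event_strings:  # either array empty means the producer is done
            break
        log_file.write_waveforms(waveforms)     # stand-in for the waveform Write TDMS
        log_file.write_events(event_strings)    # stand-in for the string Write TDMS
    # only release/close the queue here, after everything has been written

# Producer side, for reference:
#   q.put((wf_chunk, ["event 3 started", ""]))  # normal data
#   q.put(([], []))                             # shutdown marker instead of Release Queue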


Message 7 of 15

@crossrulz wrote:

You only want one queue.  The queue data type should be a cluster for everything you need logged.  In it should be an array of waveforms for ALL of your waveform data (use Build Array coming out of your DAQmx Reads) and an array of your string data.  So you only have 1 queue.

 

So on the consumer, you only have 1 Dequeue.  You just need to unbundle the waveforms and the strings and write those to the TDMS.  If either of those is empty, stop your consumer and release the queue.


So I got rid of all the separate queues and am trying to group everything into one, but I can't get everything into the same cluster and wire it to the same Enqueue Element. I have the timestamp, the actual waveform data, and two strings. I wired the same cluster (using constants as the elements) to the element data type on the Obtain Queue, but it still won't let me wire to the Enqueue Element block. What am I misunderstanding here?

 

As for the consumer side, when you say unbundle, I tried that, but I cannot wire to the TDMS Write unless I rebuild the unbundled cluster back into an array, and even then it is not accepted as a polymorphic data type, or I get some other error.

Message 8 of 15

Attached are some pics of what I am trying.

Message 9 of 15
I followed all of your suggestions and am still not getting an event number for every data point. Why is my True/False case so much slower than my data?
Message 10 of 15