
Problem with queue timing out?

Hi all,

 

I am using a queue to send a cluster between parallel while loops (producer-consumer); however, sometimes, though not always, the enqueue in the producer seems to "freeze", or maybe times out for no reason (no timeout value is wired to the enqueue or to any other function operating on that queue, so I assume it should never time out)...

 

I have checked, and there is data in the cluster being enqueued, but the dequeue in the consumer loop comes out empty... This does not happen on every execution, just occasionally... If I stop and save the VI, it works fine again until the next occurrence of the problem... If I wire 0 ms to the enqueue timeout, it also works properly again...
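For reference, the structure is roughly the following in text form (a minimal Python sketch of the same producer-consumer idea; the cluster contents and timing are invented, not copied from my VI):

```python
import queue
import threading
import time

# Unbounded queue, analogous to Obtain Queue with no size limit:
# the enqueue should only ever wait if the queue were bounded and full.
data_queue = queue.Queue()

def producer():
    """Builds a 'cluster' (here just a dict) and enqueues it each iteration."""
    for i in range(10):
        cluster = {"index": i, "samples": [i * 0.1] * 5}
        data_queue.put(cluster)          # no timeout wired
        time.sleep(0.1)                  # ~100 ms loop timing
    data_queue.put(None)                 # sentinel to stop the consumer

def consumer():
    """Dequeues clusters and processes them until the sentinel arrives."""
    while True:
        cluster = data_queue.get()       # blocks until an element is available
        if cluster is None:
            break
        print("consumed", cluster["index"], len(cluster["samples"]), "samples")

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
```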

 

Any suggestions?

 

Thanks,

Jack

Message 1 of 8
Is it possible to upload your code or a screenshot of the problem you are facing?
Thanks
uday
Message 2 of 8

Is it a single-element queue?

 

If it is, try using the Lossy Enqueue Element function along with Preview Queue Element.

This way your producer will function regardless of what the consumers are doing.

The consumers will always have something to work with, even if it is prior data.

 

Depending on your design this may help or it may make things worse, but it will certainly change the behaviour, and that might offer more insight into where the issue really is.
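In text form, the single-element lossy idea works out to something like the following (a minimal Python sketch standing in for a size-1 LabVIEW queue used with Lossy Enqueue Element and Preview Queue Element; the data and timing are made up):

```python
import collections
import threading
import time

# A one-slot deque: appending when full silently drops the oldest element,
# which is the behaviour of Lossy Enqueue Element on a single-element queue.
latest = collections.deque(maxlen=1)
lock = threading.Lock()

def producer():
    """Always succeeds; overwrites whatever the consumer has not picked up yet."""
    for i in range(20):
        with lock:
            latest.append({"index": i})   # lossy enqueue: never blocks
        time.sleep(0.05)

def consumer():
    """Looks at the current element without removing it, like Preview Queue Element."""
    for _ in range(10):
        with lock:
            value = latest[0] if latest else None   # peek, do not pop
        print("consumer sees", value)
        time.sleep(0.1)

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
```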

 

 

 

 

 

Message 3 of 8

Hi all,

 

 

My code is a mess, but there is only one enqueue in the producer... I will upload it ASAP.

 

It is not a single-element queue... I am not sure how to set up a single-element queue for the cluster... Any help is greatly appreciated.

 

Thanks.

Message 4 of 8

It's impossible to give any good suggestions without the code, mess or not. 🙂 (Thanks for the heads up.) Could it be that you're getting an error on the enqueue, and that causes an empty cluster at the dequeue? Could you be queuing up data too fast so the queue fills up? (Though that should give memory issues, not an empty dequeue.) Is the dequeue timing out?

 

/Y

G# - Award winning reference based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified-LabVIEW-Developer
Message 5 of 8

Hi Yamaeda,

 

Thanks for the input... Please see the code attached (a number of subVIs are missing, but these only help with reading the data file and doing basic analysis on the data after the dequeue)... In a nutshell, there are 3 parallel while loops; Loop 1: User Interface; Loop 2: Read, Convert, Plot; and Loop 3: Analyse and Write to File.

 

The enqueue with the problem is in loop 2 (inside the "Plot" state, then inside the "False" case)... The loop is a rough state machine.

 

There are no timeouts wired to the enqueue in Loop 2 or to the corresponding dequeue in Loop 3... Based on an indicator wired to its "timed out?" output, the dequeue reads a constant false when the problem occurs, so it is not timing out.

 

When the dequeue does not work, memory usage keeps increasing until a "Memory Full" error occurs and the VI aborts... This does not happen when it runs properly.
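If the consumer is not keeping up (or not dequeuing at all), an unbounded queue just keeps growing until memory runs out; a rough Python sketch of the effect (block size and rates are invented, not measured from my VI):

```python
import queue
import threading
import time

backlog = queue.Queue()          # unbounded, like a queue obtained with no size limit

def fast_producer():
    """Enqueues a large block every ~100 ms; with no size limit it never blocks."""
    for _ in range(50):
        block = [[0.0] * 7 for _ in range(10_000)]   # stand-in for a big subarray
        backlog.put(block)
        time.sleep(0.1)

def slow_consumer():
    """Takes longer per block than the producer, so the backlog only grows."""
    while True:
        backlog.get()
        time.sleep(0.3)                               # analysis + file write is slower
        print("backlog depth:", backlog.qsize())

threading.Thread(target=slow_consumer, daemon=True).start()
fast_producer()
```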

 

The data being queued can be quite large (up to 1M rows x 7 columns) and arrives quickly (basically, data in the top graph is min-max decimated based on graph pixel width, as described in the NI white paper on displaying large data sets; the data between the start and stop cursors is then sent as a subarray to the enqueue for full display on the bottom graph for closer analysis)... Perhaps the queue is filling? I did not know this was possible (I am using a 100 ms wait for loop timing).
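For what it's worth, the decimation step is essentially the following (a simplified Python sketch of min-max decimation by pixel bucket; the function is illustrative, not the actual subVI):

```python
import math

def min_max_decimate(samples, pixel_width):
    """Keep only the min and max of each horizontal pixel's worth of samples,
    so the plotted envelope looks the same as the full data set."""
    n = len(samples)
    if n <= 2 * pixel_width:
        return list(samples)
    bucket = max(1, n // pixel_width)
    decimated = []
    for start in range(0, n, bucket):
        chunk = samples[start:start + bucket]
        decimated.append(min(chunk))
        decimated.append(max(chunk))
    return decimated

# ~1M points squeezed into an envelope sized for an 800-pixel-wide graph.
wave = [math.sin(i / 500.0) for i in range(1_000_000)]
print(len(min_max_decimate(wave, 800)))   # roughly 1600 points
```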

 

Any suggestions greatly appreciated.

 

Thanks,

Jack

 

 

Message 6 of 8

Hi all,

 

OK... After spending another 3 hours on this problem scratching my head, I have managed to find the cause of this weird behaviour... The problem was with the inputs wired to the TDMS Read VI used to generate the subarray data to be queued (the range is calculated from the graph cursors that select the data of interest)... It looks like I was reading beyond the length of the file, although I cannot explain why the memory was increasing in this circumstance...?
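In text form, the fix boils down to clamping the cursor-derived start and count to what is actually in the file before reading; a minimal Python sketch of the idea (the helper name and data are made up):

```python
def read_rows(all_rows, start, count):
    """Return 'count' rows starting at 'start', clamped to the rows that actually exist."""
    length = len(all_rows)
    start = max(0, min(start, length))           # a cursor can sit past the end of the file
    count = max(0, min(count, length - start))   # never ask for more rows than remain
    return all_rows[start:start + count]

# Cursor-derived indices can run past the end of the file;
# unclamped, the read errors out (or silently returns nothing).
file_rows = [[float(i)] * 7 for i in range(1000)]
print(len(read_rows(file_rows, start=950, count=200)), "rows")   # -> 50 rows
```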

 

Such a simple fix... I'm back in love with LabVIEW... Until my next stall, that is...

 

Many thanks to all...

 

Regards,

Jack

Message 7 of 8

I'm pretty sure TDMS Read gives an error out when reading beyond the end of the file, an error you don't check. 🙂
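The habit is the same in any language: surface the read error instead of letting it vanish. A tiny Python sketch of the point (the function is hypothetical, just for illustration):

```python
def read_block(path, start, count):
    """Read a block of lines and surface any problem instead of swallowing it."""
    try:
        with open(path) as f:
            lines = f.readlines()
    except OSError as err:
        # Equivalent in spirit to wiring 'error out' onward instead of leaving it unwired.
        raise RuntimeError(f"read failed for {path}: {err}") from err
    if start >= len(lines):
        raise ValueError(f"start index {start} is past the end of {path}")
    return lines[start:start + count]
```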

/Y

G# - Award winning reference based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified-LabVIEW-Developer
Message 8 of 8