
Is it safe to hold 20,000 × 100 samples in a queue?

In my project I need to log critical data: if a fault occurs, I need to save the data from 10 seconds before and 10 seconds after the fault, then keep collecting data and hold it for the next fault. For this purpose I created a queue that holds 20,000 1D arrays, with the oldest data overwritten after 20 seconds. So I am always holding 20,000 × 100 samples in my queue. Will that create any memory issues? If yes, is there any other way to do it?


The best solution is the one you find it by yourself
Message 1 of 5

20,000 × 100 = 2,000,000 samples. If they're 32-bit integers, that's 8 MB of memory. You should be fine. 🙂
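In plain numbers (a back-of-the-envelope sketch; 4 bytes per I32 and 8 bytes per DBL are the standard LabVIEW numeric sizes):

```python
chunks, chunk_len = 20_000, 100
samples = chunks * chunk_len      # 2,000,000 samples held at any time

mb_i32 = samples * 4 / 1e6        # 4 bytes per 32-bit integer -> 8 MB
mb_dbl = samples * 8 / 1e6        # 8 bytes per DBL (float64)  -> 16 MB
```

Either size is negligible on a modern PC.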



G# - Award winning reference based OOP for LV, for free! ADDQ VIPM Now on GitHub
"Only dead fish swim downstream" - "My life for Kudos!" - "Dumb people repeat old mistakes - smart ones create new ones."
Message 2 of 5

What you should look at is preallocating the memory; see the Obtain Queue help.

Then have a look at the Lossy Enqueue Element.
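LabVIEW's Lossy Enqueue Element drops the oldest element once a fixed-size queue is full. As a rough Python analogue (not LabVIEW code), `collections.deque` with `maxlen` behaves the same way:

```python
from collections import deque

q = deque(maxlen=3)    # fixed-size queue, like Obtain Queue with max size 3
for x in [1, 2, 3, 4]:
    q.append(x)        # like Lossy Enqueue: oldest element dropped when full

# element 1 was silently discarded to make room for 4
result = list(q)       # [2, 3, 4]
```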


However, from what I understand of your task, you need a ring buffer and want to store the data in case of an event. What will happen if two events arrive within 10 s? Streaming 100 kSamples/s to disk?

Maybe a classical ring buffer (a chunk of memory and some pointers) with in-place structures is more flexible.

And your buffer needs to be longer, since you have to store the 10 s + X s of fault data while still acquiring new data.
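To make the two-events question concrete, here is a hedged Python sketch (the rate of 1,000 chunks/s follows from the 20,000-chunks-per-20-s figure in the question; `capture_fault` and `is_fault` are made-up names): it freezes a snapshot of the 10 s history at the fault and keeps acquiring, and a second fault inside the 10 s post window is simply ignored, matching the "only the first fault counts" policy.

```python
from collections import deque

CHUNKS_PER_SEC = 1000          # 20,000 chunks / 20 s, from the question
PRE = 10 * CHUNKS_PER_SEC      # chunks kept before the fault
POST = 10 * CHUNKS_PER_SEC     # chunks collected after the fault

def capture_fault(chunk_source, is_fault):
    """Yield (pre, post) chunk lists around each fault.

    chunk_source -- iterable of data chunks (e.g. 100-sample arrays)
    is_fault     -- predicate deciding whether a chunk marks a fault
    """
    ring = deque(maxlen=PRE)   # lossy ring buffer of recent history
    post = None                # list collecting post-fault chunks
    for chunk in chunk_source:
        if post is not None:
            # inside the post-fault window; further faults are ignored
            post.append(chunk)
            if len(post) == POST:
                yield pre_snapshot, post
                post = None
        elif is_fault(chunk):
            pre_snapshot = list(ring)  # freeze the 10 s history
            post = []
        # the fault chunk itself lands in neither snapshot here;
        # history keeps rolling so acquisition never stops
        ring.append(chunk)
```

Snapshotting (`list(ring)`) is what buys the extra length Henrik mentions: the frozen copy holds the old 10 s while the ring keeps filling with new data.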



Greetings from Germany

LV since v3.1

“ground” is a convenient fantasy

'˙˙˙˙uıɐƃɐ lɐıp puɐ °06 ǝuoɥd ɹnoʎ uɹnʇ ǝsɐǝld 'ʎɹɐuıƃɐɯı sı pǝlɐıp ǝʌɐɥ noʎ ɹǝqɯnu ǝɥʇ'

Message 3 of 5

Thanks Yamaeda, I am using DBL data here (so 16 MB rather than 8 MB).




Yes, I am using the Lossy Enqueue Element. We don't allow for two occurrences of an error here: once one error has occurred, my state is Fault, and whatever fault happens after that will not be considered (only logged) until I clear the first fault.


Before I proceed, I will take a look at the ring buffer you mentioned.




Message 4 of 5

I agree. This amount of data should not be a problem.


When using large queues, I suggest allocating the needed memory up front by filling the queue with fake data and then flushing it as soon as the queue is created.

Otherwise the memory must be allocated in multiple steps the first time you fill it (because LabVIEW doesn't know how much memory you will need).


How you handle the data once you dequeue it can also affect your program performance.

Here are some tips: Large Data Sets



Help the forum when you get help. Click the "Solution?" icon on the reply that answers your
question. Give "Kudos" to replies that help.
Message 5 of 5