12-22-2016 06:34 AM
Hello everyone,
I thought I'd better start a new topic for this particular problem.
I have been told that there could be a memory leak in my VI. Displayed in this VI is the third loop of my producer/consumer code. This is the consumer loop, where I display my data and record them to a text file. The producer loop above it is timed at 500 ms, so this loop runs at roughly the same rate. Since I want to record one set of data every minute, I accumulate the data in an array for 120 iterations (~1 minute), average them, and append the result to another array of strings (my case structure).
Some people told me that stacking data in this string array indefinitely could lead to a memory leak. I would like to know whether there is a better way of doing what I am trying to do, and how I can avoid that problem.
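Since LabVIEW diagrams can't be pasted as text, here is a rough Python sketch of the accumulate-and-average pattern being described. All names, the 120-sample count, and the display-history cap are illustrative assumptions, not anything from the actual VI:

```python
from collections import deque

SAMPLES_PER_MINUTE = 120                      # ~1 minute at 500 ms per sample
DISPLAY_HISTORY = 1000                        # cap displayed rows to bound memory

buffer = []                                   # per-minute accumulation buffer
display_rows = deque(maxlen=DISPLAY_HISTORY)  # bounded stand-in for the string array

def consume(sample):
    """Handle one sample from the producer loop."""
    buffer.append(sample)
    if len(buffer) == SAMPLES_PER_MINUTE:
        avg = sum(buffer) / len(buffer)
        display_rows.append(f"{avg:.3f}")     # bounded history: no unbounded growth
        buffer.clear()                        # release the minute's samples
```

The key difference from an ever-growing string array is the `maxlen` bound: once the display buffer is full, the oldest row is dropped automatically, so memory use stays constant no matter how long the program runs.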
Sorry if this is confusing, I am just trying to explain as clearly as possible what this loop does 🙂
Thank you !
Flo
12-22-2016 08:32 AM
Flo,
You really need to find a LabVIEW Guru and apprentice yourself for a week or two. Failing that, at least purchase Peter Blume's "The LabVIEW Style Book" and read it, cover-to-cover, at least twice (I've read it at least 4 times).
Bob Schor
01-10-2017 04:29 AM
Hi Bob,
Sorry for the lack of reply, but it was winter break 🙂
So I've quickly read through your advice, and I know I need to drastically clean up my code. It will be done in time. While I clean it up, let me give you the full version of it.
I think I haven't been clear enough in my first post. So here is my concern again:
- the second loop produces the data every ~500 ms and sends it to the third (consumer) loop.
- the third loop receives that data every ~500 ms (paced by the second loop). The data are processed and put into an array whose values are averaged every minute (every 120 iterations). I then release the array's memory (the True case of my case structure in the third loop).
I feel like I am doing what you suggested, except that I do it in the third loop, not in the second one.
My concern is that I'd like to display those data in an array of strings, and a few people noted that displaying those data over time may end up eating all my RAM.
Any suggestions to avoid that? Is there a way to display those data without eating all my memory?
PS: the code hasn't been cleaned up yet; I will go through all of your suggestions and do my best to make it more readable!
Cheers,
Flo
01-10-2017 05:17 AM
@Flo-w wrote:
My concern is that I'd like to display those data in an array of strings, and a few people noted that displaying those data over time may end up eating all my RAM.
Any suggestions to avoid that? Is there a way to display those data without eating all my memory?
01-10-2017 10:17 AM
Yamaeda, did you just quote my post above?
01-10-2017 10:45 AM
Aha, now I see the problem/question. Your "Consumer loop" wants to do two things -- save the data to a file as they come in (i.e., "do one thing to each piece of data") and further process the data in one-minute segments. These are separate processes that operate on two different time bases (where have we seen this before?), and can efficiently be handled by (drum roll ...) a Producer/Consumer Loop!
Consider a "Data Averaging" queue. As the individual data points are dequeued by the Consumer, you first put them on a Data Averaging Queue (thus acting as a Producer) and also write them to a file (thus acting as a Consumer), formatting them for the file as you want.
In your fourth loop, the consumer side of Data Averaging, you accumulate one minute of data (120 samples at 500 ms), average it, do whatever you need to do with it, then clear the array. To prevent the problem of building the array up one element at a time, which potentially means allocating more and more memory, you can pre-initialize the array to 120 elements.
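The two-queue structure suggested above can be sketched in Python, with `queue.Queue` standing in for LabVIEW queues. The function names, queue names, and the list standing in for the text file are all illustrative, not LabVIEW API:

```python
import queue

data_q = queue.Queue()   # producer -> consumer (raw samples, ~500 ms apart)
avg_q = queue.Queue()    # consumer -> averaging loop

log_file = []            # stands in for the text file

def consumer_step():
    """One iteration of the 3rd loop: log each sample, then forward it."""
    sample = data_q.get()
    log_file.append(f"{sample:.3f}")   # "write to file" as data arrives
    avg_q.put(sample)                  # act as producer for the averaging loop

def averaging_loop(n=120):
    """The 4th loop: pre-size the buffer, fill it, average once it is full."""
    buf = [0.0] * n                    # pre-initialized: no per-element reallocation
    for i in range(n):
        buf[i] = avg_q.get()
    return sum(buf) / n
```

The point of the second queue is exactly the role-swap described above: the third loop consumes raw samples but simultaneously produces work for the averaging loop, so each loop keeps a single, simple time base.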
Does this make sense to you?
Bob Schor
01-10-2017 11:05 AM
Actually, I am not saving the data to file as they come in in that third loop; I am just storing them in a 2D array, then averaging those values when the count reaches 120.
So I still don't get why I would need a fourth loop?
Cheers,
Flo
01-11-2017 02:04 AM
Hmm, it seems my post was sent to the void ... I was basically saying: while true in principle, is it a real problem? You can send data to a Chart with a limited history length to limit the memory used. Even if you keep it all in memory, will it fill up this side of 2049? 🙂
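The "will it ever fill up?" point is easy to check with back-of-envelope arithmetic. Assuming one averaged row per minute and a generous ~64 bytes per stored string (both figures are assumptions for illustration):

```python
# Rough memory-growth estimate for one stored string per minute.
bytes_per_row = 64                  # assumed average size of one formatted row
rows_per_year = 60 * 24 * 365       # one row per minute, all year
mb_per_year = bytes_per_row * rows_per_year / 1e6   # ~33.6 MB per year
```

At roughly 34 MB per year, an unbounded per-minute string array would take decades to trouble a machine with gigabytes of RAM, which is why a bounded chart history is a convenience rather than a necessity here.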
/Y
01-11-2017 03:59 AM
2049/Y ??
01-11-2017 04:56 AM
From what I gathered you have a low sample rate, and with today's large memories it would take years to fill up. 🙂