12-08-2017 10:59 AM
@dhaugen92 wrote:
I've tried instead, inserting the dump from the subVI into the already established array:
This is completely pointless and requires the exact same new memory allocation as the original code.
12-08-2017 11:01 AM
@altenbach wrote:
@dhaugen92 wrote:
I've tried instead, inserting the dump from the subVI into the already established array:
This is completely pointless and requires the exact same new memory allocation as the original code.
even if the array only has 3000 values but is initialized to a length of 10000 and I'm only inserting 1000 values?
12-08-2017 11:07 AM
Yes, show us more code.
Old joke time.
Guy 1 finds Guy 2 crawling around on the living room floor and asks, "What are you doing?"
Guy 2 says, "Looking for my contact."
After a while Guy 1 asks, "Where were you when you lost it?"
Guy 2 answers, "In the basement."
Guy 1 asks, "Why are we looking in the living room?"
Guy 2 answers, "The light is better."
Moral:
We will never find something if we are looking in the wrong place.
Show us more code, please.
Ben
12-08-2017 11:08 AM
I'm 98% sure the error is being generated from this section of the code.
The rest of the code is proprietary.
12-08-2017 11:19 AM - edited 12-08-2017 11:20 AM
An "out of memory" error happening about every 4 hours screams memory leak to me.
Not sure if this will help, but I've attached a resource monitor VI that I've used in the past. You could easily modify it to log available memory to disk over the 4 hours and see if it is creeping up at all.
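The logging idea behind that resource-monitor VI can be sketched in a few lines of Python. This is only a loose analogy to the attached VI (it samples the test process's own peak resident size via the Unix-only `resource` module, rather than system-wide available memory, and the filename is made up for the example):

```python
import resource
import time

def log_memory(path, samples, interval_s=0.01):
    """Periodically append memory samples to a log file, so a slow
    creep over several hours becomes visible in the log.
    Unix-only; on Linux ru_maxrss is reported in kilobytes."""
    with open(path, "w") as f:
        for _ in range(samples):
            rss_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
            f.write(f"{time.time():.3f}\t{rss_kb}\n")
            time.sleep(interval_s)

# Take a handful of quick samples as a demonstration.
log_memory("memlog.tsv", samples=5)
print(open("memlog.tsv").read())
```

In the real scenario you would sample every few seconds for the full 4 hours and then plot the column of numbers to see whether usage ramps.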
12-08-2017 11:22 AM
@dhaugen92 wrote:
I'm 98% sure the error is being generated from this section of the code.
The rest of the code is proprietary.
Then create a VI with just that code, put it in an unthrottled loop, and build it. That test should show the leak.
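Ben's suggestion translates to text-based languages as well: isolate the suspect code, hammer it in an unthrottled loop, and watch memory across iterations. A minimal Python sketch of that test harness, using the standard-library `tracemalloc` module (`suspect_section` here is a hypothetical stand-in that grows an array, not the poster's actual code):

```python
import tracemalloc

def suspect_section(buffer):
    # Hypothetical stand-in for the suspect code: growing an
    # array instead of replacing values in place.
    return buffer + [0.0] * 1000

# Run the isolated section in a tight loop and sample traced memory.
tracemalloc.start()
buffer = []
samples = []
for _ in range(100):
    buffer = suspect_section(buffer)
    current, peak = tracemalloc.get_traced_memory()
    samples.append(current)
tracemalloc.stop()

# A leak shows up as usage climbing steadily across iterations.
print("usage climbing:", samples[-1] > samples[0])
```

A stable section of code would plateau after the first few iterations; a leaky one climbs for as long as the loop runs.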
Ben
12-08-2017 11:35 AM - edited 12-08-2017 11:37 AM
@dhaugen92 wrote:
@altenbach wrote:
@dhaugen92 wrote:
I've tried instead, inserting the dump from the subVI into the already established array:
This is completely pointless and requires the exact same new memory allocation as the original code.
even if the array only has 3000 values but is initialized to a length of 10000 and I'm only inserting 1000 values?
"insert into array" grows an existing array and thus most often requires a new memory allocation for the entire thing. (Don't confuse it with "replace array subset" where the array size remains constant)
Do you get the same problem if you just write the file and don't zip it up.
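For readers more at home in text-based languages, the distinction Christian is drawing can be sketched with plain Python lists. It's a loose analogy (LabVIEW arrays are contiguous data while Python lists hold references), but the allocation behavior is comparable:

```python
# Preallocate a fixed-size buffer, as with Initialize Array in LabVIEW.
buffer = [0.0] * 10000
buf_id = id(buffer)

new_values = [1.0] * 1000

# "Replace Array Subset" analogue: overwrite a slice in place.
buffer[3000:4000] = new_values
assert id(buffer) == buf_id   # same object, no new top-level allocation
assert len(buffer) == 10000   # size did not grow

# "Insert Into Array" analogue: the result is a larger array, which
# generally means a fresh allocation plus a full copy of everything.
grown = buffer[:3000] + new_values + buffer[3000:]
assert id(grown) != buf_id
assert len(grown) == 11000    # array grew by 1000 elements
```

In LabVIEW terms: replacing a subset of a preallocated array reuses the existing buffer, while inserting has to produce a larger array every cycle, so run in a loop it allocates and copies continuously.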
12-08-2017 12:10 PM
@altenbach wrote:
@dhaugen92 wrote:
@altenbach wrote:
@dhaugen92 wrote:
I've tried instead, inserting the dump from the subVI into the already established array:
This is completely pointless and requires the exact same new memory allocation as the original code.
even if the array only has 3000 values but is initialized to a length of 10000 and I'm only inserting 1000 values?
"insert into array" grows an existing array and thus most often requires a new memory allocation for the entire thing. (Don't confuse it with "replace array subset" where the array size remains constant)
Do you get the same problem if you just write the file and don't zip it up.
oooh. I DID confuse it with 'replace array subset'...
Let me switch that and see what happens.
Haven't tried not zipping it... interesting proposal. If the first thing doesn't work, I'll try that too.
12-08-2017 12:11 PM
@dhaugen92 wrote: if the array only has 3000 values but is initialized to a length of 10000 and I'm only inserting 1000 values?
Also don't forget that you have a 2D array, so we need to know the sizes in both dimensions. If the dimensions of the two arrays being merged are poorly matched, there could be a lot of padding and a much larger output than you might expect.
12-08-2017 01:28 PM
I second what altenbach just said about confirming both dimensions of the two 2D arrays. If one of them is 2x50000 and the other one is 50000x2, LabVIEW will *believe* you and try to make a 50000x50000 array to hold the result, padding it with lots and lots (and LOTS and lots) of zeros.
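Kevin's 2x50000 vs. 50000x2 scenario is easy to quantify with back-of-the-envelope arithmetic. A Python sketch (the merge rule here mirrors LabVIEW's Build Array behavior: rows are concatenated and shorter rows are zero-padded out to the widest column count; the specific sizes are the hypothetical ones from the post):

```python
# Sizes of the two 2D arrays being merged.
a_rows, a_cols = 2, 50000
b_rows, b_cols = 50000, 2

# Appending along the first dimension: the result must hold every
# row of both inputs, padded out to the wider column count.
out_rows = a_rows + b_rows        # 50002
out_cols = max(a_cols, b_cols)    # 50000

total = out_rows * out_cols
useful = a_rows * a_cols + b_rows * b_cols
padding = total - useful

print(f"result: {out_rows} x {out_cols} = {total:,} elements")
print(f"padding zeros: {padding:,} ({100 * padding / total:.2f}% of the array)")
```

At 8 bytes per DBL element, that 50002 x 50000 result is roughly 20 GB, which comfortably explains an out-of-memory error even though only 200,000 of the elements carry real data.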
I had a similar memory error from reuse code back in the early days of DAQmx. In the old driver, rows represented one sample from all channels and columns represented all samples from one channel. DAQmx transposed that meaning, and some of my formerly proven-out reuse code suddenly started to generate these kinds of out-of-memory errors.
-Kevin P