
Maximum memory LabVIEW can use

What is the maximum memory that LabVIEW can use? When I run my code it crashes when LabVIEW's memory usage is around 1 GB. I have 3 GB of RAM.

 

I'm handling IMAQ images of 32 MB each. I append one image on top of another; after appending about 4 images, LabVIEW crashes.

 

Is there any limit on the maximum size of contiguous memory that LabVIEW can use?

 

 

Thanks 

Message 1 of 13

Have a look at this knowledge base article: http://digital.ni.com/public.nsf/allkb/AC9AD7E5FD3769C086256B41007685FA

 

And this help article on managing large data sets: Memory Management for Large Data Sets - NI LabVIEW 8.6 Help

Troy - CLD "If a hammer is the only tool you have, everything starts to look like a nail." ~ Maslow/Kaplan - Law of the instrument
Message 2 of 13

The most likely reason for your application to fail is memory fragmentation. Due to the internal memory management of LabVIEW and the operating system (Windows, I assume), there is always some fragmentation of memory within the LabVIEW process. This leads to the situation that overall you can probably go up to 3 GB on a 32-bit OS, but you cannot allocate a single 2 GB array!

The largest array you can allocate depends on system resources. Using the /3GB boot option, you can allocate arrays larger than 1 GB. But only one!
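
To get a feel for that limit, here is a minimal C sketch (not LabVIEW code; the number it prints depends entirely on the current state of your system) that binary-searches for the largest contiguous block malloc() will currently grant the process:

#include <stdio.h>
#include <stdlib.h>

/* Binary-search for the largest contiguous block malloc() can return,
   between 0 and 2 GB, to a resolution of 1 MB. */
int main(void)
{
    size_t mib = 1024 * 1024;
    size_t lo = 0, hi = (size_t)2048 * mib;
    while (hi - lo > mib) {
        size_t mid = lo + (hi - lo) / 2;
        void *p = malloc(mid);
        if (p) { free(p); lo = mid; }   /* mid bytes fit: search higher */
        else   { hi = mid; }            /* allocation failed: search lower */
    }
    printf("largest contiguous block: about %lu MB\n", (unsigned long)(lo / mib));
    return 0;
}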

Since you are working with images (which are essentially arrays as well), you have to reduce the number of images you are concurrently working on (I am counting each copy of an image as an individual image here!). The link Troy provided is a good place to start.

 

Considering your post, the appended image needs about 128 MB plus possible additional overhead. So I assume that your primary issue is either the number of images/copies, or that 32 MB is not the true size of the image once it is in RAM...

 

One additional question from my side: what do you mean by "LV crashes"? Does it display an error? Does the OS report an "out of memory" situation?

 

Hope this helps,

Norbert 

Norbert
----------------------------------------------------------------------------------------------------
CEO: What exactly is stopping us from doing this?
Expert: Geometry
Marketing Manager: Just ignore it.
Message 3 of 13

Norbert B wrote:

The most likely reason for your application to fail is memory fragmentation. [...] overall you can probably go up to 3 GB on a 32-bit OS, but you cannot allocate a single 2 GB array!

[...] What do you mean by "LV crashes"? Does it display an error? Does the OS report an "out of memory" situation?

Hi Norbert,
I'm not able to follow your point in the highlighted part. By "LV crashes" I mean that LabVIEW displays an error saying there is not enough memory.
Thanks

 

Message 4 of 13

Fragmented memory offers only chunks of memory for arrays, e.g. 400 MB, 200 MB, 100 MB, 100 MB, 80 MB, 60 MB, 60 MB, ...

If you allocate an array, the chances are high that it will be placed in the biggest chunk first. This of course reduces the size of that chunk. So basically the memory you use can add up to the total amount available to the LabVIEW.exe process, but an allocation fails if you try to allocate an array bigger than the largest currently available chunk.
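
To make the numbers concrete, a toy C illustration (chunk sizes made up, matching the list above): the free chunks total 1000 MB, yet a single 500 MB array cannot be placed:

#include <stdio.h>

int main(void)
{
    int chunks[] = {400, 200, 100, 100, 80, 60, 60};  /* free chunks in MB */
    int n = sizeof chunks / sizeof chunks[0];
    int total = 0, largest = 0;
    for (int i = 0; i < n; i++) {
        total += chunks[i];
        if (chunks[i] > largest) largest = chunks[i];
    }
    int request = 500;  /* MB wanted as one contiguous array */
    printf("total free: %d MB, largest chunk: %d MB\n", total, largest);
    printf("a %d MB array %s\n", request,
           request <= largest ? "fits" : "fails, despite enough total memory");
    return 0;
}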

 

Norbert 

Norbert
----------------------------------------------------------------------------------------------------
CEO: What exactly is stopping us from doing this?
Expert: Geometry
Marketing Manager: Just ignore it.
Message 5 of 13

Are the images you're referring to compressed or RAW? Where do you get the 32 MB figure from?

 

Have you tried profiling the memory usage of the VI to see where all the memory is being allocated?

Put in a bunch of breakpoints and refresh the memory profile display after each breakpoint is reached to narrow down where all the memory is going.

Troy - CLD "If a hammer is the only tool you have, everything starts to look like a nail." ~ Maslow/Kaplan - Law of the instrument
Message 6 of 13

I am running into the same issue. What bothers me is that LabVIEW is the one utilizing the memory and doing the management. My situation: I am running at about 1.3 GB of memory used. I have 4 GB in the machine, so there is plenty of physical memory available. I am doing an imaging application and trying to allocate around a 100 MB image buffer. Now, if LabVIEW really requests contiguous memory for its images, then my memory shouldn't be that fragmented, I would think, since most of the memory being utilized was from other image buffers.

 

I have tried playing with boot switches, but no luck. Is anyone aware of a utility that will show me my memory fragmentation?

 

John

Message 7 of 13

Hi John,

 

The fact that you have 4 GB installed in the machine actually has nothing to do with your out-of-memory problem. LabVIEW tries to allocate memory in the virtual address space of the process, which is only 2 GB (3 GB with the correct boot switch) on 32-bit Windows. That address space is fragmented, so you do not have enough contiguous free space. (Lots of RAM helps with multitasking, not necessarily with a single program.)

 

You can use a tool from Microsoft (Sysinternals) called VMMap. It can be downloaded for free and shows the virtual address space of a process, as well as the sizes of the free chunks.
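
If you want the same information programmatically, here is a minimal Win32 C sketch that walks the address space with VirtualQuery (note it inspects its own process, so it would have to run inside the process you care about, e.g. from a DLL loaded into it):

#include <windows.h>
#include <stdio.h>

/* Walk the virtual address space and report the total and largest free
   regions -- essentially what VMMap shows graphically. */
int main(void)
{
    MEMORY_BASIC_INFORMATION mbi;
    unsigned char *addr = NULL;
    SIZE_T totalFree = 0, largestFree = 0;

    while (VirtualQuery(addr, &mbi, sizeof mbi) == sizeof mbi) {
        if (mbi.State == MEM_FREE) {
            totalFree += mbi.RegionSize;
            if (mbi.RegionSize > largestFree)
                largestFree = mbi.RegionSize;
        }
        addr = (unsigned char *)mbi.BaseAddress + mbi.RegionSize;
    }
    printf("free: %lu MB total, largest region: %lu MB\n",
           (unsigned long)(totalFree >> 20),
           (unsigned long)(largestFree >> 20));
    return 0;
}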

 

You can also use a library I created for LV2009 that automatically creates several smaller arrays (to avoid fragmentation issues) and presents them to the user of the API as a single very large array. You can find it here:

 

Fragmented Array Library

http://decibel.ni.com/content/docs/DOC-9321
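
The library itself is LabVIEW code; for readers curious about the idea behind it, here is the general chunked-array technique sketched in C (the names and chunk size are illustrative, not the library's actual API; error handling omitted for brevity):

#include <stdio.h>
#include <stdlib.h>

#define CHUNK_ELEMS (4u * 1024 * 1024)   /* 4M doubles = 32 MB per chunk */

/* One large logical array stored as many small chunks, so no single
   allocation is ever bigger than one chunk. */
typedef struct {
    double **chunks;
    size_t   nchunks;
} BigArray;

BigArray *big_alloc(size_t nelems)
{
    BigArray *a = malloc(sizeof *a);
    a->nchunks  = (nelems + CHUNK_ELEMS - 1) / CHUNK_ELEMS;
    a->chunks   = malloc(a->nchunks * sizeof *a->chunks);
    for (size_t i = 0; i < a->nchunks; i++)
        a->chunks[i] = calloc(CHUNK_ELEMS, sizeof(double));
    return a;
}

/* Translate a large logical index into (chunk, offset). */
void big_set(BigArray *a, size_t i, double v)
{
    a->chunks[i / CHUNK_ELEMS][i % CHUNK_ELEMS] = v;
}

double big_get(const BigArray *a, size_t i)
{
    return a->chunks[i / CHUNK_ELEMS][i % CHUNK_ELEMS];
}

int main(void)
{
    BigArray *a = big_alloc(100000000);  /* ~800 MB logical, in 32 MB pieces */
    big_set(a, 99999999, 3.14);
    printf("%f\n", big_get(a, 99999999));
    return 0;
}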

 

Hope this helps.

Message 8 of 13

Hello,

 

I stumbled on this thread while looking into similar problems. I have to read 3 images of type Int16 with a size of over 70 megapixels each. How can I do this? It manages to make 2 reads, but on the third it says that it doesn't have enough memory available and simply doesn't read the image.

 

The /3GB virtual memory switch does not help in this case. Total LabVIEW memory usage is at 730 MB and each read adds about 280 MB, so it doesn't feel impossible to me...
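
(For scale, assuming the images are uncompressed: 70 million pixels at 2 bytes per Int16 pixel is roughly 140 MB raw, so ~280 MB per read suggests LabVIEW is holding at least one extra copy of each image, e.g. the reader's buffer plus the stored result.)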

 

Does anyone have experience with images this large?

 

Thanks

Message 9 of 13

Some tricks I have found to help (sorry for stating the obvious):

 

1.  Run MSConfig.exe and only load services and startup programs that need to be there. The amount of bloat on a machine is amazing!

2.  Compile the application and run it as an executable.

3.  Minimize the amount of "other" memory in the application.

4.  Be aware of what the IMAQ functions are doing. I have found that sometimes they make copies when you don't expect it, or use "temporary" buffers.

5.  Try to pre-allocate all buffers. I have solved many an issue that way. For example, if you need to append images, allocate a single image at the maximum append size, then use Image2Image to insert into this buffer. You can use the ROI tools/inputs to work on only the portion of the image you need. The drawback of pre-allocating is that it makes your code less flexible, because you cannot dynamically allocate as needed. (See the sketch after this list.)
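
Sketching point 5 in C (IMAQ is a LabVIEW API, so the buffer names and sizes here are hypothetical): allocate the full append target once, and "appending" becomes a copy into the right slot, so memory never has to grow or move:

#include <stdlib.h>
#include <string.h>

enum { IMG_W = 4096, IMG_H = 4096, MAX_FRAMES = 4 };  /* 32 MB per 16-bit frame */

int main(void)
{
    /* One allocation at the maximum append size, up front. */
    unsigned short *stack = malloc((size_t)IMG_W * IMG_H * MAX_FRAMES * sizeof *stack);
    unsigned short *frame = malloc((size_t)IMG_W * IMG_H * sizeof *frame);
    if (!stack || !frame) return 1;  /* out of memory */

    for (int i = 0; i < MAX_FRAMES; i++) {
        /* ... acquire the next image into frame ... */
        memcpy(stack + (size_t)i * IMG_W * IMG_H, frame,
               (size_t)IMG_W * IMG_H * sizeof *frame);  /* insert at slot i */
    }
    free(frame);
    free(stack);
    return 0;
}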

 

The only true solution is to go to x64.  I am working on a project where we need to process a 92GB image.  I will update this post when we have a chance to try that.

Message 10 of 13