
Expanding LabVIEW memory


Hi All,

 

I'm having a little problem with the analysis of an extremely large data (.txt) file: 756 MB, ~65,000,000 data points.

 

I have already divided the different sections of the program into separate VIs to limit the virtual memory needed on my computer (Samsung Q70, Vista 32-bit, Core(TM)2 Duo T7300 CPU @ 2.00 GHz, 2.00 GB RAM).

 

It's just a single analysis that I'm planning to do, using the Allan variance method, to determine the different sources of noise.
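(For reference, the non-overlapping Allan variance I'm after is just half the mean squared difference of consecutive block averages; a minimal NumPy sketch of the definition, with an illustrative function name, assuming the whole array fits in memory, which is exactly what fails here:)

    import numpy as np

    def allan_variance(y, m):
        """Non-overlapping Allan variance for block size m (tau = m * t0):
        half the mean squared difference of consecutive m-sample averages."""
        n = (len(y) // m) * m                            # drop the ragged tail
        block_means = y[:n].reshape(-1, m).mean(axis=1)  # consecutive m-sample averages
        d = np.diff(block_means)
        return 0.5 * np.mean(d * d)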

 

The only problem is that LabVIEW (8.5) and Vista keep reporting the error that there is not enough memory to complete the operation.

 

Now I'm wondering if there is a little tweak to temporarily increase the virtual memory in order to do this analysis once... I do have access to a university server which I'm allowed to use for this special occasion, but I'm wondering if it's needed. The VI is attached in its original form, along with a screenshot of the error in an empty VI, just to see if loading the file and plotting it in a graph is possible...

 

Thanks 

 

Message 1 of 9

Your code is hard to follow. There are so many subVIs, and none of them are executable on my system. Here are a few LabVIEW tips.

 

1. Don't use stacked sequence structures. They make wiring go in the wrong direction and hide code.

2. Using the Abort VI (stop sign) function is a bad idea. It is the same as using the Abort button on the toolbar: it stops the code dead.

3.  There is no reason to read from a control, and then immediately write to a local variable of that control.

 

Because I couldn't follow your code, I can't tell you exactly where your problem is. Are you reading the entire data file at once into one large array? If so, any time that array is copied in memory, you are going to use another 8 x 65 million bytes (roughly 520 MB) of memory. Do that just 2 or 3 times and you are going to run out of memory.

 

You will need to read and work with the data in smaller chunks. Read this article: Managing Large Data Sets in LabVIEW
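LabVIEW code is graphical so I can't paste a block diagram here, but just to show the shape of the idea, here is a rough Python-style sketch of what "work in smaller chunks" means. The file name, chunk size, and one-value-per-line format are assumptions, not taken from your VI:

    from itertools import islice

    # Each full copy of the array costs ~8 bytes x 65 million points ~ 520 MB,
    # so the goal is to never hold more than one small chunk at a time.
    CHUNK = 1_000_000                 # points per pass; ~8 MB as doubles
    total = 0.0
    count = 0
    with open("data.txt") as f:       # hypothetical file name
        while True:
            block = list(islice(f, CHUNK))         # read the next CHUNK lines
            if not block:
                break
            values = [float(line) for line in block if line.strip()]
            total += sum(values)                   # stand-in for the real analysis step
            count += len(values)
    print("points:", count, "mean:", total / count)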

Message 2 of 9

Hi MrHond,

 

in addition to Ravens Fan's comments, I would advise not to save vi.lib VIs into an LLB!

It's quite annoying to have other versions of standard functions in memory when opening such an LLB!

Best regards,
GerdW


Message 3 of 9

Hi friends

 

Thank you for sharing this information with the members of this site.

 

Joseph

 

Message 4 of 9

Many thanks Ravens Fan and GerdW!

 

I do understand your comments, and I totally agree with them on every aspect. It's just that even with a blank VI containing only a read-text-file function and an indicator of any sort, I still get the same message about not enough memory to complete the operation.

 

Therefore, my question still remains:

 

Is there a tweak, tip, or cheat to fool my computer into thinking it has more memory, so that I can complete some sort of analysis, or even just PLOT the txt file on a graph?

Am I able to read the txt file and do some sort of analysis on it in LabVIEW, or do I have to run the program on a server to get more memory?

 

Again thank you

Message 5 of 9

You might consider using some of the OpenG tools. They have a package called 'Large File', which might be useful in your case. I haven't worked with it, though.

 

Felix

Message 6 of 9
Solution accepted by Paddy1985

Ravens Fan wrote:

Your code is hard to follow. There are so many subVIs, and none of them are executable on my system. Here are a few LabVIEW tips.

 

1. Don't use stacked sequence structures. They make wiring go in the wrong direction and hide code.

2. Using the Abort VI (stop sign) function is a bad idea. It is the same as using the Abort button on the toolbar: it stops the code dead.

3. There is no reason to read from a control, and then immediately write to a local variable of that control.

 

Because I couldn't follow your code, I can't tell you exactly where your problem is. Are you reading the entire data file at once into one large array? If so, any time that array is copied in memory, you are going to use another 8 x 65 million bytes (roughly 520 MB) of memory. Do that just 2 or 3 times and you are going to run out of memory.

 

You will need to read and work with the data in smaller chunks. Read this article: Managing Large Data Sets in LabVIEW


Ravens Fan,

 

You did tell "exactly where your problem is." ;-)

 

Mr. Hond,

 

You can see the issue by doing two things.

 

1) Right-click on your stacked sequence structures and choose Replace >>> Replace with Flat Sequence.

 

2) Select Tools >>> Profile >>> Show Buffer Allocations. In the floating window, select "Arrays" and "Clusters" and click "Refresh". Dots will flash everywhere a buffer is allocated.

 

You should do this for your file-writing subVI as well. You will see that a lot of buffers are being created. LV has to operate within the limitations of the OS, and that boils down to a memory footprint of less than 2 GB theoretical, about 1.2 GB practical. An additional constraint is that every buffer allocation must be contiguous, so memory fragmentation can impose a further limitation.

 

How do you fix your code?

 

The game is called "chase the dots". The structure of your code must be modified to create fewer buffers. Search on "in-place" for tons of discussions on this aspect of LV beyond the link Ravens Fan provided. If I were doing this myself*, I'd put together a custom Action Engine that performed all data-related tasks through an appropriate set of actions. The Action Engine would use shift registers for your data and its analysis, and I'd do my damnedest to make sure the SRs are the only buffers used in the whole app.
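An Action Engine is graphical (a while loop whose shift registers hold the state, with a case structure selecting the action), so take this only as a rough text analog of the idea, sketched in Python with invented names, not as the actual LV implementation:

    # Rough textual analog of an Action Engine / functional global:
    # the attributes play the role of the shift registers (the only copy of
    # the accumulated state), and each method is one "action".
    class AllanEngine:
        def __init__(self, bin_size):
            self.bin_size = bin_size
            self.carry = []          # samples left over from the previous chunk
            self.prev_avg = None     # last completed block average
            self.sum_sq = 0.0        # running sum of (avg_k+1 - avg_k)^2
            self.pairs = 0

        def add_chunk(self, samples):
            """Action: fold one chunk of raw samples into the running Allan sum."""
            data = self.carry + list(samples)
            n_full = len(data) // self.bin_size
            for k in range(n_full):
                block = data[k * self.bin_size:(k + 1) * self.bin_size]
                avg = sum(block) / self.bin_size
                if self.prev_avg is not None:
                    self.sum_sq += (avg - self.prev_avg) ** 2
                    self.pairs += 1
                self.prev_avg = avg
            self.carry = data[n_full * self.bin_size:]   # keep the remainder

        def allan_variance(self):
            """Action: report the Allan variance from the block pairs seen so far."""
            return self.sum_sq / (2 * self.pairs) if self.pairs else float("nan")

Feed the chunks from the file reader into the "add chunk" action and only one chunk plus a few scalars are ever alive at once, instead of the whole 756 MB array.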

 

Have fun!

 

Ben

 

*I have faced this challenge myself and an Action Engine was the solution. :-)

Message 7 of 9

found it + fixed it = it's working!

 

Thanks for everyone who replied!

 

Many thanks,

 

Patrick

Message 8 of 9

MrHond wrote:

found it + fixed it = it's working!

 

Thanks for everyone who replied!

 

Many thanks,

 

Patrick


 

You are welcome, and thank you for the Kudos here and elsewhere.

 

Ben

Message 9 of 9