02-20-2009 09:34 AM
Hi All,
I'm having a little problem with analysis of an extremely large data (.txt) file: 756 MB, ~65,000,000 data points.
I have already divided the different sections of the program into separate VIs to limit the virtual memory my computer needs (Samsung Q70, Vista 32-bit, Core(TM)2 Duo CPU T7300 @ 2.00 GHz, 2.00 GB RAM).
It's just a single analysis that I'm planning to do, to determine (using the Allan variance method) the different sources of noise.
Only problem is that LabVIEW (8.5) and Vista keep raising the error that there's not enough memory to complete the operation.
Now I'm wondering if there is a little tweak to temporarily increase the virtual memory in order to do this analysis once... I do have access to a university server which I'm allowed to use for this special occasion, but I'm wondering if it's needed. The VI is attached in its original form, along with a screenshot of the error in an empty VI, just to see if loading the file and plotting it in a graph is possible...
Thanks
02-20-2009 11:20 PM
Your code is hard to follow. There were so many subVIs, and none of them are executable on my system. Here are a few LabVIEW tips.
1. Don't use stacked sequence structures. They make wiring go in the wrong direction and hide code.
2. Using the Abort VI (stop sign) function is a bad idea. It is the same as using the abort button on the toolbar: it stops the code dead.
3. There is no reason to read from a control, and then immediately write to a local variable of that control.
Because I couldn't follow your code, I can't tell you exactly where your problem is. Are you reading the entire data file at once into one large array? If so, any time that array is copied in memory, you use another 8 × 65 million bytes (~520 MB). Do that just two or three times and you will run out of memory.
You will need to read and work with the data in smaller chunks. Read the article Managing Large Data Sets in LabVIEW.
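The chunking idea can be sketched in text form (in Python, since a LabVIEW diagram can't be shown here). This is only an illustrative analogy, not the LabVIEW implementation; the function name, file path, and chunk size are hypothetical. The point is that the whole ~65-million-point array is never held (or copied) in memory at once, only one chunk at a time.

```python
def running_mean_of_large_file(path, chunk_lines=1_000_000):
    """Compute a statistic over a huge text file one chunk at a time.

    Only `chunk_lines` values are ever resident in memory, instead of
    the full ~65 million doubles (~520 MB per copy).
    """
    total = 0.0
    count = 0
    chunk = []
    with open(path) as f:
        for line in f:
            chunk.append(float(line))
            if len(chunk) >= chunk_lines:
                total += sum(chunk)   # process the chunk...
                count += len(chunk)
                chunk = []            # ...then drop it before reading more
    if chunk:                         # process the final partial chunk
        total += sum(chunk)
        count += len(chunk)
    return total / count if count else 0.0
```

The same pattern applies in LabVIEW: read a bounded number of lines per loop iteration and accumulate the analysis result in a shift register, rather than building one giant array.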
02-21-2009 04:05 AM
02-21-2009 07:35 AM
Hi friends
Thank you for sharing this information with the members of this site.
Joseph
02-21-2009 08:06 AM
Many thanks Ravens Fan and GerdW!
I do understand your comments, and I totally agree with them on every aspect. It's just that even with a blank VI, just a Read Text File function and an indicator of any sort, I still get the same message about not enough memory to complete the operation.
Therefore, my question still remains:
Is there a tweak, tip, or cheat to fool my computer into thinking it has more memory, so I can complete some sort of analysis, or even just PLOT the txt file on a graph?
Am I able to read the txt file and do some sort of analysis on it in LabVIEW, or do I have to run the program on a server to increase the memory?
Again thank you
02-21-2009 08:13 AM
You might consider using some OpenG tools. They have a package called 'Large File', which might be useful in your case. I haven't worked with it, though.
Felix
02-21-2009 09:59 AM
Ravens Fan wrote: Your code is hard to follow. There were so many subVIs, and none of them are executable on my system. Here are a few LabVIEW tips.
1. Don't use stacked sequence structures. They make wiring go in the wrong direction and hide code.
2. Using the Abort VI (stop sign) function is a bad idea. It is the same as using the abort button on the toolbar: it stops the code dead.
3. There is no reason to read from a control, and then immediately write to a local variable of that control.
Because I couldn't follow your code, I can't tell you exactly where your problem is. Are you reading the entire data file at once into one large array? If so, any time that array is copied in memory, you use another 8 × 65 million bytes (~520 MB). Do that just two or three times and you will run out of memory.
You will need to read and work with the data in smaller chunks. Read the article Managing Large Data Sets in LabVIEW.
Ravens Fan,
You did tell him "exactly where your problem is."
Mr. Hond,
You can see the issue by doing two things.
1) Right-click on your stacked sequence structures and choose Replace >>> Replace With Flat Sequence.
2) Select Tools >>> Profile >>> Show Buffer Allocations. In the floating window, select "Arrays" and "Clusters" and click "Refresh". Dots will flash everywhere a new buffer is allocated.
You should do this for your file-writing subVI as well. You will see that a lot of buffers are being created. LV has to operate within the limitations of the OS, and that boils down to a memory footprint of less than 2 GB theoretical, about 1.2 GB practical. An additional limitation is that each buffer allocation must be contiguous, so memory fragmentation can impose a further limit.
How do you fix your code?
The game is called "chase the dots". The structure of your code must be modified to create fewer buffers. Search on "in-place" for tons of discussions of this aspect of LV, beyond the link Ravens Fan provided. If I were doing this myself*, I'd put together a custom Action Engine that performed all data-related tasks through an appropriate set of actions. The Action Engine would use shift registers for your data and its analysis, and I'd do my damnedest to make sure the SRs are the only buffers used in the whole app.
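As a rough text-language analogy of the Action Engine idea (sketched in Python/NumPy, since the real thing is a LabVIEW VI with a case structure and shift registers): one object owns the only large buffer, the way data held in a shift register persists between calls, and every "action" operates on that buffer in place instead of returning copies. The class and method names below are hypothetical.

```python
import numpy as np

class DataActionEngine:
    """Analogy for an Action Engine: all data lives in one preallocated
    buffer (the 'shift register'), and each method is one 'action'."""

    def __init__(self, n_points):
        # The single large allocation for the whole app.
        self._data = np.zeros(n_points)
        self._filled = 0

    def append_chunk(self, chunk):
        """Action: copy a freshly read chunk into the preallocated buffer."""
        n = len(chunk)
        self._data[self._filled:self._filled + n] = chunk
        self._filled += n

    def scale_in_place(self, factor):
        """Action: an analysis step that reuses the same buffer (no copy)."""
        self._data[:self._filled] *= factor

    def mean(self):
        """Action: reduce to a scalar without duplicating the array."""
        return float(self._data[:self._filled].mean())
```

Because every action mutates the one buffer in place, the peak memory footprint stays at a single copy of the data, which is the same goal "chasing the dots" pursues on the block diagram.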
Have fun!
Ben
*I have faced this challenge myself, and an Action Engine was the solution.
02-22-2009 10:11 AM
found it + fixed it = it's working!
Thanks for everyone who replied!
Many thanks,
Patrick
02-23-2009 02:59 PM
MrHond wrote:found it + fixed it = it's working!
Thanks for everyone who replied!
Many thanks,
Patrick
You are welcome, and thank you for the Kudos here and elsewhere.
Ben