05-12-2009 02:19 PM
Hi there, I'm having some memory issues with this little program.
What I'm trying to do is read a 215 MB .csv file (roughly 6 million lines), extract the x-y values as 1D arrays, and display them in 2 XY graphs (VI attached).
I've noticed that this process eats from 1.6 to 2 GB of RAM, and the 2 XY graphs, as soon as they are loaded (2 minutes more or less), are really slow to move with the scroll bar.
My question is: is there a way to use less memory and make the graphs scroll more smoothly?
Thanks in advance,
Ierman Gert
05-12-2009 02:21 PM
05-12-2009 02:52 PM
Thanks for the reply Matt.
I've read somewhere that splitting the array while loading it can reduce the memory usage, but I have no idea how to implement it.
05-12-2009 03:02 PM - edited 05-12-2009 03:02 PM
Hi Ierman,
How many data points do you need to handle? How many do you display on the graphs?
Some notes:
- Each graph has its own data buffer, so all data wired to a graph is buffered again in memory: wiring a (big) 1D array to a graph makes another copy of it. And you mentioned 2 graphs...
- Load the array in parts: read a block of lines, parse it to arrays as before (maybe using "Spreadsheet String To Array"?), and finally append the parts to build the big array (although the one big array may still cause memory problems) - see the sketch after this list.
- Avoid data copies when handling big arrays. You can show buffer creation via Tools->Advanced->Show Buffer Allocations.
- Use SGL instead of DBL when possible (4 bytes per value instead of 8)...
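LabVIEW code is graphical, so a VI can't really be pasted here, but the "load in parts" idea above looks roughly like this when sketched in Python. The file name, the two-column comma-separated layout with no header, and the 100,000-line chunk size are all assumptions for illustration:

import numpy as np

CHUNK_LINES = 100_000              # lines to parse per pass (tune to taste)
xs, ys = [], []                    # parsed chunks collected here

with open("data.csv") as f:        # hypothetical file name, no header assumed
    while True:
        lines = [f.readline() for _ in range(CHUNK_LINES)]
        lines = [ln for ln in lines if ln.strip()]   # drop EOF / blank lines
        if not lines:
            break
        # parse only this block; float32 is the SGL-sized type
        block = np.loadtxt(lines, delimiter=",", dtype=np.float32, ndmin=2)
        xs.append(block[:, 0])
        ys.append(block[:, 1])

x = np.concatenate(xs)             # single concatenation at the end instead of
y = np.concatenate(ys)             # growing an array inside the loop

The point is that only one block of raw text is held and parsed at a time, and the final arrays are built in one step at the end rather than grown inside the loop.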
05-12-2009 03:09 PM
Do you really want to try to show 6 million or so data points on a graph at once? You can only show so much due to the limited number of on-screen pixels. Initialize an array with a much more reasonable number of points, like a couple thousand. Read the data in from the file in portions and do a Replace Array Subset on only one line out of every so many (say one in a thousand). Now you'll have a much smaller array you can pass to the graphs.
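To make that concrete, here is the same decimation idea sketched in Python rather than G. The 1-in-1000 stride, the file name, the simple comma split, and the buffer names are illustrative assumptions, not anything from the attached VI:

STRIDE = 1000                          # keep one point out of every 1000 lines
DISPLAY_POINTS = 6_000_000 // STRIDE   # ~6000 points is plenty for a graph

import numpy as np

# pre-allocate the display buffers (the "initialize array" step)
x_disp = np.zeros(DISPLAY_POINTS, dtype=np.float32)
y_disp = np.zeros(DISPLAY_POINTS, dtype=np.float32)

i = 0
with open("data.csv") as f:            # hypothetical file name
    for line_no, line in enumerate(f):
        if line_no % STRIDE:
            continue                   # skip all but every 1000th line
        if i >= DISPLAY_POINTS:
            break                      # display buffers are full
        x_str, y_str = line.split(",")[:2]
        # the "replace array subset" step: overwrite in place, never grow
        x_disp[i] = float(x_str)
        y_disp[i] = float(y_str)
        i += 1

# trim in case the file was shorter than expected, then wire to the graphs
x_disp, y_disp = x_disp[:i], y_disp[:i]

In the VI this maps to Initialize Array for the display buffers, reading the file in portions, and Replace Array Subset for each kept point, so the graphs only ever see a few thousand values.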