How can I save a large 3d array efficiently?

Hi,

I've been having problems saving 512x512xN arrays. 512x512 arrays are generated within a loop and are indexed as they pass out of a loop structure.

For the past few weeks, I've been saving the 3D array by running it through the Reshape Array function so that I can save it as a 2D array of (512xN)x512. Sure enough, after the loop finishes execution, the Reshape Array function eats up memory beyond about N=100 and takes about 30 minutes to finish; I'm running out of available memory too, despite a generous increase in virtual memory.

I've attempted to save the data in smaller chunks within the loop, but the data-generating parts must run at a consistent, fast speed at all costs, so that didn't seem like the solution.

Does anybody know of a better way to save the array?
Message 1 of 5
I don't know whether you've got any experience with more conventional languages, but it might be worth using something like C++ to perform the array manipulation and store the data in the form you want.

You could either construct a CIN to do this directly, or have the LabVIEW program dump the data in a raw format (say, sequential 512x512 slices) and then use a post-processing program to do the "reshaping".
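Since LabVIEW is graphical, here is a hedged Python sketch of the dump-then-post-process idea; the dimensions are toy stand-ins for 512x512xN, and the file name is made up. The writer emits each slice sequentially in raw binary, and a separate pass reassembles the (512*N)x512 layout while holding only one slice in memory at a time:

```python
import array
import os
import tempfile

ROWS, COLS, N = 4, 4, 3  # toy stand-ins for 512, 512, N

# Step 1: "acquisition side" - dump each ROWSxCOLS slice sequentially as
# raw doubles, with no reshaping and no extra copies.
path = os.path.join(tempfile.mkdtemp(), "slices.raw")
with open(path, "wb") as f:
    for k in range(N):
        slice_k = array.array("d", (k * 100 + i for i in range(ROWS * COLS)))
        slice_k.tofile(f)

# Step 2: post-processing - read the slices back one at a time and emit
# them in the (ROWS*N) x COLS layout the original poster wanted.
stacked = []
with open(path, "rb") as f:
    for k in range(N):
        slice_k = array.array("d")
        slice_k.fromfile(f, ROWS * COLS)
        for r in range(ROWS):
            stacked.append(list(slice_k[r * COLS:(r + 1) * COLS]))

print(len(stacked), len(stacked[0]))  # (ROWS*N) rows of COLS values each
```

Because the post-processor never needs the whole 3D array in memory, the peak footprint stays at one slice regardless of N.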

Without knowing what sort of system you're using, it's difficult to say whether 30 minutes is unreasonable for the task of moving ~2^25 values around. It seems fairly reasonable to assume you should be able to do it without running out of memory if you can deal with the values in their existing form, though. Another issue is what type of array you're dealing with: strings and other complex types will use a LOT of memory in arrays of that size.

It seems likely that you don't actually need to change the shape of the array: by using a conventional language to output the data in the order you need, you can bypass the duplication of the data entirely.

Hope this helps,

Adam
Message 2 of 5
rbarrett writes:

> Does anybody know of a better way to save the array?

Do all array points contain data? If not, you could use a 1D array of clusters for each array dimension; e.g., a 2D array would become an array of clusters of arrays. That way you don't store all the zeros that auto-indexing pads in, and you use only the memory you really need. The overhead should be only a little more than a plain array.
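The cluster-of-arrays idea corresponds to a ragged (jagged) structure. A hedged Python sketch, with made-up toy data, showing the memory saving over a padded rectangular array:

```python
# Ragged structure analogous to a LabVIEW "array of clusters of arrays":
# each outer element carries only the points it really has, so short rows
# are not zero-padded out to the longest row as auto-indexing would do.
ragged = [
    [1.0, 2.0, 3.0],  # row with 3 points
    [4.0],            # row with 1 point
    [5.0, 6.0],       # row with 2 points
]

# A rectangular array must allocate rows * longest_row cells; the ragged
# form allocates only the cells that actually exist.
longest_row = max(len(row) for row in ragged)
rectangular_cells = len(ragged) * longest_row
ragged_cells = sum(len(row) for row in ragged)
print(rectangular_cells, ragged_cells)  # 9 6
```

The saving grows with how sparse the data is; for mostly-full rows the per-row overhead can outweigh it.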

HTH,

Johannes Niess
Message 3 of 5
Do you have to save the data in a specific format for a non-LabVIEW program to read? If not, then you might try using the "Write File" VI straight up. This VI accepts data in any format (you just have to tell "Read File" the same format to recover it). You might have to try this to see if it is any faster. I have been using this for saving 1D and 2D arrays in the same file.
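The key point of the Write File approach is that reader and writer agree on the binary layout. A hedged Python round-trip sketch of that contract (the record format is an assumption, not anything from the original post):

```python
import io
import struct

# The writer picks a binary layout; the reader must use the exact same
# layout to recover the data - the same contract as Write File/Read File.
FMT = "<3d"  # three little-endian doubles per record (assumed layout)

buf = io.BytesIO()
records = [(1.0, 2.0, 3.0), (4.0, 5.0, 6.0)]
for rec in records:
    buf.write(struct.pack(FMT, *rec))

# Read side: unpack with the identical format string.
buf.seek(0)
record_size = struct.calcsize(FMT)
recovered = [struct.unpack(FMT, buf.read(record_size)) for _ in records]
print(recovered == records)  # True only because both sides used FMT
```

Raw binary like this is also much faster and more compact than formatting the numbers as spreadsheet text.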

Rob
Message 4 of 5
Try generating the data as a two-dimensional array. For each iteration of the loop, just add to the array. If you know the number of iterations ahead of time, you can initialize an array to begin with and just replace array elements. This spreads the memory requirement across the iterations. Then you can save the data to file at the end.
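Initialize-then-replace avoids the repeated reallocation that growing an array causes. A hedged Python sketch with toy dimensions standing in for 512x512xN:

```python
# Preallocate the full buffer once, then replace slices in place each
# iteration instead of appending (which forces reallocation and copying).
N, SLICE = 3, 4  # toy stand-ins for N frames of 512*512 values

data = [0.0] * (N * SLICE)          # initialized once, up front
for i in range(N):
    new_slice = [float(i)] * SLICE  # stand-in for one acquired frame
    data[i * SLICE:(i + 1) * SLICE] = new_slice  # replace, don't append

print(data[:SLICE], data[-SLICE:])
```

In LabVIEW terms this is Initialize Array before the loop plus Replace Array Subset inside it, carried on a shift register.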

Alternatively, if your loop iteration time allows, I would open a file, write to it in each iteration, then close it afterwards. This will save a lot of time and memory. You will need to open up and dissect the "Write to Spreadsheet File" VI in order to keep the file open across loop iterations. I really recommend this method. Again, this only works if you can write the 512x512 array in the time it takes your loop to iterate.
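The open-once/write-per-iteration/close-once pattern can be sketched in Python as follows (file name and toy data are made up; each written row stands in for one 512x512 frame):

```python
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "frames.txt")
N = 3  # toy iteration count

f = open(path, "w")      # open BEFORE the loop - once
for i in range(N):
    row = "\t".join(str(i * 10 + c) for c in range(4))
    f.write(row + "\n")  # only the write happens inside the loop
f.close()                # close AFTER the loop - once

with open(path) as g:
    lines = g.read().splitlines()
print(len(lines))  # 3
```

Opening and closing inside the loop is what the stock "Write to Spreadsheet File" does on every call, and that per-iteration overhead is exactly what this avoids.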

There are a couple of things you can do to make this possible if you have a fast loop time. One is to use pipelining. You have two parallel operations: One that acquires your data, and one that saves data. The current data is placed in a shift register. The data that is being saved is from the previous iteration. Since data acquisition runs in the background, you can run the saving operation at the same time. You MUST open the file before the loop starts, and close it after the loop ends. This will make your save operation much quicker, and you may find it works just as fast as your DAQ.
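The shift-register pipelining described above can be sketched sequentially in Python (a hedged stand-in: real LabVIEW dataflow would run the two branches in parallel; `acquire` and the frame contents are made up):

```python
# Pipelining via a "shift register": on iteration i we save the frame
# acquired on iteration i-1, so acquisition and saving overlap.
def acquire(i):
    return [float(i)] * 4  # stand-in for one 512x512 acquisition

saved = []
previous = None            # the shift register's initial value
for i in range(3):
    current = acquire(i)   # "DAQ" branch: this iteration's frame
    if previous is not None:
        saved.append(previous)  # "save" branch: last iteration's frame
    previous = current     # value carried to the next iteration
saved.append(previous)     # flush the final frame after the loop

print(len(saved))  # 3
```

Note the post-loop flush: with a one-stage pipeline, the last frame is still in the shift register when the loop exits.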

However, if your DAQ operation is quicker than your loop, then I recommend using a parallel loop with a circular buffer to save the data. This way, at least most of the data is saved while you are acquiring it. Pass the data into the parallel loop through a local variable, and use a circular buffer in your acquisition loop to ensure data is not lost or duplicated. LabVIEW ships with a number of examples of circular buffers.
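The parallel-loop idea can be sketched in Python with a bounded queue standing in for the circular buffer (a hedged sketch; thread structure, sentinel, and frame contents are my own stand-ins, not LabVIEW's mechanism):

```python
import queue
import threading

# Acquisition thread pushes frames into a bounded queue (our stand-in for
# a circular buffer) while the main "save loop" drains it concurrently.
buf = queue.Queue(maxsize=8)
N = 5
saved = []

def acquire_loop():
    for i in range(N):
        buf.put([float(i)] * 4)  # blocks when full, so no frame is lost
    buf.put(None)                # sentinel: acquisition finished

t = threading.Thread(target=acquire_loop)
t.start()
while True:
    frame = buf.get()            # each frame is consumed exactly once
    if frame is None:
        break
    saved.append(frame)
t.join()

print(len(saved))  # 5
```

The bounded queue gives the same two guarantees the post asks of a circular buffer: the producer cannot overrun the consumer, and each frame is handed over exactly once.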

Good luck
Message 5 of 5