Try generating the data as a two-dimensional array. On each loop iteration, just add to the array. If you know the number of iterations ahead of time, you can initialize the array up front and simply replace elements, which spreads the memory cost evenly across iterations instead of growing the array each pass. Then you can save the data to file once, at the end.
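Here is a rough Python sketch of that preallocation idea (plain text code standing in for the LabVIEW block diagram; the sizes and the `acquire` function are illustrative, not your real 512x512 DAQ read):

```python
# Sketch of the preallocation approach: initialize the full array first,
# replace elements each iteration, save once after the loop.
N_ITERATIONS = 4
FRAME_SIZE = 3  # stands in for one row/frame of acquired data

def acquire(i):
    # Hypothetical stand-in for the DAQ read on iteration i
    return [i * 10 + j for j in range(FRAME_SIZE)]

# Initialize the array up front, then just replace elements
data = [[0] * FRAME_SIZE for _ in range(N_ITERATIONS)]
for i in range(N_ITERATIONS):
    data[i] = acquire(i)

# Save everything once, after the loop finishes
with open("run.tsv", "w") as f:
    for row in data:
        f.write("\t".join(str(x) for x in row) + "\n")
```

The key point is that no memory allocation happens inside the loop; the array already has its final size before the first iteration.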
Alternatively, if your loop iteration time allows, I would open a file before the loop, write to it on each iteration, then close it afterwards. This will save a lot of time and memory. You will need to open up and dissect "Write to Spreadsheet File" in order to keep the file open across loop iterations, since by default it opens and closes the file on every call. I really recommend this method. Again, this can only be done if you can write the 512x512 array in the time it takes your loop to iterate.
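In Python terms, the difference is simply hoisting the open/close out of the loop, which is what dissecting "Write to Spreadsheet File" achieves in LabVIEW (file name and frame contents here are made up for the sketch):

```python
# Open the file ONCE before the loop, stream one frame per iteration,
# and let it close after the loop ends.
rows_written = 0
with open("stream.tsv", "w") as f:   # open before the loop
    for i in range(3):
        frame = [i, i + 1, i + 2]    # stand-in for one acquired frame
        f.write("\t".join(str(x) for x in frame) + "\n")
        rows_written += 1
# the file closes here, after the loop, not once per iteration
```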
There are a couple of things you can do to make this possible if you have a fast loop time. One is to use pipelining. You have two parallel operations: one that acquires your data, and one that saves it. The current iteration's data is placed in a shift register; the data being saved is from the previous iteration. Since data acquisition runs in the background, you can run the saving operation at the same time. You MUST open the file before the loop starts, and close it after the loop ends. This will make your save operation much quicker, and you may find it keeps pace with your DAQ.
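The pipelining pattern can be sketched like this: a variable plays the role of the shift register, so iteration i saves the frame acquired on iteration i-1 (the `acquire` and `save` stand-ins are hypothetical, and real overlap would come from the DAQ running in hardware while the write executes):

```python
# Pipelining sketch: the "shift register" (prev) carries the previous
# iteration's frame, so saving frame i-1 overlaps acquiring frame i.
def acquire(i):
    return [i, i, i]          # hypothetical DAQ read

saved = []
def save(frame):
    saved.append(frame)       # stands in for the open-file write

prev = None                   # shift register: empty on the first pass
for i in range(4):
    current = acquire(i)      # acquire this iteration's data
    if prev is not None:
        save(prev)            # save the PREVIOUS iteration's data
    prev = current            # shift for the next iteration
save(prev)                    # flush the final frame after the loop
```

Note the flush after the loop: with pipelining, the last acquired frame has not been saved yet when the loop exits.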
However, if your DAQ operation is quicker than your loop, then I recommend using a parallel loop, with a circular buffer to hand off the data for saving. That way, at least most of the data is saved while you are still acquiring it. Pass the data into the parallel loop through a local variable, and use a circular buffer in your acquisition loop to ensure data is neither lost nor duplicated. LabVIEW ships with a number of circular-buffer examples.
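As a rough analogue of the two-loop arrangement, here is a Python sketch using a bounded `queue.Queue` as the hand-off buffer between an acquisition loop and a save loop. This is not LabVIEW's circular-buffer example, just the same producer/consumer idea in text form, with made-up frame data:

```python
# Producer/consumer sketch: one loop acquires, a parallel loop saves.
# A bounded queue stands in for the circular buffer between them.
import queue
import threading

buf = queue.Queue(maxsize=8)       # bounded buffer between the two loops
saved = []

def save_loop():
    # Parallel "save" loop: pulls frames until it sees the sentinel
    while True:
        frame = buf.get()          # blocks until data is available
        if frame is None:          # sentinel: acquisition finished
            break
        saved.append(frame)        # stands in for the file write

t = threading.Thread(target=save_loop)
t.start()

for i in range(5):                 # acquisition loop: produce frames
    buf.put([i] * 3)               # blocks if the buffer is full (no data lost)

buf.put(None)                      # tell the save loop to stop
t.join()
```

Because the queue is bounded and blocking, frames are neither dropped nor duplicated, which is the same guarantee you want from the circular buffer in the LabVIEW version.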
Good luck