09-19-2010 08:16 AM
Greetings,
I have a somewhat similar problem, but I don't believe it can be solved in this manner. I am trying to concatenate a bunch of very large SDF files (essentially text files). I can do this via read from text file, concatenate strings, and save. However, this only works for about 8 files, since each is over 120 MB and my system runs out of memory. I have about 1000 of these to do. I would like to do them all at once; I could settle for 100 at a time, but much less than that defeats the purpose.
Can someone look at my code and give me some pointers? My thought has been that I need to somehow pass the end of the target file to the concatenate function without loading it all into memory. I have run out of ideas.
Many thanks.
09-20-2010 01:11 AM
Read your files in a loop and append their content to the existing target file on disk. No need to concatenate in memory.
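To make the idea concrete outside of a block diagram, here is a minimal sketch of the same streaming-append pattern in Python (not the poster's LabVIEW code): open the target file once, then copy each source file into it in fixed-size chunks so memory use stays bounded no matter how many files you merge. The file pattern, output name, and 1 MiB chunk size are illustrative assumptions.

```python
# Illustrative sketch only: stream each source SDF file into one target file
# in fixed-size chunks, so memory use stays bounded regardless of file count.
# The glob pattern, output name, and chunk size are assumptions for the example.
import glob

CHUNK_SIZE = 1024 * 1024  # read/write 1 MiB at a time


def append_files(source_pattern: str, target_path: str) -> None:
    with open(target_path, "wb") as target:
        for source_path in sorted(glob.glob(source_pattern)):
            with open(source_path, "rb") as source:
                while True:
                    chunk = source.read(CHUNK_SIZE)
                    if not chunk:  # end of this source file
                        break
                    target.write(chunk)


if __name__ == "__main__":
    # Concatenate every .sdf file in the working directory into merged.sdf.
    append_files("*.sdf", "merged.sdf")
```

In LabVIEW the equivalent would typically be a loop that keeps the target file refnum open, reads each source file (in chunks if needed), and writes the data to the end of the target, rather than building one giant string in memory.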
09-20-2010 01:43 AM - edited 09-20-2010 01:44 AM
Here is a quick draft of what I had in mind (LabVIEW 2010).
See if it works for you.
(Your code was a bit convoluted, with hidden indicators serving as local variables; none of those are needed. Also, execution order is determined by data dependency, so there is no need for a sequence structure. I also don't understand the need for the big case structure. Are you using continuous run?)