From Friday, April 19th (11:00 PM CDT) through Saturday, April 20th (2:00 PM CDT), 2024, ni.com will undergo system upgrades that may result in temporary service interruption.
We appreciate your patience as we improve our online experience.
01-25-2017 08:15 AM
Hey,
Hopefully someone can help me with my problem; I can't seem to find a proper solution. I have a very big .txt file with pixel values (7 digits each), one value per row, 170,000² rows in total. I want to create an array from these values with dimensions 170,000 x 170,000.
I want to export my array to Excel as a table, or to bitstream data, to further convert it to a picture or heatmap.
The problem is that the array overloads my RAM, and I can't find a proper way to split the work, because the "Read from Text File" VI doesn't let me set an offset; it always starts reading at the beginning.
I hope I have specified my problem well enough. Thanks for any help.
Solved! Go to Solution.
01-25-2017 08:21 AM
Short reply, long road ahead of you (I hope I am wrong and someone can show me an alternative).
You will have to stop using a single "Read from Text File" call, since it will try to read the entire file all at once.
I ended up reading the data one record at a time and pushed it to queues for analysis and also to file.
Beware that the normal length for reading from a text file is limited to an I32, so you will have to "roll your own" file I/O and use an I64 for the byte offset.
Ben
01-25-2017 08:26 AM
Is that 170 thousand?
There are functions in the File I/O palette that let you set the read position within a file.
If you open your file, start reading from the beginning without closing it, and read it in chunks, each read will pick up where the last one left off, so you don't have to worry about setting the file position yourself.
But if you are trying to create a 170,000 by 170,000 element array, you will never get all of that in memory at once. That is 28.9 billion elements, so even if each element is only a U8, that is nearly 29 GB.
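The arithmetic is worth writing out, since it settles the memory question by itself:

```python
# Back-of-envelope memory check for a 170,000 x 170,000 array.
side = 170_000
elements = side * side        # total element count
bytes_u8 = elements * 1       # 1 byte per element as U8
bytes_dbl = elements * 8      # 8 bytes per element as DBL
print(elements)               # 28900000000  (28.9 billion)
print(bytes_u8 / 1e9)         # ~28.9 GB even as U8
print(bytes_dbl / 1e9)        # ~231.2 GB as doubles
```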
You are talking about sending data to Excel. My experience is that Excel really starts to have problems with large datasets, particularly when you start graphing them, and a worksheet tops out at 1,048,576 rows by 16,384 columns anyway, far short of 170,000 columns. I would forget about Excel.
01-25-2017 08:27 AM
You should open the file before the loop and then read parts at a time inside of a loop. Then each read will start where the previous left off. Close the file when you are done.
01-25-2017 06:13 PM
I think you do need to specify your problem more, or we're in danger of answering your technical question but not addressing what you actually need to do. Where have these values come from? Are they pixels that make up a single 170000x170000 image? What resolution are they? 7-digit doesn't really mean anything - can they be represented as 8-bit or are they floating-point? And what is the end result that you want to do with all this data?
Apart from those, I agree with the other comments that Excel will definitely not cope with this, and that you probably won't be able to create a single array of this dimension.
01-25-2017 06:58 PM
How big is your text file actually?
If this is really a picture, even printed at 300 dpi it would cover a building. Who made that file?
Why Excel? That's probably the last thing that would come to mind here.