
LabVIEW


A huge set of values to array

Solved!
Go to solution

Hey,

Hopefully someone can help me with my problem; I can't seem to find a proper solution. I have a very large .txt file with pixel values (7 digits each), one value per row, 170,000² rows in total. I want to create an array from these values; the dimensions have to be 170,000 x 170,000.

I want to export my array to Excel, as a table or as bitstream data, to further convert it to a picture or heatmap.

 

The problem is that the array overloads my RAM, and I can't seem to find a proper way to split up the storage, because the "Read from Text File" VI doesn't let me set an offset; it always starts reading at the beginning.

 

I hope I have specified my problem well enough. Thanks for any help.

0 Kudos
Message 1 of 6
(2,858 Views)

Short reply, long road ahead of you (I hope I am wrong and learn of an alternative).

 

You will have to toss "Read from Text File", since it will try to read the entire file all at once.

 

I ended up reading the data one record at a time and pushing it to queues, both for analysis and for writing to file.

 

Beware that the count input for reading from a text file is limited to an I32, so you will have to "roll your own" file I/O and use an I64 for the byte offset.
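Since LabVIEW code is graphical and can't be posted as text, here is a minimal sketch in Python of the idea Ben describes: reading a large file in chunks at explicit 64-bit byte offsets. Python's file objects accept arbitrarily large integer offsets, so positions past the 2 GiB I32 limit are not a problem; the chunk size and function names are just illustrative choices.

```python
CHUNK_BYTES = 64 * 1024 * 1024  # read 64 MiB per pass (tunable to your RAM)

def read_chunk(path, offset, size=CHUNK_BYTES):
    """Return up to `size` bytes starting at 64-bit byte `offset`."""
    with open(path, "rb") as f:
        f.seek(offset)          # offset may exceed 2**31 - 1
        return f.read(size)

def iter_chunks(path, size=CHUNK_BYTES):
    """Yield (offset, chunk) pairs covering the whole file, one chunk at a time."""
    offset = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(size)
            if not chunk:       # empty read means end of file
                break
            yield offset, chunk
            offset += len(chunk)
```

Only one chunk is ever held in memory, and each chunk could be pushed onto a queue for analysis or written back out, as described above.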

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation | LabVIEW Champion | Knight of NI and Prepper | LinkedIn Profile | YouTube Channel
Message 2 of 6
(2,850 Views)
Solution
Accepted by topic author nbo2058

Is that 170 thousand?

 

There are functions in the File I/O palette that let you set the read position within a file.

 

If you open your file, start reading from the beginning without closing it, and read in chunks, each read will pick up where the last one left off, and you don't have to worry about setting the file position at all.

 

But if you are trying to create a 170,000 by 170,000 element array, you will never get all of that in memory at once.  That is 28.9 billion elements, so even if each element is only a U8, you need nearly 29 GB.
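The arithmetic is easy to check (here in Python, since LabVIEW diagrams can't be posted as text):

```python
# Memory footprint of a 170,000 x 170,000 array at different element sizes.
side = 170_000
elements = side * side           # total number of elements
bytes_u8 = elements * 1          # 1 byte per U8 element
bytes_dbl = elements * 8         # 8 bytes per DBL element

print(f"{elements:,} elements")            # 28,900,000,000 elements
print(f"{bytes_u8 / 1e9:.1f} GB as U8")    # 28.9 GB as U8
print(f"{bytes_dbl / 1e9:.1f} GB as DBL")  # 231.2 GB as DBL
```

And if the values really need 7 decimal digits of range, a U8 won't hold them, so the realistic footprint is even larger than 29 GB.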

 

You are talking about sending data to Excel.  My experience is that Excel starts to have real problems with large datasets, particularly when you start graphing them (and a worksheet tops out at 1,048,576 rows by 16,384 columns in any case).  I would forget about Excel.

 

 

Message 3 of 6
(2,844 Views)

You should open the file before the loop and then read a portion at a time inside the loop.  Each read will then start where the previous one left off.  Close the file when you are done.
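That open-once, read-in-a-loop pattern can be sketched in Python (LabVIEW being graphical), reading a batch of lines per iteration and letting the file position carry over between reads; the batch size and function name are just illustrative choices.

```python
from itertools import islice

def batches_of_values(path, batch=1_000_000):
    """Yield lists of int pixel values, `batch` lines at a time."""
    with open(path, "r") as f:             # open once, before the loop
        while True:
            block = list(islice(f, batch)) # next `batch` lines; position carries over
            if not block:
                break                      # end of file
            yield [int(line) for line in block]
    # file is closed automatically when the with-block exits
```

Only one batch of values is in memory at a time; each batch can be processed and written out before the next read happens.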


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 4 of 6
(2,842 Views)

I think you do need to specify your problem more, or we're in danger of answering your technical question but not addressing what you actually need to do.  Where have these values come from?  Are they pixels that make up a single 170,000 x 170,000 image?  What resolution are they?  "7-digit" doesn't really mean anything by itself: can they be represented as 8-bit, or are they floating-point?  And what do you ultimately want to do with all this data?

 

Apart from those, I agree with the other comments that Excel will definitely not cope with this, and that you probably won't be able to create a single array of this dimension.

0 Kudos
Message 5 of 6
(2,759 Views)

How big is your text file actually?

If this really is a picture, even printed at 300 dpi it would cover a building. Who made that file?

Why Excel? That's probably the last thing that would come to mind here.

0 Kudos
Message 6 of 6
(2,756 Views)