LabVIEW


Memory for large string arrays

What kind of memory requirements does LabVIEW have for large string arrays?

 

I'm having trouble just initializing my 5000x4000 array. I get an out-of-memory error. Is there a more efficient data type I should look into using?

0 Kudos
Message 1 of 9
(3,830 Views)

@markog wrote:

What kind of memory requirements does LabVIEW have for large string arrays?


The same as every other programming language.

 


I'm having trouble just initializing my 5000x4000 array. I get an out-of-memory error. Is there a more efficient data type I should look into using?

For an array you need a contiguous block of storage. Thus, it makes no difference if you see 2GB of free memory - if it's fragmented, then LabVIEW (or any other programming language) will not be able to allocate the required memory. A 5000x4000 array of strings is a LOT of memory. Even assuming you had only one character in each string, that's almost 20MB just for the characters, before counting the per-string handles and length fields.
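To put rough numbers on that, here is a back-of-the-envelope sketch in Python. The per-string overhead figures are assumptions for illustration (a 4-byte handle plus a 4-byte length field per string, roughly 32-bit-era figures), not documented LabVIEW internals:

```python
# Rough lower-bound estimate for a 5000 x 4000 array of 1-character strings.
ROWS, COLS = 5000, 4000
HANDLE_BYTES = 4      # pointer stored in the array itself (assumption)
LENGTH_BYTES = 4      # length prefix of each string block (assumption)
CHAR_BYTES = 1        # one character per string, best case

elements = ROWS * COLS                       # 20,000,000 individual strings
chars_only = elements * CHAR_BYTES           # the characters alone
with_overhead = elements * (HANDLE_BYTES + LENGTH_BYTES + CHAR_BYTES)

print(f"{chars_only / 2**20:.1f} MB characters only")       # ~19 MB
print(f"{with_overhead / 2**20:.1f} MB with per-string overhead")
```

Even under these optimistic assumptions the overhead dwarfs the payload, and all of it has to be one contiguous allocation for the array of handles alone.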

 

As for good alternatives, well, you'd need to provide more details as to what you're doing with this data before a good suggestion can be provided. I could, for example, tell you to use a queue, but without knowing what you actually need, that may be the wrong suggestion.

Message 2 of 9
(3,823 Views)

Basically I'm taking a 3x4000 (this number will go up) array of DBLs, replacing the first few lines with descriptions and tokens, then stacking those end to end until I get two 5000x4000 arrays as output. Then I'm saving that to an Excel file.

 

If I do make it through the program, I usually run out of memory at the Report Generation Toolkit. If I manage to make it past the conversion to variant, it usually crashes with "out of memory" before the save.

0 Kudos
Message 3 of 9
(3,812 Views)

One suggestion: whenever you read the data, write it straight to a .bin file. Keep that file as temporary storage, so only the block you are currently reading lives in memory. Once you have collected all the data you need, read it back from the file and write it to an Excel sheet through the Report Generation Toolkit. Since you are writing to a file, you won't fill up your virtual memory, so the memory problem should not arise.
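In textual pseudocode, the spooling idea looks something like this (sketched in Python with made-up names; the LabVIEW equivalent would be the Write to Binary File / Read from Binary File functions):

```python
import os
import struct
import tempfile

# Spool each small block of DBLs to a temporary binary file as it is
# acquired, so only one block is ever held in memory at a time.
ROWS_PER_BLOCK, COLS = 3, 5   # tiny COLS just to keep the demo fast

path = os.path.join(tempfile.mkdtemp(), "spool.bin")

def write_block(f, block):
    """Append one block of doubles to the spool file."""
    for row in block:
        f.write(struct.pack(f"<{COLS}d", *row))

def read_blocks(path, n_blocks):
    """Read the spooled blocks back one at a time for report generation."""
    with open(path, "rb") as f:
        for _ in range(n_blocks):
            yield [list(struct.unpack(f"<{COLS}d", f.read(8 * COLS)))
                   for _ in range(ROWS_PER_BLOCK)]

with open(path, "wb") as f:
    for i in range(4):                            # four acquisitions
        block = [[float(i)] * COLS for _ in range(ROWS_PER_BLOCK)]
        write_block(f, block)                     # block can now be reused

recovered = list(read_blocks(path, 4))
print(len(recovered), recovered[2][0][0])         # 4 2.0
```

The key point is that the writer never holds more than one block, and the reader can also consume the file block-by-block instead of loading it whole.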

-----

The best solution is the one you find by yourself
0 Kudos
Message 4 of 9
(3,808 Views)

Right now I'm using producer -> consumer style VIs, where the data is made in one VI then wired to another; the idea was that LabVIEW would recover the memory in the producer subVI. But I still crash out at the string-to-variant conversion. I don't see how an intermediate .bin file would help there, unless LabVIEW reads directly from disk and does not load it all into RAM.

 

Note I get the same error when writing to TDM files.

0 Kudos
Message 5 of 9
(3,803 Views)

I don't think the error happens while writing to a file; it more likely happens when you try to hold a large amount of data in memory, and if at any point you duplicate it (use of a local or global variable), it becomes tough to handle.

 


@markog wrote:

 But I still crash out at the string to variant conversion.


What is the specific use of String to Variant in your application?

-----

The best solution is the one you find by yourself
0 Kudos
Message 6 of 9
(3,800 Views)

@markog wrote:

Basically I'm taking a 3x4000 (this number will go up) array of DBLs, replacing the first few lines with descriptions and tokens, then stacking those end to end until I get two 5000x4000 arrays as output. Then I'm saving that to an Excel file.


You need to better explain what you are doing, or just show us some code. I don't understand what you mean by replacing the first few lines with descriptions. Lines of what? You said you have an array of DBLs. Are you converting this to an array of strings? If so, how? Also, why are you "stacking" them up? Why can't you write the chunks as you generate them? Your method is simply untenable in terms of memory requirements.

Message 7 of 9
(3,782 Views)

"Basically I'm taking a 3x4000 (this number will go up) array of DBLs, replacing the first few lines with descriptions and tokens, then stacking those end to end until I get two 5000x4000 arrays as output. Then I'm saving that to an Excel file."

 

Since you are starting with smaller arrays, can you simply write to the Excel file when you get to, say, 500x4000? Then you can reuse the array for the next collection. This should reduce the memory requirements for LabVIEW, since you only have 1/10th of the data loaded in memory at a time. Of course, this assumes you are OK with delaying your data collection long enough to append data to the Excel file.

 

While I know you are looking for a software solution, another option is to increase the memory of your computer. With more memory available, you are more likely to have enough for your large arrays. We recently had to do this for one of our experiments, which involves over 5 billion data points, and the memory increase was more than enough to fix our problems.

0 Kudos
Message 8 of 9
(3,781 Views)

@pjr1121 wrote:

Since you are starting with smaller arrays, can you simply write to the Excel file when you get to, say, 500x4000? Then you can reuse the array for the next collection. This should reduce the memory requirements for LabVIEW, since you only have 1/10th of the data loaded in memory at a time. Of course, this assumes you are OK with delaying your data collection long enough to append data to the Excel file.


This means I won't be able to use the Report Generation Toolkit, but it should get the job done. This makes sense; I'll try to save incrementally. I'll update after I give this a shot.

0 Kudos
Message 9 of 9
(3,770 Views)