Labview crashes due to memory issue

Hi All,

 

My LabVIEW application queries around 8,000,000 records from a database, and I am converting them into a 2D array of strings. When I try to process the 2D array, RAM usage climbs to 2.8 GB and LabVIEW crashes, saying "Not enough memory to handle the operation."

 

Can you please suggest how I can debug this issue, and any tool that would help me analyze memory allocation and deallocation at run time? (Also suggest a static tool for estimating the memory footprint, if one exists.)

 

Please suggest any guidelines for handling this much data, and also how to use array functions efficiently for performance and memory usage.

 

Do we have any third-party tool for memory analysis, if NI doesn't have one?

 

Thanks,

Anand

 

 

 

Message 1 of 8

I assume you are using 32-bit LabVIEW.

 

How long are the strings? What kind of "processing"? What are the two dimensions? Why can't you process each string immediately or in smaller chunks? What's the final data you want in memory and why does it all need to be in memory?

Message 2 of 8

Hi Altenbach,

 

You are right. I am using 32-bit LabVIEW.

 

The length of each string is around 15-20 characters (array dimensions 8,000,000 × 6). The processing basically retrieves a particular column, modifies it based on some condition, and then replaces it (and similarly for the other columns and rows).
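For scale, a rough back-of-envelope sketch of that array's footprint (LabVIEW is graphical, so this is plain Python arithmetic; the ~16 bytes of per-string overhead is an assumption for illustration, not a documented LabVIEW figure):

```python
# Rough estimate of the 2D string array's memory footprint.
rows, cols = 8_000_000, 6
avg_chars = 18            # strings are ~15-20 characters
overhead = 16             # assumed per-string overhead (length header,
                          # handle/pointer, allocator padding) -- a guess
per_element = avg_chars + overhead
total_bytes = rows * cols * per_element
print(f"~{total_bytes / 2**30:.1f} GiB")   # roughly 1.5 GiB for the data alone
```

Even at ~1.5 GiB, a single extra copy of the array (which LabVIEW readily makes when a wire branches or an array function runs in place of nothing) exceeds the ~2 GB address space of a 32-bit process.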

 

What's the final data you want in memory and why does it all need to be in memory?

How can I achieve this? How do I bring a small chunk of data into memory while the rest stays on disk, then move the processed data to disk and pull unprocessed data into RAM? (Can you explain in terms of how to code it?)

 

Regards,

Anand

Message 3 of 8

Maybe a 50 ms or 10 ms time delay needs to be put in the VI.

/mctnnn

Message 4 of 8

@mctnnn wrote:

Maybe a 50 ms or 10 ms time delay needs to be put in the VI.


That makes no sense. Why would a time delay change the total memory usage?

 

(Yes, if you really slow things down it will take longer for the memory to fill up, but it will still fill up eventually 😄 Reading 8M items with a 50 ms delay will take almost a work week!)

Message 5 of 8

@AnandR wrote:

What's the final data you want in memory and why does it all need to be in memory?

How can I achieve this? How do I bring a small chunk of data into memory while the rest stays on disk, then move the processed data to disk and pull unprocessed data into RAM? (Can you explain in terms of how to code it?)


Use a loop.  Inside the loop you read a chunk of the rows, for example 1000 rows.  You then process those 1000 rows and do whatever is needed with the results.  So each iteration of the loop will read, process, and then do something with the results.
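This read/process/write loop can be sketched in text form. Since LabVIEW code is graphical, here is a Python stand-in against a small in-memory SQLite table (the table and column names are invented for illustration; the real query and processing step would differ):

```python
import sqlite3

# Toy in-memory database standing in for the real one.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, value TEXT)")
conn.executemany("INSERT INTO records (value) VALUES (?)",
                 [(f"row-{i}",) for i in range(10_000)])

CHUNK = 1000               # rows held in memory at any one time
processed = 0
cur = conn.execute("SELECT id, value FROM records ORDER BY id")
while True:
    rows = cur.fetchmany(CHUNK)      # read one chunk of rows
    if not rows:
        break
    # Process the chunk (here: a trivial placeholder transformation).
    results = [(rid, val.upper()) for rid, val in rows]
    # ...write `results` to disk or back to the database here...
    processed += len(results)

print(processed)   # 10000
```

At any moment only one chunk (plus its results) is in memory, so peak usage is governed by `CHUNK`, not by the total row count.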


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 6 of 8

@AnandR wrote:

 

The length of each string is around 15-20 characters (array dimensions 8,000,000 × 6). The processing basically retrieves a particular column, modifies it based on some condition, and then replaces it (and similarly for the other columns and rows).

 


You have gigantic data structures, but your description is way too vague to give any advice. What kind of "Processing" is needed? Does the result depend on the entire columns or only on some local values?

 

Show us some typical data and how it needs to be processed. The final solution approach strongly depends on what needs to be done.

Message 7 of 8

**The length of each string is around 15-20 characters (array dimensions 8,000,000 × 6). The processing basically retrieves a particular column, modifies it based on some condition, and then replaces it (and similarly for the other columns and rows).**

 

If your processing step concerns a particular column, why are you storing the other columns in memory?  Are they required?  Modify your initial database query to return only the columns you actually need.  This will reduce the memory your LabVIEW application needs to process the data, since you aren't holding on to unused data.

 

Like Altenbach has been saying, you are far too vague to elicit any practical advice; we can only give general tips at this time.  Another route, similar to crossrulz's suggestion, is simply "don't query ALL the data at once": use a for loop to query chunks of the database, process, save, repeat.  This increases the number of database interactions, but significantly reduces memory usage.
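The "query in chunks" idea can also be done as one small query per loop iteration rather than one big cursor. A hedged Python/SQLite sketch using keyset pagination on the primary key (table and column names are invented; the real schema will differ):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, value TEXT)")
conn.executemany("INSERT INTO records (value) VALUES (?)",
                 [(f"row-{i}",) for i in range(5_000)])

CHUNK = 1000
last_id = 0
total = 0
while True:
    # One small query per chunk: resume where the previous chunk ended.
    rows = conn.execute(
        "SELECT id, value FROM records WHERE id > ? ORDER BY id LIMIT ?",
        (last_id, CHUNK)).fetchall()
    if not rows:
        break
    # ...process and save this chunk here...
    total += len(rows)
    last_id = rows[-1][0]    # remember where we stopped

print(total)   # 5000
```

Pagination on an indexed key stays fast as the offset grows, unlike `LIMIT ... OFFSET ...`, which rescans skipped rows on each query.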

Message 8 of 8