
Not enough memory

Solved!

Hello, I have a problem with my VI. After it runs for 24 hours, it tells me "Not enough memory to complete this operation."

 

I checked my code and saw that my sensor VI is making my RAM usage rise.

 

When I put a 500 ms delay in the sensor VI, the RAM still rises, but I get the message after 40 hours instead.

 

What can I do to fix it, and what am I missing here?

 

Thanks

0 Kudos
Message 1 of 11
(2,780 Views)

That VI cannot run for 24 hours since it does not have a loop. The only way it can run more than once is with the Run Continuously button, and that should never be used except for special debugging reasons. Put a while loop around your code so that you are not constantly creating a task. Then think about getting rid of the DAQ Assistant so that you only have a DAQmx Read inside the loop.

0 Kudos
Message 2 of 11
(2,777 Views)

Sorry, maybe I didn't explain myself.

 

This is just my subVI.

In my main VI I have a while loop that calls the sensor VI on every iteration.

 


What is the problem with the DAQ Assistant?

Why is DAQmx Read better?

0 Kudos
Message 3 of 11
(2,769 Views)

The problem with the DAQ Assistant is that it keeps recreating a task each time it is called.  You should set up your task once at the beginning (before the main loop), read inside of the loop (as much as you want), and then close out the task at shutdown (after the main loop).  This will help with some memory leaks.
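LabVIEW is graphical, so there is no text code to paste, but the difference between the two patterns can be sketched in Python. The `Task` class here is a hypothetical stand-in for a DAQmx task, not the real driver API; the only point is *where* task creation happens relative to the loop.

```python
# Sketch of the two patterns, using a hypothetical Task class in place of a
# real DAQmx task (no hardware needed).

class Task:
    created = 0                      # counts how many tasks get built

    def __init__(self):
        Task.created += 1            # expensive: stands in for driver setup

    def read(self):
        return 1.23                  # stands in for one read

    def close(self):
        pass                         # stands in for releasing the task


def leaky(iterations):
    """DAQ Assistant style: a new task is created on every call."""
    for _ in range(iterations):
        t = Task()                   # re-created each iteration
        t.read()
        t.close()


def fixed(iterations):
    """Set up once, read in the loop, clear at shutdown."""
    t = Task()                       # before the main loop
    for _ in range(iterations):
        t.read()                     # only the read lives in the loop
    t.close()                        # after the main loop


Task.created = 0
leaky(1000)
leaky_count = Task.created           # one task per iteration

Task.created = 0
fixed(1000)
fixed_count = Task.created           # a single task for the whole run
print(leaky_count, fixed_count)
```

Even when each task is closed correctly, the per-iteration setup cost adds up; when cleanup is imperfect, it becomes the steady RAM growth described above.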


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
0 Kudos
Message 4 of 11
(2,761 Views)
Solution
Accepted by topic author Johnny1986

You have more control and less overhead with the lower-level functions. The DAQmx Read for digital ports is trivial. In the main VI, have the DAQmx Create Channel and Start Task. Pass the task to the subVI with just a DAQmx Read. When the main VI finishes, call DAQmx Clear Task.
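That ownership split (main VI creates, starts, and clears the task; the subVI only reads the task handle it is handed) can be sketched in Python. Again the `Task` class and the channel name `"Dev1/port0"` are hypothetical stand-ins, not the DAQmx API:

```python
# Main program owns the task lifetime; the "subVI" only reads a handle it
# receives, mirroring wiring the task refnum into the subVI in LabVIEW.

class Task:
    def __init__(self, channel):
        self.channel = channel       # stands in for DAQmx Create Channel
        self.open = False

    def start(self):                 # stands in for DAQmx Start Task
        self.open = True

    def read(self):                  # stands in for DAQmx Read
        assert self.open, "task must be started before reading"
        return 0.5

    def clear(self):                 # stands in for DAQmx Clear Task
        self.open = False


def sensor_subvi(task):
    """The subVI: receives the task and only reads it."""
    return task.read()


task = Task("Dev1/port0")            # hypothetical channel name
task.start()
samples = [sensor_subvi(task) for _ in range(5)]   # the main loop
task.clear()
print(samples)
```

The subVI never creates or destroys anything, so calling it millions of times costs nothing extra.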

 

You could also have a problem with memory usage in the main VI. Since you did not attach that, it's impossible to say.

0 Kudos
Message 5 of 11
(2,759 Views)

Thanks

 

My main VI is OK; when I disable the sensor subVI, the RAM stops rising.

0 Kudos
Message 6 of 11
(2,752 Views)

Hi, I am also receiving a "Not enough memory" error, so I thought I would join this conversation instead of starting a new thread. I have used LabVIEW a couple of times in the past, but I am definitely still a rookie. I am trying to read a signal from a microphone through an NI 9233 and write the data to a text file. However, I cannot even record 10 minutes of data without receiving the "Not enough memory" error. I am using a DAQ Assistant, an Append Signals Express VI, and a shift register inside a while loop to compile my data into a single signal. The DAQ Assistant sampling rate is 50 kHz. I have attached my VI and an image of the error I am receiving (I received the same error for NoiseTest2 as for NoiseTest1).

I know crossrulz said that the DAQ Assistant recreates the task each time. I will try using DAQmx functions next. But I still want to know if there is anything else glaringly wrong with my VI, because it seems like the DAQ Assistant should be able to handle a short timeframe of data sampling.

Also, I am using an evaluation version of LabVIEW 2012, so I am wondering if that could greatly hamper the memory capabilities.

Thanks,
Alex

0 Kudos
Message 7 of 11
(2,589 Views)

You are reading 10,000 samples at a time at a rate of 50 kHz, so 50k samples per second.  After 10 minutes that is 30 million samples.  The dynamic datatype hides some details, but I'll assume the underlying data is double precision, so 8 bytes per sample.  Now you are talking 240 million bytes.  That is a pretty good amount, and the dynamic datatype probably adds a lot of additional bytes as overhead.
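The arithmetic above is easy to verify:

```python
# Back-of-the-envelope check of the numbers in this post.
rate = 50_000                   # samples per second (50 kHz)
seconds = 10 * 60               # ten minutes
samples = rate * seconds        # total samples accumulated
bytes_per_sample = 8            # double precision
total_bytes = samples * bytes_per_sample
print(samples, total_bytes)     # 30 million samples, 240 million bytes
```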

 

You have two problems with your code.  You keep appending the new data onto the old, continually growing that dynamic data.  And since it is on an uninitialized shift register, any data left in that shift register from earlier runs of your VI is tacked on as well.

 

The only reason you are accumulating this data is to write it out to a file in one big swoop once your loop ends (assuming you haven't already run out of memory).

 

You need to use the producer/consumer architecture to pass the data off to a file-writing consumer loop, rather than accumulating all of that data in the dynamic data type and the shift register.

 

In addition, you could possibly have the same problem as the original poster in this thread: the DAQ Assistant might be opening and closing the DAQ task repeatedly in the loop.  However, with only 10 minutes of runtime, I believe your first problem is the one I stated above.
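In LabVIEW the producer/consumer pattern is built from two parallel loops joined by a queue; a minimal text-code sketch of the same idea (with a list of floats standing in for each DAQ read, and an in-memory buffer standing in for the data file) looks like this:

```python
# Minimal producer/consumer sketch: the producer stands in for the DAQ read
# loop, the consumer owns the file, and a queue connects them so no
# ever-growing array builds up in the acquisition loop.
import io
import queue
import threading

q = queue.Queue()
out = io.StringIO()                  # stands in for the text file on disk


def producer(chunks):
    for i in range(chunks):
        data = [float(i)] * 4        # stands in for one DAQ read
        q.put(data)                  # hand off; nothing accumulates here
    q.put(None)                      # sentinel: acquisition finished


def consumer():
    while True:
        data = q.get()
        if data is None:             # sentinel seen: stop writing
            break
        out.write(" ".join(f"{x:.3f}" for x in data) + "\n")


t = threading.Thread(target=consumer)
t.start()
producer(100)                        # "acquire" 100 chunks
t.join()
lines = out.getvalue().splitlines()
print(len(lines))                    # every chunk reached the file
```

The acquisition loop's memory footprint stays constant no matter how long it runs, because each chunk is handed to the consumer and released.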

0 Kudos
Message 8 of 11
(2,585 Views)

Alex,

 

First, do the math: 10 minutes * 60 seconds * 50 kHz * 8 bytes/sample = 240 MB.

 

Next. How many copies of that data do you have in memory? The shift register with the Append Signals VI will be the killer. Arrays in LabVIEW must reside in contiguous memory locations.  Each time a new signal is appended, the array gets bigger. New memory must be allocated for the larger array. Most likely long before you have used up all available memory, there will not be a contiguous block large enough for the new array and you get the message.

 

There is information in the help file and on the Forums about managing large datasets.

 

Most likely you should write the data to the file in smaller blocks. Either write every time you read the DAQ Assistant or accumulate some reasonable amount of data, write that and clear the buffer.  It is better to pre-allocate memory and re-use it.  The Append Signals VI does not appear to make any attempt to do this, so you would need to get rid of the Dynamic Data Type and work with the arrays directly. Initialize Array and Replace Array Subset will become your new best friends.
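The preallocate-and-reuse idea can be sketched with plain Python lists: allocate a fixed buffer once, drop each read into it with slice assignment (the analogue of Replace Array Subset on an Initialize Array buffer), and flush to the file whenever the buffer fills. The chunk sizes here are arbitrary illustration values, and an in-memory buffer stands in for the data file:

```python
# Preallocated, reused buffer with block writes instead of endless appending.
import io

CHUNK = 5                                  # samples per simulated read
BUFFER_CHUNKS = 4                          # flush after this many reads
buffer = [0.0] * (CHUNK * BUFFER_CHUNKS)   # allocated once, reused forever

out = io.StringIO()                        # stands in for the data file
filled = 0                                 # chunks currently in the buffer
for read_no in range(10):                  # 10 simulated DAQ reads
    chunk = [float(read_no)] * CHUNK
    start = filled * CHUNK
    buffer[start:start + CHUNK] = chunk    # replace in place, don't append
    filled += 1
    if filled == BUFFER_CHUNKS:            # buffer full: write it out, reuse it
        out.write(" ".join(map(str, buffer)) + "\n")
        filled = 0

if filled:                                 # flush any leftover partial buffer
    out.write(" ".join(map(str, buffer[:filled * CHUNK])) + "\n")

written_lines = out.getvalue().splitlines()
print(len(written_lines))
```

Memory use is bounded by the buffer size rather than the run length, which is exactly what the ever-growing Append Signals array fails to do.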

 

Lynn

0 Kudos
Message 9 of 11
(2,583 Views)

I'm in full agreement with RavensFan.  You should be using a Producer/Consumer architecture.  The idea is to put the file write into another loop and use a queue to send the data to that loop to write to the file.  This way you don't have to keep adding on to a huge array and the write process is done in parallel.


0 Kudos
Message 10 of 11
(2,553 Views)