03-22-2017 07:29 PM
I am trying to write a large buffer of 100 million data points to DAQmx and then output it on the AO of the NI USB 6001 Module using DAQmx in Finite Samples Mode. I have realized that if I interrupt the task by stopping and clearing the task, DAQmx does not release the memory; therefore, if I try the second time to start the task and write a new set of 100 million data points to the AO, I get the following error "Not enough memory to complete this operation".
Can someone explain why the memory is not released after the task is cleared, and is there any way I can force DAQmx to release the memory after I clear the task?
03-23-2017 07:50 AM
Please post your VI. The issue is most likely in your own code, caused by memory fragmentation.
Also, please verify whether you are using 32-bit LV. In that environment, allocating arrays with a size of 800 MB is a challenge (depending on your coding).
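To put rough numbers on that (a back-of-envelope check, not anything LabVIEW-specific), here is the arithmetic behind the 800 MB figure:

```python
# A 32-bit Windows process has roughly 2 GB of usable address space,
# and a LabVIEW array needs a single *contiguous* block of it.
samples = 100_000_000
bytes_per_dbl = 8                          # LabVIEW DBL = 64-bit float
one_copy_mb = samples * bytes_per_dbl / 1e6
print(f"one waveform copy: {one_copy_mb:.0f} MB")

# Building the waveform and then handing it to DAQmx Write typically
# means at least two copies alive at once, which already approaches
# the whole 32-bit address space before fragmentation is considered.
print(f"two copies in flight: {2 * one_copy_mb:.0f} MB")
```

So even one extra copy, or a fragmented heap with no contiguous 800 MB hole, is enough to trigger the "Not enough memory" error on the second run.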
03-23-2017 10:36 AM
Dunno if this works for you, but it is probably way better to stream the data from disk than to load all 100,000,000 points at the beginning. The 6001 seems to have a 2000-point FIFO for AO tasks, so I dunno how your VI works, but streaming from the software buffer to the FIFO is happening already. So you just need to stream from disk to software to FIFO and you'll never have memory issues.
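The chunked-streaming idea can be sketched like this (plain Python, no hardware; `generate_chunk` is a hypothetical stand-in for reading the next block from disk, and the chunk size is an arbitrary choice):

```python
import numpy as np

CHUNK = 1_000_000      # points per write: ~8 MB in memory instead of 800 MB
TOTAL = 10_000_000     # scaled down from 100M just for this sketch

def generate_chunk(start, n):
    # Stand-in for reading the next block from a binary file that was
    # opened once and is read sequentially.
    t = np.arange(start, start + n, dtype=np.float64)
    return np.sin(2 * np.pi * t / 1000.0)

written = 0
while written < TOTAL:
    n = min(CHUNK, TOTAL - written)
    block = generate_chunk(written, n)
    # In the real task this is where DAQmx Write would go; the driver
    # then refills the device FIFO (2000 points on the 6001) from this
    # much smaller software buffer.
    written += n

print(written)  # 10000000
```

Only one chunk is ever resident at a time, so peak memory stays flat no matter how many total points you output.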
03-23-2017 10:46 AM
@pegahm wrote:
I am trying to write a large buffer of 100 million data points to DAQmx and then output it on the AO of the NI USB 6001 Module using DAQmx in Finite Samples Mode. I have realized that if I interrupt the task by stopping and clearing the task, DAQmx does not release the memory; therefore, if I try the second time to start the task and write a new set of 100 million data points to the AO, I get the following error "Not enough memory to complete this operation".
If someone can explain why the memory is not released after the task is cleared and if there is anyway I can force DAQmx to release the memory after I clear the task.
Have you tried to skip killing the task and just re-using it?
Ben
03-23-2017 12:31 PM - edited 03-23-2017 12:34 PM
I have attached three cases. I have two tasks: one is the AO task and the other is just a counter task. The problem concerns the AO task. The tasks are created in the Init Tasks case, which proceeds to the Start Tasks case where the tasks are started. Upon user request, the Stop Tasks case is executed to stop and clear the tasks. If the user decides to start another task to output another round of data to AO, these steps are repeated in the same order.
When I monitor LabVIEW's memory usage through Task Manager, I see that after I stop the task, the memory used by LabVIEW does not drop back to what it was prior to the first call to output on AO.
To answer your last question, yes, I am using 32-bit LV.
03-23-2017 02:25 PM
Yes, I still get the same error.
03-23-2017 03:51 PM
You haven't tried Ben's suggestion yet, have you? I think you'll find it to work.
Specifically, when ready to end the task call *only* DAQmx Stop and do not call DAQmx Clear. On your next iteration when ready to run the task again, call only DAQmx Write and DAQmx Start using the same task refnum.
FWIW, I agree with majoris as well that you'd be doing yourself a big favor by feeding the task with data from disk a little at a time rather than all at once.
-Kevin P
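To make the suggested call order concrete, here is a runnable toy sketch. The `MockTask` class is hypothetical and only models buffer ownership; it is not the DAQmx API. The point is the ordering: configure once, then each run is just Write → Start → Stop, and Clear happens only when you are completely done.

```python
class MockTask:
    """Toy stand-in for a DAQmx AO task, modeling buffer lifetime only."""
    def __init__(self):
        self.buffer = None          # create/configure the task once
        self.calls = []

    def write(self, data):          # "DAQmx Write": (re)fills the output buffer
        self.buffer = list(data)
        self.calls.append("write")

    def start(self):                # "DAQmx Start"
        self.calls.append("start")

    def stop(self):                 # "DAQmx Stop": task stops, buffer stays owned
        self.calls.append("stop")

    def clear(self):                # "DAQmx Clear": releases the buffer for good
        self.buffer = None
        self.calls.append("clear")

task = MockTask()                   # created and configured one time
for run in range(2):                # two output runs on the same task refnum
    task.write([run] * 4)
    task.start()
    task.stop()                     # stop only; do NOT clear between runs
task.clear()                        # clear once, at application exit

print(task.calls)
# ['write', 'start', 'stop', 'write', 'start', 'stop', 'clear']
```

Because the task object survives between runs, the second Write reuses the same buffer allocation instead of asking the OS for a fresh contiguous 800 MB block.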