
Performance and memory management issues

I have a data set I am reading into an array, performing computations on that data to create a second array, and then putting the data through a Butterworth filter to create a third array. 

 

When I perform the operation on 50K data points, it completes in about 15 seconds with no issues. When I bump the resolution up to 500K, the operation still completes but takes 30 minutes or so. When I bump the resolution up to 50M data points (where I need to be), I get a “Not Enough Resources” error and LabVIEW crashes altogether. The crash happens while the first array is being built.

 

So, my question is in two parts:

 

1: How can I address the memory management issue LabVIEW seems to be hitting on large data sets?

 

2: How can I improve performance?

 

I chopped out the relevant parts of the VI and screenshotted them. Image is attached.

 

Any help will be appreciated. 

Message 1 of 4

Local variables make a copy of the data. Also try to get rid of the Build Array: growing an array inside a loop forces a reallocation and copy on every iteration.

http://zone.ni.com/devzone/cda/tut/p/id/3625
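LabVIEW block diagrams don't paste as text, but the trade-off is the same in any language. A rough Python/NumPy analog of the two patterns (the data and sizes here are made up, not taken from the attached VI):

```python
import numpy as np

# Anti-pattern: the text-language analog of Build Array inside a loop.
# Every iteration allocates a brand-new array and copies all existing
# elements into it, so run time and peak memory blow up as n grows.
def grow_by_append(samples):
    out = np.empty(0)
    for s in samples:
        out = np.concatenate((out, [s]))  # full reallocation + copy
    return out

# Preferred: allocate once up front (Initialize Array before the loop)
# and write in place (Replace Array Subset inside it), or let an
# auto-indexing output tunnel size the array for you.
def preallocate(samples):
    out = np.empty(len(samples))
    for i, s in enumerate(samples):
        out[i] = s  # in-place write, no reallocation
    return out
```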

Message 2 of 4
Here it is with some optimizations added (the biggest by far are not using local variables or Build Array in the for loop). I'm not sure it works exactly the same, but it should give you some ideas. As a general rule, avoid local variables, and use data flow rather than sequence structures to determine the order of operations (though the latter can't always be avoided).
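As a text-form illustration of that dataflow rule, here is a sketch in Python/SciPy of the same three-array pipeline; the scaling step and the filter order/cutoff are invented placeholders, not values from the attached VI:

```python
import numpy as np
from scipy.signal import butter, sosfilt

# Dataflow style: each stage's output wires straight into the next, so
# execution order falls out of the data dependencies (no sequence
# structure needed) and no intermediate result sits in a local variable
# where every read would cost another full copy.
def process(raw):                        # raw: the "first array"
    computed = raw * 2.0                 # the "second array" (placeholder math)
    sos = butter(4, 0.1, output="sos")   # placeholder Butterworth design
    return sosfilt(sos, computed)        # the "third array"
```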
Message 3 of 4
Thanks Matt W!  Moving the array operation outside of the for loop did the trick! I can now process 16M records in about 2 minutes. I still get a resource error when I go larger than 16M, but at that resolution I can just chunk the data as a workaround (roughly the pattern sketched below).
 
Thanks again!
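For anyone hitting the same ceiling, the chunking workaround looks roughly like this Python sketch (the chunk size echoes the 16M figure above; the file layout, dtype, and the process() stage are assumptions):

```python
import numpy as np

CHUNK = 16_000_000  # the largest size that fit in memory above

def process(chunk):
    return chunk * 2.0  # stand-in for the computation + filter stages

def run(path, total_samples, dtype=np.float64):
    """Stream a large binary file through the pipeline one chunk at a time."""
    with open(path, "rb") as f:
        done = 0
        while done < total_samples:
            n = min(CHUNK, total_samples - done)
            chunk = np.fromfile(f, dtype=dtype, count=n)
            result = process(chunk)
            # ... write `result` to an output file or reduce it here ...
            done += n
```

One caveat: an IIR filter such as the Butterworth carries internal state, so to avoid glitches at chunk boundaries that state has to be carried from one chunk to the next (the init/cont input on LabVIEW's filter VIs, or the zi argument to SciPy's sosfilt).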
Message 4 of 4