LabVIEW


CPU memory

Hi,

I am using a LabVIEW application for testing, where I am doing many file operations. My problem is that even after closing all the VIs, the memory used by LabVIEW stays almost the same; it only comes down when I exit LabVIEW.

 

Is there any function/palette to release all the memory used by LabVIEW?

 

Please suggest.

phani srikanth
Message 1 of 9

There are things you can do to reduce the memory usage of a LabVIEW program, but without seeing your code nobody can give you any detailed solutions.

========================
=== Engineer Ambiguously ===
========================
Message 2 of 9

Attached is a screenshot of the code.

Here I am trying to read *.csv files, decimate them (since they are large files), and load the result into a 2D array, which I will use for plotting.

 

Thanks in advance.

phani srikanth
Message 3 of 9

Well, I can't tell much from your JPG because it's not actual code. I see you are building a huge array, and arrays are stored entirely in memory. What is going on in your Decimate subVI?

Message 4 of 9

Hi RTSLVU,

 

It is the actual code. Yes, I am building a 2D array (max 100,000 x 61).

 

I have ~25,979,923 rows x 61 columns of data across all the files (refer to the screenshot).

Since I can't plot that much data, I am decimating it.

 

Decimate subVI:

- receives the 1D array of lines (comma-separated)

- decimates the 1D array by the specified scaling factor

- I am using "Spreadsheet String to Array" to convert the comma-separated lines into a 2D array
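Since LabVIEW is graphical, the subVI's logic can only be sketched here in text form. The following Python sketch is a rough analogue of the steps described above (the decimation factor and the sample data are illustrative, not from the actual VI):

```python
import csv

def decimate_lines(lines, factor):
    """Keep every `factor`-th line, analogous to the Decimate subVI."""
    return lines[::factor]

def lines_to_2d_array(lines):
    """Parse comma-separated lines into a 2D list of floats,
    roughly what "Spreadsheet String to Array" does."""
    return [[float(v) for v in row] for row in csv.reader(lines)]

# Illustrative data: 100 comma-separated lines, decimated by 10.
lines = ["%d,%d,%d" % (i, i * 2, i * 3) for i in range(100)]
kept = decimate_lines(lines, 10)   # 100 lines -> 10 lines
data = lines_to_2d_array(kept)
print(len(data), data[1])          # 10 [10.0, 20.0, 30.0]
```

Note that this keeps the decimated result in memory just as the VI does; the memory growth the thread is about comes from the full arrays, not from the parsing step itself.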

 

 

 

phani srikanth
Message 5 of 9

You have a data set with over 1,584,775,303 data points? 😲

 

I do not think LabVIEW is the proper language for manipulating a data set this large.

 

What exactly are you trying to do?

Message 6 of 9

🙂

I am running a 72-hour test that acquires 15 temperature channels and 12 current channels with a 10 ms data-log interval.

I am able to do this operation, but with some memory issues, which I am also trying to minimize.
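As a quick sanity check (not from the original posts), the log rate above is consistent with the row count mentioned earlier in the thread:

```python
hours = 72
samples_per_second = 100           # one logged row every 10 ms
rows = hours * 3600 * samples_per_second
print(rows)                        # 25920000, close to the ~25,979,923 reported
```

So the file sizes are exactly what one would expect from 72 hours at this rate; the memory pressure comes from holding that data in arrays, not from anything anomalous.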

 

 

phani srikanth
Message 7 of 9

You might want to consider using TDMS

File I/O palette --> TDM Streaming

and the TDM Excel Add-In to manage a dataset this large.

 

Message 8 of 9

Start by looking at two places:

 

http://www.ni.com/white-paper/3625/en/

http://www.ni.com/white-paper/6211/en/

 

One or both of those may solve what you need. I have worked with the "GigaLabVIEW" stuff before and found it to be... not fantastic; I use it as a last resort. Your best bet is to combine the In Place Element structure with Data Value References to prevent LabVIEW from duplicating your array in memory.

 

 

Message 9 of 9