LabVIEW


memory deallocation

Hello folks,

 

I have a memory issue with large data. I read 6 binary files, each containing 2 cr samples, and iterate a loop 6 times to perform the needed operation. The first iteration takes about 1.2 GB of memory and then frees it, the second again uses about 1.2 GB and frees it, but on the third iteration it takes 1.5 GB and gives an error: "LabVIEW ran out of memory".

 

To be clear, this is not the pop-up message that says "not enough memory to perform this operation", but an error saying LabVIEW ran out of memory. That pop-up message only appears when the code uses around 2.6 GB of memory.

 

Kindly suggest some ideas for deallocating memory.

Also, what could be the reason behind the error?

 

Your help will be truly appreciated. 

CLAD
Passionate for LabVIEW
Message 1 of 13

How many are "2 cr samples"? I am not familiar with that unit. What is the datatype?

 

Can you show us some code? What kind of operations are you doing with the data?

Message 2 of 13

Hello altenbach,

Thank you very much for your concern.

 

By 2 cr, I meant 2 crore samples. I get this data as DBL (double), convert it to SGL (single), and then append the columns into Excel.
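For a rough sense of scale, here is a back-of-the-envelope estimate (my own illustrative numbers in Python, not taken from the attached VIs); 2 crore is 20 million samples:

samples = 20_000_000            # 2 crore samples per file

dbl_bytes = samples * 8         # DBL is 8 bytes per sample -> ~160 MB
sgl_bytes = samples * 4         # SGL is 4 bytes per sample -> ~80 MB

print(dbl_bytes / 1e6, sgl_bytes / 1e6)   # 160.0 80.0 (MB)

# Every extra copy held at the same time (the raw string, a DBL array,
# a SGL array, the report buffer) adds another chunk of that size,
# which is how the usage can climb past 1 GB.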

 

For reference, I have attached the code herewith.

 

I changed the attachment's extension from .rar to .vi; please change it back to .rar to extract all the VIs.

CLAD
Passionate for LabVIEW
Message 3 of 13

OK, crore=10M. Never heard of it...

 

You need to tell us the name of the top-level VI, especially since most of them have "(subVI)" in the name. Very confusing.

 

Here is my 30-second analysis.

 

In "read binary file", wire an empty 2D SGL array to the array type input of "Spreadsheet String to Array". That way it converts directly to SGL, saving you a lot of memory. (Converting to DBL first uses twice as much memory, and then converting to SGL uses even more, because that requires yet another allocation.)
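Here is a rough text-language analogy of that point (NumPy is used purely for illustration; the actual fix is the wiring change on the LabVIEW diagram):

import numpy as np

text = "1.0\t2.0\t3.0\n4.0\t5.0\t6.0\n"   # stand-in for the spreadsheet string

# Wasteful: parse to DBL (float64) first, then convert to SGL (float32).
# Both arrays exist at the same time, so peak array memory is roughly
# 3x the SGL-only path (8 + 4 bytes per sample instead of 4).
dbl = np.genfromtxt(text.splitlines(), dtype=np.float64)
sgl_from_dbl = dbl.astype(np.float32)

# Better: parse directly to SGL, analogous to wiring an empty 2D SGL array
# to the "array type" input of "Spreadsheet String to Array".
sgl_direct = np.genfromtxt(text.splitlines(), dtype=np.float32)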

 

In "save to excel", remove all these "request deallocation" functions. They don' t do anything here. The sequence structure is not needed.

 

In "append timestamp", remove the breakpoint.

 

Disable debugging for all VIs and make sure to keep the front panels of all subVIs closed while the code runs.

 

Sorry, I never use all these Excel functions, but they seem like a lot of work. Try the above suggestions first and see how it goes.

 

I will look at it tomorrow...

Message 4 of 13

Oops! I forgot to mention the name of the top-level VI. It's "save to excel(subvi).vi".

 

Yeah, the Excel functions are really useful. 🙂

 

Your suggestions really helped with the memory, but I am still stuck with the same error.

I have attached a snapshot of the error herewith.

 

It says the possible reason could be the following:

 

LabVIEW: (Hex 0x8007000E) Ran out of memory.
=========================
NI System Configuration: (Hex 0x8007000E) Out of memory.
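If it helps, 0x8007000E looks like the standard Windows out-of-memory HRESULT; a quick illustrative decode (Python, just to show how I read the code):

hr = 0x8007000E
severity = (hr >> 31) & 0x1      # 1  -> failure
facility = (hr >> 16) & 0x1FFF   # 7  -> FACILITY_WIN32
code = hr & 0xFFFF               # 14 -> ERROR_OUTOFMEMORY (E_OUTOFMEMORY)
print(severity, facility, code)  # 1 7 14

So it seems the operating system itself refused an allocation, not just LabVIEW.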

CLAD
Passionate for LabVIEW
Message 5 of 13

I just don't believe that Excel reports were ever meant to deal with such large amounts of data. It simply does not feel like the right tool.

Message 6 of 13

Excel only supports about 1 million rows (1,048,576 per worksheet), could that be it?

/Y

G# - Award winning reference based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified-LabVIEW-Developer
Message 7 of 13

Hi altenbach and Yamaeda,

 

My file has only 6 lakh (600,000) rows. My code worked once, both logically and practically, but now somehow it gets an error from the third iteration.

 

I shuffled the reading order of these files to make sure there is no error with a particular file. So now I am sure that every file gets read in the first two iterations, but at the end of the 2nd iteration "Append Text Table to Report.vi" gives an error.

 

I made some observations.

I insert 2 columns into the table at a time. The moment my code inserts data into a column, the inserted columns are marked with a green mark at the top-left corner of each cell. The next time I append data, these marks get removed and the next 2 columns get inserted with the same mark, and this goes on for the entire file.

 

What I observed in the last 6 channels of the 2nd iteration is that the columns with green marks do not get removed. For reference, I have attached a screenshot of the file herewith.

 

Kindly advise me on this observation or on the error I get.

 

Thank you very much for your concern. 

CLAD
Passionate for LabVIEW
Message 8 of 13

Crosspost.

 

And what is with all these strange units?  lac? cr? crore?

 

"My car gets 40 rods to the hogshead and that's the way I like it."

Message 9 of 13

Sorry for the trouble,

 

I am learning to use million, billion and trillion as units. 🙂

 

And I didn't get your line.

CLAD
Passionate for LabVIEW
Message 10 of 13