LabVIEW


not enough memory to complete the operation

I need to put the VI into a bigger VI, but it always gives me the same error: "not enough memory to complete this operation". Does anyone have a way around it? I need 200,000 points to complete this operation.

Message 1 of 12

Hi engomar,

 

so you want to create an array of 321 × 200000 elements? And after some type conversions you need that array to hold CDB (complex double-precision) data?

Each CDB takes up 16 bytes of memory!

 

Ever typed this into your calculator?

321 × 200000 × 2 × 8 = 1027200000 bytes ≈ 980 MiB
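For illustration, the same estimate as a small Python sketch (assumption: each CDB element is two 8-byte doubles, i.e. 16 bytes):

```python
# Rough memory estimate for the array discussed above:
# a 321 x 200000 array of complex double-precision (CDB) values.
rows, cols = 321, 200_000
bytes_per_cdb = 2 * 8          # real + imaginary part, 8 bytes each

total_bytes = rows * cols * bytes_per_cdb
total_mib = total_bytes / 2**20

print(f"{total_bytes} bytes ~= {total_mib:.0f} MiB")  # 1027200000 bytes ~= 980 MiB
```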

 

Can your computer handle that much memory?

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 12

My computer has 8 GB of RAM, but I am using just a small portion of it. I heard that LabVIEW has a limitation on RAM access; is there a way to increase this limitation?

Second of all, whenever I change the data type to single precision, it always gives me a coercion dot.

Message 3 of 12

Hi engomar,

 

do you use LV32bit or LV64bit?

 

Using LV 32-bit you are still limited to less than 4 GB, even on a 64-bit OS. And you create several data copies in your code, requiring LabVIEW to allocate more than just one big memory block to handle all your data!

Best regards,
GerdW


Message 4 of 12

I use LV 32-bit. If I change the whole system to single precision, do you think it will work, and how?

0 Kudos
Message 5 of 12
(3,135 Views)

Hi engomar,

 

"change the whole system to single persion"

???

What do you want to change?

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 6 of 12

I think that if I change the whole exponential to single precision, then I reduce the memory size to half. What do you think?
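A quick check of that "half the memory" idea (assumption: LabVIEW's CDB is a complex of two 8-byte doubles, CSG a complex of two 4-byte singles):

```python
# Compare the footprint of the 321 x 200000 array in complex double
# precision (CDB) versus complex single precision (CSG).
rows, cols = 321, 200_000

cdb_bytes = rows * cols * (2 * 8)   # complex double precision (CDB)
csg_bytes = rows * cols * (2 * 4)   # complex single precision (CSG)

print(cdb_bytes // 2**20, "MiB vs", csg_bytes // 2**20, "MiB")  # 979 MiB vs 489 MiB
```

So single precision does halve the footprint of that one array, but note it halves only this array, not the copies made elsewhere in the code.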

Message 7 of 12

Here is the whole VI.

Message 8 of 12

Hi engomar,

 

well, it could reduce the memory footprint a little. You still have additional buffer allocations in your snippet, as you can easily check with the Tools → Profile → Show Buffer Allocations menu item!

 

(attached screenshot: check.png)

The 2π and "0" constants are set to SGL as well…

With 50k as the loop count the VI still needs ~440 MB…

 

Why do you need such a big array? Can't you work with smaller array(s)/subsets?

 

Edit:

After looking at your VI: so you want to do some math on the big array created before, in combination with 6 other big arrays? Do you really think setting that exp-array to complex SGL will help at all?

Please rethink your algorithm!

Work with smaller subsets of all your arrays!

Best regards,
GerdW


Message 9 of 12

Hi GerdW,

 

I tried to think of how to reduce my algorithm, but the problem is that it has to be 200,000 points, because it depends on another VI that my coworker made.

 

The VI accepts 100,000 without any problem. Is there a way to divide it into 2 groups and then add them together? Like maybe a for loop, or duplicating the VI.
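That divide-and-combine idea maps to a chunked loop. A minimal sketch in Python (the per-point math here is a hypothetical placeholder; in LabVIEW this would be a For Loop with a shift register accumulating the partial results):

```python
import math

# Sketch: process 200,000 points in 50,000-point chunks so only one
# chunk's worth of data is allocated at a time, then combine the
# partial results (here simply summed) into one small accumulator.
N_POINTS = 200_000
CHUNK = 50_000  # a size the VI is known to handle without the memory error

def process(start, stop):
    # placeholder for the real per-point math done in the VI
    return sum(math.sin(2 * math.pi * i / N_POINTS) for i in range(start, stop))

total = 0.0
for start in range(0, N_POINTS, CHUNK):
    total += process(start, min(start + CHUNK, N_POINTS))

# the chunked result matches processing all points at once
assert abs(total - process(0, N_POINTS)) < 1e-9
```

This only works when the combining step is something like a sum or concatenation of small results; if a later step needs the whole 200,000-point array at once, the chunks have to be carried through that step as well.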

Message 10 of 12