Memory allocation for an array of clusters

In fact I have two questions:

1) If I have an array of clusters with a million clusters inside it, but I am only using cluster #999,999 in this array, is LabVIEW intelligent enough to allocate memory just for cluster #999,999, or does it allocate memory for all one million positions based on the size of the one I'm using?

2) Based on the answer to #1, I want to know how to calculate the memory footprint of this array of clusters. Does a cluster have any overhead, or is its size just the size of its elements (e.g. 10 double numerics)?

Please let me know if I wasn't clear enough with my questions.
Message 1 of 5
I don't really understand: you have an array with clusters inside it.
If the array size is 1,000,000, LabVIEW will create it in memory as one block.

There is hardly any overhead.

You can calculate the total size by flattening the array to a string and then reading the size of that string.
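As a rough textual analogy of that idea (Python, not LabVIEW code; the 10-DBL cluster is just the example from the question, and real LabVIEW flattened data adds length headers for variable-sized types):

```python
import struct

# Hypothetical stand-in for one cluster that holds 10 DBL values
cluster = tuple(float(i) for i in range(10))

# Serialize the cluster's data, the same idea as "Flatten To String"
# (LabVIEW flattens numerics as big-endian, hence the ">")
flat = struct.pack(">10d", *cluster)

print(len(flat))               # 80 bytes of data per cluster
print(len(flat) * 1_000_000)   # rough data size of a 1,000,000-element array
```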

Ton

Message 2 of 5
Hi Eduardo,

1) When you have an array of a million elements, LV will reserve memory for all million elements.
2) Well, there certainly is some overhead, as LV needs to know what's inside the array/cluster. But for an array all elements are the same type, so LV only needs to store that type information once. On the other hand, the size of the elements may differ (e.g. sub-arrays or strings), which makes the memory footprint harder to calculate...

As long as you only use DBL values, you can easily estimate the memory allocation as (number of DBLs in the cluster) * (8 bytes per DBL) * (number of clusters in the array).
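As a quick sketch of that calculation (plain Python arithmetic, using the 10-DBL cluster and the million-element array from the question; the small per-array handle/header overhead is ignored):

```python
DBL_SIZE_BYTES = 8            # one LabVIEW DBL
doubles_per_cluster = 10      # example from the question
clusters_in_array = 1_000_000

data_bytes = doubles_per_cluster * DBL_SIZE_BYTES * clusters_in_array
print(data_bytes)             # 80_000_000 bytes of data
print(data_bytes / 1024**2)   # about 76.3 MiB, plus a small array header
```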
Best regards,
GerdW
Message 3 of 5

[Image: Pink vs brown]

A brown cluster (one containing only fixed-size numerics) will have a size that can be predicted.

A pink one (one containing variable-sized data such as strings) will not.
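A rough Python analogy of why (not LabVIEW code; the colors refer to LabVIEW cluster wire colors): a cluster of fixed-size numerics flattens to the same number of bytes every time, while a cluster containing a string does not.

```python
import struct

# "Brown"-style cluster: only fixed-size numerics -> flattened size is always the same
brown = (1.0, 2.0, 3.0)
print(len(struct.pack(">3d", *brown)))        # always 24 bytes

# "Pink"-style cluster: contains a string -> flattened size depends on the contents
for text in ("", "hello", "a much longer string"):
    flat = struct.pack(">I", len(text)) + text.encode()   # 4-byte length prefix + bytes
    print(len(flat))                          # 4, 9, 24 bytes - not predictable from the type alone
```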

Ben

Message 4 of 5
On Jul 13, 7:40 am, eduardo84 <x...@no.email> wrote:
> [original question quoted above; snipped]

LabVIEW is intelligent enough to use memory based on what the user is actually using, unless you initialize the array with a million clusters.
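As a rough Python analogy of that point (not LabVIEW code; the structures here are hypothetical): storing only the one element you actually use versus initializing the full million-element array.

```python
N = 1_000_000        # array size from the question

# Sparse alternative: keep only the cluster you actually use, keyed by its index
sparse = {999_999: tuple(0.0 for _ in range(10))}   # memory for a single 10-DBL cluster

# Full initialization: every position gets its own cluster up front
full = [tuple(0.0 for _ in range(10)) for _ in range(N)]

print(len(sparse), len(full))   # 1 vs 1000000 clusters held in memory
```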

LabVIEW also ships with good example VIs for calculating the memory used by any specific VI.

Message 5 of 5