
LabVIEW


Limitation of data value references (DVRs) in the system

Hi Everyone,

 

I have to implement a feature that uses a tree (DVR-based), and I just realized that I can't create more than 2^20 DVRs, because LabVIEW throws a "Memory full" error. Here is a short discussion: forum link.

 

Is it really a LabVIEW limitation? It seems like a very hard limit to me 😞

 

Thanks for the help,

 

Balint

Message 1 of 16

In reading that post, I think AQ answers your question.  What more do you need?

 


@AristosQueue (NI) wrote:
I haven't dug into all the details of the algorithm, but the result is that you can only have 1,048,576 refnums of the same type open at any given time. After that, we can't allocate more until you free some -- you'll still get a unique refnum when you do this. 
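A hypothetical sketch (in Python, since the details of LabVIEW's refnum allocator are not public) of how a fixed-width refnum could cap out at 2^20 live references while still guaranteeing uniqueness after a slot is reused: reserve 20 bits for a slot index and use the remaining bits as a per-slot "generation" counter. All names and the bit layout here are made up for illustration.

```python
# NOT LabVIEW's actual implementation -- an illustrative encoding only.
INDEX_BITS = 20
MAX_LIVE = 1 << INDEX_BITS  # 1,048,576 -- matches the limit AQ describes

def make_refnum(slot: int, generation: int) -> int:
    """Pack a slot index and a generation counter into one integer refnum."""
    if not 0 <= slot < MAX_LIVE:
        # Analogous to the "Memory full" error: no free slots left.
        raise MemoryError("no free refnum slots")
    return (generation << INDEX_BITS) | slot

def split_refnum(refnum: int) -> tuple[int, int]:
    """Recover (generation, slot) from a packed refnum."""
    return refnum >> INDEX_BITS, refnum & (MAX_LIVE - 1)

# Reusing slot 5 after freeing it still yields a unique refnum,
# because the generation counter has been bumped:
r1 = make_refnum(slot=5, generation=0)
r2 = make_refnum(slot=5, generation=1)
assert r1 != r2 and split_refnum(r1)[1] == split_refnum(r2)[1]
```

Under this kind of scheme the cap is a consequence of the encoding, not of how much RAM is free, which would explain why the error appears even when overall memory use is low.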

 

aputman
------------------
Heads up! NI has moved LabVIEW to a mandatory SaaS subscription policy, along with a big price increase. Make your voice heard.
Message 2 of 16

What I would like to know: 🙂

  • why this limitation exists
  • how I could work around it
  • whether there is any chance it will be changed in the future
Message 3 of 16

@ViltBalint wrote:

What I would like to know: 🙂

  • why this limitation exists  Because computers have limited resources.
  • how I could work around it  Don't know; you'll have to show us what you are trying to do.
  • whether there is any chance it will be changed in the future  Maybe with a change in OS, like going to 64-bit. Otherwise, I doubt it.

This is where we start asking what you are really trying to do. So far your post only talks about using a tree.

A million different references is quite a lot. Without knowing exactly how you are using a million references in your tree, it is hard to suggest a way to work around the limitation or a better alternative for your architecture.

Message 4 of 16

A few comments on your answers:

  • why this limitation exists  Because computers have limited resources.
    • Quite a general statement.
    • I think this limitation exists in the interest of an efficient garbage collector (?), but then why does it throw a "Memory full" error (LabVIEW's overall memory consumption was only ~300 MB at the time)?
  • whether there is any chance it will be changed in the future  Maybe with a change in OS, like going to 64-bit. Otherwise, I doubt it.
    • Could you explain why you said that? It seems to me that this is independent of the operating system.

So, you asked what I am trying to do. Generally, I would like to build a hierarchical model that supports efficient navigation among the model elements. Currently it seems our application has at most ~100K model elements, but that depends on our users, and in the future it is not inconceivable that it will grow by an order of magnitude. Just think about a real-life model from the AUTOSAR domain, which may contain 10M+ elements. I know there are other approaches, e.g. I could store my elements in a simple array, or every element could contain arrays of indices instead of pointers, but in that case I would lose my flexible software architecture. A simple example: I would like to do something on a model element. If I have a "next" edge on every model element, I don't have to traverse part of my model, and I don't need to store a lot of indexes recording where my element was in the "tree".

In a nutshell, this is my problem. Maybe I have overlooked something simple, so if you have any (even architectural) idea, I would be grateful.
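The array-of-indices alternative mentioned above can be sketched (in Python, as a stand-in for a LabVIEW array of clusters) like this: the whole tree lives in one array, so only a single DVR is needed for the structure, and each node carries integer indices instead of references. The field names here are illustrative only.

```python
from dataclasses import dataclass

NIL = -1  # sentinel for "no link" (LabVIEW has no null reference for indices)

@dataclass
class Node:
    value: str
    parent: int = NIL
    first_child: int = NIL
    next_sibling: int = NIL  # the "next" edge from the post

nodes: list[Node] = []  # the entire tree: one array, one DVR in LabVIEW

def add_node(value: str, parent: int = NIL) -> int:
    """Append a node and link it as the newest child of `parent`."""
    idx = len(nodes)
    nodes.append(Node(value, parent=parent))
    if parent != NIL:
        # Prepend to the parent's child list via the sibling chain.
        nodes[idx].next_sibling = nodes[parent].first_child
        nodes[parent].first_child = idx
    return idx

def children(idx: int):
    """Walk the sibling chain of a node's children."""
    c = nodes[idx].first_child
    while c != NIL:
        yield nodes[c].value
        c = nodes[c].next_sibling

root = add_node("C")
a = add_node("A", parent=root)
b = add_node("B", parent=root)
print(list(children(root)))  # ['B', 'A'] (newest child first)
```

The trade-off the poster worries about is real: indices must be kept valid across deletions (e.g. with a free list), which is exactly the bookkeeping that per-node DVRs would otherwise do for you.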

 

Thanks,

Balint

 

 

 

Message 5 of 16

So, the "what" is answered. Your problem must be in the HOW.

To fix the how, we need code.


"Should be" isn't "Is" -Jay
Message 6 of 16

@ViltBalint wrote:

So, you asked what I am trying to do. Generally, I would like to build a hierarchical model that supports efficient navigation among the model elements. Currently it seems our application has at most ~100K model elements, but that depends on our users, and in the future it is not inconceivable that it will grow by an order of magnitude. Just think about a real-life model from the AUTOSAR domain, which may contain 10M+ elements. I know there are other approaches, e.g. I could store my elements in a simple array, or every element could contain arrays of indices instead of pointers, but in that case I would lose my flexible software architecture. A simple example: I would like to do something on a model element. If I have a "next" edge on every model element, I don't have to traverse part of my model, and I don't need to store a lot of indexes recording where my element was in the "tree".

 


Sorry, I don't understand your lingo, but what is "the model"? Are the elements static (only acting as a lookup table) or dynamic (inserting/deleting elements/branches at any time, etc.)? What is the datatype of a "model element"? What defines the relationships? How do you access the elements?

 

Depending on what you need, variant attributes might be a solution. They are implemented as a red-black tree with O(log N) search/insert/delete complexity. (See also the discussions of this idea.)

Message 7 of 16

@ViltBalint wrote:

A few comments on your answers:

  • why this limitation exists  Because computers have limited resources.
    • Quite a general statement.
    • I think this limitation exists in the interest of an efficient garbage collector (?), but then why does it throw a "Memory full" error (LabVIEW's overall memory consumption was only ~300 MB at the time)?
  • whether there is any chance it will be changed in the future  Maybe with a change in OS, like going to 64-bit. Otherwise, I doubt it.
    • Could you explain why you said that? It seems to me that this is independent of the operating system.


It is not a "general statement". It is an accurate statement. If a computer didn't have limited resources, then it would have unlimited resources. Do you think computers have unlimited resources? There is a finite amount of memory. There is a finite amount of hard drive space. There are limits on the numbers used to address those resources.

 

I don't know what your comment about "inefficient garbage collection" has to do with anything.  That is wild speculation on your part.  Whether the garbage collection you are speculating about is perfectly efficient, or non-existent because there is no garbage to collect, there are still limitations on what a computer can do.

 

It has everything to do with the operating system. If the operating system limits you to 32-bit address schemes, then your limitations aren't going to grow unless you move to a system that uses 64-bit address schemes. And the next step beyond that is 128-bit schemes. The ultimate limits of what a PC can do are always directly dependent on the operating system.

Message 8 of 16



It is not a "general statement". It is an accurate statement. If a computer didn't have limited resources, then it would have unlimited resources. Do you think computers have unlimited resources? There is a finite amount of memory. There is a finite amount of hard drive space. There are limits on the numbers used to address those resources.

 

I don't know what your comment about "inefficient garbage collection" has to do with anything.  That is wild speculation on your part.  Whether the garbage collection you are speculating about is perfectly efficient, or non-existent because there is no garbage to collect, there are still limitations on what a computer can do.

 

It has everything to do with the operating system. If the operating system limits you to 32-bit address schemes, then your limitations aren't going to grow unless you move to a system that uses 64-bit address schemes. And the next step beyond that is 128-bit schemes. The ultimate limits of what a PC can do are always directly dependent on the operating system.


The only thing you didn't mention is that without a power supply, I can't turn on my PC. In your comment you listed a lot of resources, but the one exact answer is still missing: where should I look for the root cause of this behaviour?

  • Is it the responsibility of the OS (e.g. it cannot allocate memory)?
  • Is it a design decision by NI (e.g. performance issues)?

About garbage collection: yes, it is only wild speculation, because nobody has said anything convincing 🙂

 <off>I do not want this post to be a quarrel</off>


Answer for Altenbach:

Sorry, I don't understand your lingo, but what is "the model"?

By "model" I mean an abstraction of the real world. A simple example: I would like to store my family tree, which consists of person entries.

 

Are the elements static (only acting as a lookup table) or dynamic (inserting/deleting elements/branches, at any time, etc). What is the datatype of a "model element".

Dynamic. The datatype depends on what you are modeling.

 

What defines the relationships? How do you access the elements?

Relationships define connections between model elements, e.g. "A" and "B" are children of "C", "C" is the parent of "A" and "B", and "A" is a sibling of "B".

 

So, from my perspective, the model is the "what", and the "how" is the open question. Thanks for your hint; maybe it will be stored in a variant.

Message 9 of 16

Who knows what the exact reason is that you get an out-of-memory message. You haven't posted any code for anyone to experiment with to see if they get the same results, although I realize it might be difficult to replicate.

 

At some point, you need to realize that holding 2^20 references to a resource is a ridiculously large number and start thinking of a better architecture that doesn't generate that many references.

Message 10 of 16