Indicator reverts to default after set control value

  I've noticed that if I try to programmatically set the value of an indicator (which is in the connector pane) using the Set Control Value method, it reverts to its default value once the VI has finished executing.  If it's not in the connector pane I don't have this problem.  If I use Set Control Value and then read from a local variable of the indicator to set the indicator, I also have no problem.  Why is this?  I've attached two block diagram images below showing what I did.
I'm interested in this because I'm trying to write a subVI that retrieves and stores a bunch of global values from a config file.  The easiest way to do this seemed to me to use the OpenG VI that reads and writes the front panel controls to a config file.
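In a text language, the intended save/restore of globals would look roughly like this (a minimal Python sketch, not LabVIEW; the value names and config layout are invented for illustration):

```python
import configparser

# Hypothetical global values to persist between runs (names invented).
sample_rate = 1000.0
channel_name = "Dev1/ai0"

def load_globals(path):
    """Read stored values from the config file back into the globals."""
    global sample_rate, channel_name
    cfg = configparser.ConfigParser()
    cfg.read(path)
    sample_rate = cfg.getfloat("globals", "sample_rate", fallback=sample_rate)
    channel_name = cfg.get("globals", "channel_name", fallback=channel_name)

def save_globals(path):
    """Write the current global values out to the config file."""
    cfg = configparser.ConfigParser()
    cfg["globals"] = {"sample_rate": str(sample_rate),
                      "channel_name": str(channel_name)}
    with open(path, "w") as f:
        cfg.write(f)
```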
Message 1 of 11
Your attachments did not make it.
Message 2 of 11
See a couple threads down.  Accidentally posted before I was finished typing the message.
Message 3 of 11
Message 4 of 11
This might help you.

___________________
Try to take over the world!
Message 5 of 11

Thanks, tst.  I can understand why an indicator returns its default value if you do nothing to it.  It seems a little perverse that changing the value programmatically doesn't have the same effect as wiring to the indicator (particularly when I can read the new value from a local!).  I was also fooling around a little with the save front panel routine and found that if I used a property node instead of the Set Control Value invoke method, the new value remains.

Just so others can follow the thread, here's my original post with the images:

I've noticed that if I try to programmatically set the value of an indicator (which is in the connector pane) using the Set Control Value method, it reverts to its default value once the VI has finished executing.  If it's not in the connector pane I don't have this problem.

Yet if I use Set Control Value and then read from a local variable of the indicator to set the indicator, I also have no problem.

Why is this?  I'm interested in this because I'm trying to write a subVI that retrieves and stores a bunch of global values from a config file.  The easiest way to do this seemed to me to use the OpenG VI that reads and writes the front panel controls to a config file (since some of the controls I wish to save are complicated).  It didn't seem to work unless I added the second step of using the local variable to update the indicator.
[Block diagram images not available]
Message Edited by Phamton on 11-23-2007 02:14 PM
Message 6 of 11

Hi Phamton,

It sounds like an Action Engine may help you. The Save Restore Nugget may also be of use.
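For readers who haven't met the pattern: an Action Engine is a non-reentrant VI that keeps state in an uninitialized shift register and dispatches on an action input. A loose Python analogy (a sketch only; the names are invented and it is not from the nugget) might be:

```python
import configparser

_state = {}  # stands in for the uninitialized shift register

def config_engine(action, path=None, key=None, value=None):
    """One entry point that dispatches on an action, Action Engine style."""
    if action == "set":
        _state[key] = value
    elif action == "get":
        return _state.get(key)
    elif action == "save":
        cfg = configparser.ConfigParser()
        cfg["state"] = {k: str(v) for k, v in _state.items()}
        with open(path, "w") as f:
            cfg.write(f)
    elif action == "load":
        cfg = configparser.ConfigParser()
        cfg.read(path)
        if cfg.has_section("state"):
            # note: values come back as strings from the INI file
            _state.update(cfg["state"])
```

Because every call goes through the one non-reentrant entry point, reads and writes are serialized, which is the same property that makes the LabVIEW version safe.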

Ben

Retired Senior Automation Systems Architect with Data Science Automation | LabVIEW Champion | Knight of NI and Prepper | LinkedIn Profile | YouTube Channel
Message 7 of 11
I think what you originally saw was a race condition - LabVIEW saw that your indicator had nothing wired into it and so immediately took the default value. Your first workaround solves this by wiring something into the indicator.

Your second workaround seems to suggest that LabVIEW already handles some of the cases, although I'm not sure why going over all the controls changes this. Maybe the code that decides which values to take only runs when the VI finishes in this case. I'm not sure whether the behavior you originally saw is a bug, but I would vote that it's probably unwanted behavior. Personally, I would say fixing it isn't worth it if the fix hurts performance.

As for loading and saving values, Ben pointed you to his large nugget, which also references the OpenG solution. In both cases, it's better to save a typedef cluster than a panel with controls. If you do want to save and load controls directly, however, you can do something like this:
[Block diagram image not available]
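In text form, the direct save-controls-by-name approach might look roughly like this Python sketch (the names are invented; the `controls` mapping stands in for the front panel's controls):

```python
import ast
import configparser

def save_controls(controls, path):
    """Write each control's value to an INI file, keyed by its label."""
    cfg = configparser.ConfigParser()
    cfg["panel"] = {name: repr(value) for name, value in controls.items()}
    with open(path, "w") as f:
        cfg.write(f)

def restore_controls(controls, path):
    """Read saved values back into the controls, like Set Control Value."""
    cfg = configparser.ConfigParser()
    cfg.read(path)
    for name in controls:
        if cfg.has_option("panel", name):
            controls[name] = ast.literal_eval(cfg.get("panel", name))
```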
P.S. If you don't wire a reference into a VI property or invoke node, it defaults to the current VI. Also, LV 8.x has an explicit This VI reference constant.

___________________
Try to take over the world!
Message 8 of 11

I think this behavior has been around a while -- it sure sounds similar to some quirkiness I had to work around a while ago.  In my case, I was using "Set Control Value" to configure the inputs of a VI before launching it using VI Server.  I found that the launched VI would sometimes use the default value instead of the value I set.  In some cases, I had confirmed the "Set" with a subsequent "Get", but the running code *still* used the control's default value instead.   (And similar issues using "Get Control Value" to query indicators of the spawned VI.)

Like you, I also found that running a wire out of a control (or a local variable copy of an indicator) would be enough to ensure that the value I set using "Set Control Value" would be used instead of a default.

So, no real answer here.  Just confirming the seeming oddness of your observation.  In fact, I find yours even more curious than mine.  I was manipulating control values in a different VI prior to launch -- you're manipulating control values within the same VI.

-Kevin P.

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 9 of 11
Thanks for all the replies.

Ben, I actually am trying to use this save/restore functionality in an action engine.  Your save/restore nugget looks amazing!  Is it available in 7.1?

tst, I don't think it was a race condition, per se.  LabVIEW set the indicator to the default value only when the VI exited, so if I ran this as a subVI I'd always obtain the default value.  For example, if you put a pause in my original example, you could see the indicator display the set value up to the point where the VI finished; then it reverted to the default value.

If I understand your comment about using a typedef cluster correctly, that seems like a better solution for me than using the OpenG Read/Write Panel to INI VIs.  I was using them because I liked the idea of not having to change my code every time I added a new control.  Of course, since these OpenG VIs used the invoke method, this issue with reverting to default values negated that advantage anyway.  You are suggesting that I instead cluster together the controls I want to save/restore, typedef the cluster, then save/restore the cluster control as in your example, right?  Since your example uses the control property nodes, it'll avoid the default value issue as well.
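In text-language terms, that cluster approach amounts to a single struct holding all the values (a hedged Python sketch; the field names are invented):

```python
import configparser
from dataclasses import dataclass, fields, asdict

@dataclass
class Settings:  # plays the role of the typedef cluster
    sample_rate: float = 1000.0
    channel: str = "Dev1/ai0"
    averages: int = 10

def save_settings(s, path):
    """Flatten the whole cluster to one section of the config file."""
    cfg = configparser.ConfigParser()
    cfg["settings"] = {k: str(v) for k, v in asdict(s).items()}
    with open(path, "w") as f:
        cfg.write(f)

def load_settings(path):
    """Rebuild the cluster, falling back to defaults for missing keys."""
    cfg = configparser.ConfigParser()
    cfg.read(path)
    s = Settings()
    for f in fields(Settings):
        if cfg.has_option("settings", f.name):
            setattr(s, f.name, f.type(cfg.get("settings", f.name)))
    return s
```

Adding a new control then just means adding a field to the struct; the save/load code itself never changes, which keeps the convenience of not touching the code for each new control.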

Kevin, it does sound pretty similar to your problem.  I tried checking with Get Control Value after Set Control Value and found it did return the new value even when the indicator returned the default value.  Interestingly, it seems just making a local variable of the indicator (i.e., you don't necessarily have to wire it to the indicator) is enough to keep LabVIEW from reverting it to its default value.
Message 10 of 11