
Certification


Non-default values used on type-defined constants

Yep, definitely it.

 

Adding an item and then reordering the cluster positions to put the new item first reset the type def cluster constant to its default values.

I personally would call this a bug, but now that I know, I'll need to rethink my practices!

 

Thank you Bill and Crossrulz for investigating this and explaining it to me.

 

Helen

CLD, CTD
0 Kudos
Message 11 of 21
(5,395 Views)

Hi!
I know the last post is a little bit old... but I'd like to understand the problem better. And it may answer several interesting questions.

Does that mean I'd be better off doing this?:

typedefined

Both the cluster and the enum are type-defined. Both are statically set in the diagram (more or less something like a #define?).
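As a rough text-language analogy (Python here, not LabVIEW, and all names are invented for illustration), the two type definitions plus a constant of the cluster type might be sketched like this:

```python
from enum import Enum
from typing import NamedTuple

# Hypothetical stand-in for the type-defined enum.
class Cmd(Enum):
    NULL = 0
    START = 1
    STOP = 2

# Hypothetical stand-in for the type-defined cluster.
class Msg(NamedTuple):
    cmd: Cmd
    value: int

# The "diagram constant": a value of the typedef fixed in the source,
# loosely comparable to a #define in C.
DEFAULT_MSG = Msg(cmd=Cmd.START, value=0)

assert DEFAULT_MSG.cmd is Cmd.START
```

The analogy is loose: in LabVIEW the constant lives on the diagram and is updated by the IDE when the typedef changes, which is exactly the behavior under discussion.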

What is the point of calling a bundle action to set a constant enum value into a cluster constant?

What does the compiler do on this point? Does it really generate code for the action described above, or does it apply constant folding and statically define the value to enqueue in the FIFO?

 

If values are reset when the control order in the cluster is changed, I consider this behavior a bug. How could the control order influence the values contained in instances relying on the type definition?

If it can be considered a bug, how can this point be evaluated in a CLD exam?

CLA, CTA, LV Champion
This post is made under CC BY 4.0 DEED licensing
0 Kudos
Message 12 of 21
(5,146 Views)

You know, it probably is time I chimed in on this with a more thorough answer.

 

Let's get the wired terminals out of the way first, since the OP did mention them too. They are bad, but check your use case and be aware of what is really going on.

The famous "Clear as mud" thread explored the issue in details that I won't repeat. I nuggeted a summary over here.

 


"What is the point of calling a bundle action to set a constant enum value into a cluster constant?"

It's that darned "View as Icon" option for clusters. Frankly, I would change the VI Analyzer test to exclude any cluster constants with visible non-default labels. But that is asking for a "too specific" test, and there is no test available to require the cluster constant label to actually reflect the design requirements! When your project scope creeps (and it always does) and you need to change that typedef, you are going to have to touch everywhere you used the typedef and verify the right action is still taking place (that's where constant labels come in very handy). Bundling the non-default elements gives you that same visibility! The typedef update then becomes a job for your intern, not the lead architect: just select the value that matches the label, press Ctrl+G, rinse, repeat.

"What does the compiler do on this point? Does it really generate code for the action described above, or does it apply constant folding and statically define the value to enqueue in the FIFO?"

Probably, if full optimizations are applied, the object (compiled) code is identical and there is no performance impact between the two methods you showed. I am guessing. I have a good history of guessing, but that compiler is getting smarter and smarter. I would not expect a performance difference unless you were single-stepping, had highlight execution on, or had a breakpoint or probe in the folded code.
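For what it's worth, text-language compilers do this kind of folding routinely. A small Python sketch (this shows CPython's own constant folding, not the LabVIEW compiler, but it illustrates the idea the poster is asking about):

```python
def folded():
    # A pure-constant expression: CPython pre-computes it at compile
    # time and stores 7200 directly in the bytecode, so no
    # multiplication happens at run time.
    return 2 * 60 * 60

# The folded result appears in the function's constant pool.
assert 7200 in folded.__code__.co_consts
assert folded() == 7200
```

A bundle of constants feeding an enqueue is the same shape of problem: if the optimizer can prove every input is constant, it can fold the whole composition into one pre-built value.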

 

 

"If values are reset when the control order in the cluster is changed, I consider this behavior a bug. How could the control order influence the values contained in instances relying on the type definition?"

The instances can't really tell exactly what changed in the typedef. If you have labels on the cluster elements, the bundle/unbundle nodes TRY to find an element with the same data type and label; otherwise, the name turns black (broken). All the callers really know, though, is "something changed." The safe thing to do is to load default values. It is not a bug that the IDE cannot read your specification change request; that is all on the dev team. You did, of course, define your default values as a "safe" condition, right? (Hint: every enum I define as "XXX Cmd.ctl" contains "Null=0"!)
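That by-label matching is the same reason keyword construction survives a field reorder in text languages while positional construction silently changes meaning. A Python sketch of the analogy (two dataclasses stand in for two revisions of the typedef cluster; all names are invented):

```python
from dataclasses import dataclass

@dataclass
class SettingsV1:        # original element order
    mode: int
    retries: int

@dataclass
class SettingsV2:        # same elements, reordered
    retries: int
    mode: int

# "Bundle by position": after the reorder, the same positional
# values silently land in the wrong fields.
pos = (2, 5)
assert SettingsV1(*pos).mode != SettingsV2(*pos).mode

# "Bundle by name": matching on labels survives the reorder.
kw = dict(mode=2, retries=5)
assert SettingsV1(**kw).mode == SettingsV2(**kw).mode == 2
```

LabVIEW's choice to fall back to default values when it cannot match by label is the conservative version of the same trade-off: fail safe rather than silently misassign.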

 

"If it can be considered a bug, how can this point be evaluated in a CLD exam?"

It cannot be a bug. Undefinable behavior is a bug; this works as intended. NOW, IF you could find a means to tell the type def instances "Oh, it's just a minor change, try not to break," I would create ghost accounts to kudos that on the Idea Exchange and harass every developer I ever worked with to do the same.

As for the CLD exam: there is a VIA test for that! Guess what? The evaluators will run VIA on your project. (I think they are REQUIRED to.)

 


"Should be" isn't "Is" -Jay
0 Kudos
Message 13 of 21
(5,127 Views)

FYI - in LV 2014, it would seem that reordering the controls in a typedef cluster no longer resets the typedef constants to default values! I wonder if you will still get points off if you use typedef constants directly?

 

When did this happen?

Bill
CLD
(Mid-Level minion.)
My support system ensures that I don't look totally incompetent.
Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.
Humble author of the CLAD Nugget.
0 Kudos
Message 14 of 21
(5,074 Views)

There was also something about a change only being applied as long as the VIs that were using the constant were in memory. If you apply a change to the type def while the VIs using it are not in memory, and you load them after the change, the enum constants might still change.

0 Kudos
Message 15 of 21
(5,041 Views)

Yes, in 2014, LabVIEW will try to give you the opportunity to review each typedef instance if "applying changes" will cause you to lose default data. However, if you want to keep that default data, you must unlink from the type def.

Here is some more documentation on this:
http://zone.ni.com/reference/en-XX/help/371361L-01/lverror/edtypedefrequiresmanualupdate/

0 Kudos
Message 16 of 21
(5,017 Views)

Hi Daniel,

 

Thanks for the link. It remains, however, that changing the default state of a type def is still an unsafe development practice. Am I right to think that we shouldn't expect the exam grading criteria to change, and that changing the default states of type defs will continue to get penalised?

 

cheers

 

Mathis

0 Kudos
Message 17 of 21
(5,015 Views)

You are correct -- it is still an unsafe development practice and the exam grading criteria will not be changed. 

0 Kudos
Message 18 of 21
(5,012 Views)

I think I can offer a historical insight into why the default of a type def'd cluster constant should not be changed.

 

In about LV 5.1, I had multiple diagrams open on a very slow machine when I changed a type-defined cluster. The machine was slow enough for me to watch the update process in action, and I witnessed the diagram constants being removed one by one and then replaced with the new version of the definition.

 

I warned of this issue in my Nugget on Type Definitions that can be found here.

 

I hope that helps,

 

Ben

 

Spoiler
(Yes I know this is an old thread but so am I)
Retired Senior Automation Systems Architect with Data Science Automation LabVIEW Champion Knight of NI and Prepper LinkedIn Profile YouTube Channel
0 Kudos
Message 19 of 21
(4,956 Views)

I just looked at the CLA exam solution for the ATM and found

 

Non_Default.png

 

Bad bad bad.

 

Ben

0 Kudos
Message 20 of 21
(4,935 Views)