
Typedefs and Class Design


jgcode wrote:

The Cluster is Privately Scoped and you don't normally need to bother with a typedef in this case.


Are you advocating a non-classed non-typedef'd clustered data member inside the Private Class Data? As long as the data remains completely private access, there are no maintenance problems with the callers, yet there's still a maintenance problem within the class itself. Am I missing something?

0 Kudos
Message 21 of 77
(2,510 Views)

@jgcode wrote:

I also don't agree with exposing Clusters, unless under very specific constraints - i.e. easily bundling up and wiring all UI references into a Class that acts as the UI Console.


Another use case I implied but didn't state is using clusters to transfer data between two classes within a library.  (Just make absolutely sure the cluster isn't exposed outside of the library and there are no naked VIs in the library that use the cluster.)  I generally favor using classes in these situations anyway, but I don't have any strong objections to clusters here.  Although the auto-loading behavior of class methods does frequently frustrate me,** it is useful in this situation.

 

(**Doh!  I just realized why classes have to auto load all their members.  The class ctl essentially is a typedeffed cluster and if all the class members weren't loaded into memory we'd have exactly the same problem with jiggling the class ctl that we do with jiggling the typedeffed enum.  As always, there's a good reason for decisions that on the surface appear questionable.  And true to form, it takes me a little longer to connect the dots and see the bigger picture.)

Message 22 of 77
(2,499 Views)

I cannot advocate not using typedefs. This is just crazy talk. I've had plenty of cases where I've been burned without typedefs. Even with classes. Why take the chance?

Also, I can't trust auto class mutation. I've been burned once (and once is enough on a customer project) when my customer couldn't load old data! Class mutation didn't work for me. It's too much black magic that I don't have any control over. I would be OK with it if it worked well, but I don't have confidence in it, sorry. Also, it would be nice if I could call my own mutation routine. That would be a good alternative. Not sure if this is possible already.

It's easy to provide your own converter. You just have to make sure to provide a version number that is stored alongside your data, not as part of the data itself. I do this all the time, and I feel confident knowing what is converted to what and when.



Michael Aivaliotis
VI Shots LLC
Message 23 of 77
(2,480 Views)

 


@JackDunaway wrote:

jgcode wrote:

The Cluster is Privately Scoped and you don't normally need to bother with a typedef in this case.


Are you advocating a non-classed non-typedef'd clustered data member inside the Private Class Data? As long as the data remains completely private access, there are no maintenance problems with the callers, yet there's still a maintenance problem within the class itself. Am I missing something?


 

 

I used to create a typedef'd Cluster with Private Scope for the File IO use case mentioned previously, but if I never use this Cluster as an input/output for a class method VI (and I don't want to - it's private data, so I can always get access to it), then it doesn't need to be typedef'd. It just ends up being an extra file on disk, so I stopped doing it.

 

The point of the Cluster is to provide a container that holds persistent data, to make it easier to use e.g. the OpenG Variant/INI VIs (as per the MGI example). This container holds the Class' persistent data that is to be serialized. It is only a container, not a real data structure; therefore, I can't see how it's bad practice in this case. However, if I ever need to pass it around as a data structure then yes, I would typedef it - not to do so would be stupid.

 

Also, when writing your own mutation routine, at release time this Cluster gets snapshotted and saved as a Control (no links) file, so you always have the blueprint for that version.

 

 

Certified LabVIEW Architect * LabVIEW Champion
Message 24 of 77
(2,470 Views)

@Michael Aivaliotis wrote:

I cannot advocate not using typedefs. This is just crazy talk. I've had plenty of cases where I've been burned without typedefs. Even with classes. Why take the chance?

Also, I can't trust auto class mutation. I've been burned once (and once is enough on a customer project) when my customer couldn't load old data! Class mutation didn't work for me. It's too much black magic that I don't have any control over. I would be OK with it if it worked well, but I don't have confidence in it, sorry. Also, it would be nice if I could call my own mutation routine. That would be a good alternative. Not sure if this is possible already.

It's easy to provide your own converter. You just have to make sure to provide a version number that is stored alongside your data, not as part of the data itself. I do this all the time, and I feel confident knowing what is converted to what and when.


Two good points that I would like to reply to.

 

1) I had filed away an issue with using typedefs inside class data: they become a problem when trying to restore earlier versions of the class that used an older version of the typedef.

Is that the condition that burned you?

 

2) Mega-dittos to the "too much black magic" comment but in my case that applies to things like NSV's etc.

 

But stepping back to throw out some ideas...

 

I find myself with a foot on both shores. We (Data Science Automation) have had the unique opportunity of growing up with LV and have had to adapt our coding styles and practices to the changes and to harness the power exposed by LV.

 

E.G.

About 11 years ago I was troubleshooting a race condition with my boss (the 5th person in the world to certify as a CLA) and it hit us that, now that LV is multithreaded, globals may be a bad idea (mega-duh).

 

So we have been developing and delivering solid LV apps based on non-LVOOP solutions for years. Everyone in the shop understands AEs, typedefs, and modular design, and they can deliver quality apps without my help and without LVOOP.

 

With the conversion of some of the add-ons to LVOOP (the Report Generation Toolkit, for example), my peers have run into issues with getting EXEs to run correctly, due to the build specs required to get it included in the apps.

 

Since the non-LVOOP versions gave them no trouble and the new stuff did, LVOOP (via its introduction in the toolkits) has left a sour taste in the mouths of some of my peers.

 

I still drove forward and introduced LVOOP into the projects I was working on, though (since it is my job to explore the outer territories of LV), and all was fine until my schedule got busy. My most simple project was transferred to a peer while I was out of the country. While I was gone, my associate ran into difficulty trying to get a build working using LVOOP. Budget said "no time to play," so my LVOOP code was replaced with AEs and typedefs and the app was delivered on time and on budget. Unfortunately, my LVOOP code ended up in the virtual trash can.

 

I did get a chance to do a post-mortem and found that the issue was one of the dynamically loaded classes not being included in the build - the code and LVOOP were just fine.

 

So I have a double uphill battle: the learning curve of OOD, and climbing it without interrupting a powerful development team with good practices and a proven track record.

 

For my part, I have ridden the technology wave since 1976 by going with my gut instincts about the tidal currents. From what little I have learned about OOD, there is just too much logic and well-thought-out theory immersed in that sea of knowledge to ignore, let alone try to swim against its currents.

 

Done waxing (my surf board) philosophical,

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation LabVIEW Champion Knight of NI and Prepper LinkedIn Profile YouTube Channel
Message 25 of 77
(2,448 Views)

Ben, I'm not afraid of LVOOP. I use it every day and I love it. So no problems there.

 

I already knew (before this thread) that automatic up-conversion of class data breaks when using typedefs. However, I found that out after I tried to use the feature for the first time and got burned.

 

But now NI is asking me to drop usage of typedefs altogether in class data if I want to use the automatic conversion feature of classes. That's fine but I find it very tedious to program with this restriction.



Michael Aivaliotis
VI Shots LLC
Message 26 of 77
(2,442 Views)

> But now NI is asking me to drop usage of typedefs altogether in class data if I want

> to use the automatic conversion feature of classes. That's fine but I find it

> very tedious to program with this restriction.

 

No. That's way overstating it. The only typedefs I (and not necessarily NI) am suggesting you give up are typedefs of clusters. If you need the data as a cluster outside the class, make the cluster itself be a class. If you only need it inside the class, don't make it a typedef.
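LabVIEW clusters and classes don't map one-to-one onto text languages, but the "make the cluster itself be a class" advice has a rough analogy in any OO language: instead of passing a raw record across an API boundary, wrap it in a class so the layout can evolve without breaking callers. A hypothetical Python sketch (the `MotorLimits` name and fields are invented for illustration):

```python
class MotorLimits:
    """Plays the role of the typedef'd cluster, but behind controlled access."""

    def __init__(self, low, high):
        # Private fields: callers never see the raw layout,
        # so it can change without breaking them.
        self._low = low
        self._high = high

    @property
    def low(self):
        return self._low

    @property
    def high(self):
        return self._high

    def contains(self, value):
        """Behavior lives with the data instead of being re-derived by callers."""
        return self._low <= value <= self._high
```

The design point is the same one being made for LabVIEW: a class gives you an edit-safe boundary that a bare typedef'd cluster does not, and if the data never leaves the class, even the typedef is unnecessary.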

0 Kudos
Message 27 of 77
(2,433 Views)

@Michael Aivaliotis wrote:

Ben, I'm not afraid of LVOOP. I use it every day and I love it. So no problems there.

 

I already knew (before this thread) that automatic up-conversion of class data breaks when using typedefs. However, I found that out after I tried to use the feature for the first time and got burned.

 

But now NI is asking me to drop usage of typedefs altogether in class data if I want to use the automatic conversion feature of classes. That's fine but I find it very tedious to program with this restriction.


Please forgive me if I implied that you were afraid (far from it!).

 

The important point I wanted to make sure I had clear was the part about typedefs inside the class.

 

Thank you!

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation LabVIEW Champion Knight of NI and Prepper LinkedIn Profile YouTube Channel
0 Kudos
Message 28 of 77
(2,430 Views)

@Ben wrote:

For my part, I have ridden the technology wave since 1976 by going with my gut instincts about the tidal currents. From what little I have learned about OOD, there is just too much logic and well-thought-out theory immersed in that sea of knowledge to ignore, let alone try to swim against its currents.


I couldn't have said it better, Ben.  Though OOP is still a very small subset of LabVIEW, I firmly believe it is the future, and over the next several years understanding LVOOP and OO design will be a requirement for anyone claiming to be a LV developer.  Trusting that the world's leading language architects knew what they were doing when they chose OOP kept me going through the many bumps and bruises of learning it.

 

 


@michael wrote:
But now NI is asking me to drop usage of typedefs altogether in class data if I want to use the automatic conversion feature of classes.

 

I think most people agree the auto-mutation feature isn't where it needs to be for advanced programmers.  It's too opaque, it's not overridable, and there's no user feedback when you've made a change that breaks the ability to load a prior object saved to disk.  Given the limitations, I think it's a mistake to rely on that feature if the saved object is critical to the application.  As Jarrod said, it's much better to roll your own Serialize method.
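One common shape for a hand-rolled mutation scheme (again sketched in Python, since LabVIEW block diagrams don't paste into a forum post) is a chain of single-step converters: each function upgrades the data by exactly one version, so an old file walks the chain until it reaches the current layout. The version numbers and field changes below are hypothetical:

```python
CURRENT_VERSION = 3

def _v1_to_v2(d):
    d["units"] = "mm"                  # hypothetical: v2 added a units field
    return d

def _v2_to_v3(d):
    d["offset"] = d.pop("zero", 0.0)   # hypothetical: v3 renamed 'zero' to 'offset'
    return d

# Each entry upgrades from its key version to the next one.
_MUTATIONS = {1: _v1_to_v2, 2: _v2_to_v3}

def mutate(data, version):
    """Upgrade data from the stored version to CURRENT_VERSION, one step at a time."""
    while version < CURRENT_VERSION:
        data = _MUTATIONS[version](data)
        version += 1
    return data
```

Unlike the built-in auto-mutation, every step here is explicit, testable, and under your control - and you get an obvious place to raise an error (a missing entry in the chain) instead of a silent failure to load.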

Message 29 of 77
(2,422 Views)

@jarrod S. wrote:

In my mind, if you require mutation, you should set up your own serialization and mutation scheme manually. We do this commonly in our product line, and once our convention was established for the process, it was fairly easy to keep it up. I think it's worth the initial effort.


Can you share what conventions you've established?

0 Kudos
Message 30 of 77
(2,417 Views)