I have a related observation/question: reading some of the "patterns are anti-patterns" rhetoric on LAVA (https://lavag.org/topic/11603-labview-anti-pattern-action-engines/?page=2#comment-125006) and the criticisms of design patterns on Wikipedia (https://en.wikipedia.org/wiki/Design_Patterns), there seems to be a sentiment out there that patterns point out deficiencies in programming languages - in the sense that if you find yourself repeating a sequence of steps to implement particular functionality, maybe the language should be able to do that sequence for you. I mention this in regard to the Actor Framework <ducks rocks, stones, GoF bobbleheads>.
It seems to me that NI recognized the recurring use of queued, event-driven state machines with producer/consumer loops and addressed the issue by designing a framework around them. However, mentioning OO in the presence of LV developers - let alone the AF - is often met with groans, pained expressions and projectiles. So the options seem to be:
(1) The traditional perspective that if you think you need some of that OO stuff, you just haven't learned enough LabVIEW tricks yet (a.k.a. "you can do anything OO-like without OO in LV");
(2) Use OO patterns as the GoF intended and be grateful that they imparted their wisdom upon the masses;
(3) Use the AF because it represents the lessons learned from repeated use of design patterns.
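For readers outside LabVIEW, the recurring pattern described above can be sketched in a text language. This is a minimal, illustrative sketch only - the `Message` class and the message names (`acquire`, `log`, `stop`) are invented for this example and are not part of any NI framework:

```python
# Sketch of the producer/consumer queued message handler (QMH) idea:
# one loop produces messages, another consumes them and dispatches
# on the message name, like the case structure in a LabVIEW QMH.
import queue
import threading
from dataclasses import dataclass


@dataclass
class Message:
    name: str             # which "case" of the state machine to run
    payload: object = None


def consumer(q: queue.Queue, results: list) -> None:
    """Consumer loop: block on the queue, dispatch by message name."""
    while True:
        msg = q.get()
        if msg.name == "stop":
            break
        elif msg.name == "acquire":
            results.append(("acquired", msg.payload))
        elif msg.name == "log":
            results.append(("logged", msg.payload))


def run_demo() -> list:
    q: queue.Queue = queue.Queue()
    results: list = []
    t = threading.Thread(target=consumer, args=(q, results))
    t.start()
    # Producer loop: enqueue work. In LabVIEW this would typically be
    # the event-handling loop responding to UI events.
    q.put(Message("acquire", 42))
    q.put(Message("log", "reading taken"))
    q.put(Message("stop"))
    t.join()
    return results


if __name__ == "__main__":
    print(run_demo())
```

The point of the framework argument is that once you have written this plumbing (queue, message type, dispatch loop) a few dozen times, it starts to look like something the language or a framework should provide for you - which is roughly what the AF does.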
Is there a repository of guidance somewhere that can walk one through the architecture process to determine if one requires a non-OO LabVIEW pattern, an OO pattern or the AF? Like, maybe a decision tree flowchart type thing?
One final thought: When it comes to architecture, LabVIEW is in a sense its own worst enemy - the language makes it so easy to just "start coding" and accomplish something useful that it almost seems antithetical to step back and think about architecture. But without an architecture, complexity can overwhelm utility. Anticipating when this is likely to happen seems to be the key to the whole thing.
Which experienced LV developers are you around who groan about using OOP? 99% of the code I write is OO, as is the case for most of the experienced developers I have worked with. The decision to use OOP or the AF is never universal or black and white - OOP may be a good choice for one developer and a terrible choice for another. If you're not experienced developing OO systems and have to start a new, time-critical project, learning OOP as you go probably isn't the best decision.
I find that using OOP makes my code cleaner and easier to understand and my development process faster.
D'Oh, just realized the discussion is 2 months old and has long passed the post I was responding to.
Sorry about the long quote, but I don't want anyone to take this out of context.
OO programming and programming with OO principles are different. LVOOP enforces OO programming but (in my experience) requires the programmer to have a knowledge of OO principles to write and maintain it.
I'm glad you asked THIS question on THIS thread! IMO OO principles can be imparted to novice LabVIEW developers by:
- sharing basic design patterns
- VI templates
- code review best practices
An award for "BOTW" (bug of the week), passed by general consensus, is a learning opportunity for the right group.
Now, AF and DQMH are frameworks that make the right decisions about how they can be used. But project templates and VI templates could help to add non-class-based extensions to one-at-a-time solutions. (Yup, done that.)
Boil it down: what exists now are "special" solutions. There is no "General Solution to Programming."
(That IS, however, how programmers keep "the rain off our heads and soup between sternum and spine," to paraphrase R.A.H.)