
LabVIEW Development Best Practices Discussions


Are patterns good, always?

Your quote reminds me of something I once heard: that in order to properly debug existing code, you have to be twice as smart as the original developer; so, you should code half as smart as you are...

That being said, I think this comment is overly broad.  If your application can be confined to a single synchronous process that fits on one block diagram, fine.  But if you are building a large application that consists of many discrete subsystems, then a good understanding of patterns can bring clarity to the code you develop.

Personally, I believe that in large application development, it behooves you to spend some time thinking about how things might be changed or extended without breaking the front facing API.  And in order to do this, you probably ought to have some notion of patterns.

A good example of this might be an application that runs some type of sequence, maybe for periodic calibrations.  By taking advantage of the composite pattern, you can develop fairly generic sequence-running code that can be easily repurposed (reused) and extended to fit a broad variety of problems.  And to be honest, the code is often more readable, as the important steps are encapsulated in a single execution method.
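Since a LabVIEW diagram can't be shown in plain text, here is a rough Python analogue of the idea (the step classes and names are made up for illustration): a sequence is itself a step, so calibrations can nest arbitrarily, and the caller only ever invokes a single execute method.

```python
from abc import ABC, abstractmethod

class Step(ABC):
    """One unit of work in a calibration sequence."""
    @abstractmethod
    def execute(self, log: list) -> None: ...

class ZeroOffset(Step):
    def execute(self, log: list) -> None:
        log.append("zero offset")  # stand-in for real hardware work

class GainCheck(Step):
    def execute(self, log: list) -> None:
        log.append("gain check")

class Sequence(Step):
    """Composite: a sequence is itself a Step, so sequences can nest."""
    def __init__(self, name: str, steps: list):
        self.name = name
        self.steps = steps

    def execute(self, log: list) -> None:
        log.append(self.name)
        for step in self.steps:
            step.execute(log)

# The caller runs any mix of steps and nested sequences the same way:
daily = Sequence("daily cal", [ZeroOffset(), GainCheck()])
full = Sequence("full cal", [daily, ZeroOffset()])
log: list = []
full.execute(log)
# log -> ["full cal", "daily cal", "zero offset", "gain check", "zero offset"]
```

The point is that the code driving the sequence never needs to know whether it is running a single step or a whole nested calibration.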

LabVIEW is indeed a high-level language.  But much like other high-level languages (Python?), applications developed in it can still benefit from the proper use of patterns.  Simplicity is good.  But simplicity at the expense of scalability, reusability, or extensibility is not.  At some point, all of us engineers of the non-software variety need to step up and learn some basic principles of software development.  And learning when and where to use patterns seems to me to be one of those principles: not only when to use them, but also when not to...

Message 11 of 14

If you've learnt how to design software, then learning patterns is not a bad thing (in fact checking out all sorts of things from the outside world is good). I prefer it in that order tho'. You could look at my code and probably recognise lots of patterns, but I would have come to them by applying good design techniques rather than reading a book of patterns.

Way back when, I tried applying structured techniques and Yourdon tools to LabVIEW; in the end the block diagram was more expressive than the modelling tools, so I dropped them. My guess is that after a burst of enthusiastic pattern-making, you will take out or modify elements until you end up with fairly standard LabVIEW, and that is a good thing!

You assert that applications can benefit from the proper use of patterns. How so? Can you quantify this? If you show me an application written in LabVIEW using patterns, will it be better than just a well-designed LabVIEW program?

My trade is writing LabVIEW and handing it over to my customers to modify and maintain; these customers have little LabVIEW experience. They will not appreciate me discussing the software in terms of the architecture; they are only interested in how it addresses the problem domain.

Don't mistake my rejection of the GoF book as rejection of good design, I'm very big on good design. I'm just not sure applying these patterns is that beneficial to LabVIEW.

mtat76 wrote:

  At some point, all of us engineers of the non-software variety need to step up and learn some basic principles of software development.  And learning when and where to use patterns seems to me to be one of those principles, if not only to learn when, but also when not...

Completely agree! I would also add that we should be able to generate our own patterns, ones more applicable to a graphical/dataflow language.

Steve


Opportunity to learn from experienced developers / entrepreneurs (Fab, Joerg and Brian amongst them):
DSH Pragmatic Software Development Workshop


Random Ramblings Index
My Profile

Message 12 of 14

My trade is writing LabVIEW and handing it over to my customers to modify and maintain; these customers have little LabVIEW experience. They will not appreciate me discussing the software in terms of the architecture; they are only interested in how it addresses the problem domain.

We must be in the same trade! But do you find it completely necessary to code such that anyone might be able to understand it?  That sounds to me like a recipe for disaster.  I have been writing code for a while (probably not as long as you, Steve), and had a passing association with LV.  I was "maintaining" some LV code that another engineer had developed to run an instrument, and I hated it!  Despite my experience, and the fact that I understood the algorithms better than the developer did, I felt that the block diagram completely obfuscated the meaning of the code and that this could have been corrected with a good dose of Java or C.  Ten years later, with a good deal of LV development under my belt, the code seems quite trivial (and simple).  So was it reasonable for that developer to code such that it could be easily understood by the customer (me), who had little experience with LabVIEW?

I am often hired to solve complex problems.  And sometimes I am hired to solve problems that seem quite trivial but turn into something more difficult.  My customers have some knowledge of LabVIEW and hope, to some extent, to be able to solve problems that arise.  But ultimately they hire me because they know that I can solve unexpected problems much more swiftly and in a more robust manner.

And although my customers don't necessarily want to spend a lot of time discussing it, they do value good architecture, especially when I can point them quickly toward where to find a problem.  And if the customer is not interested in architecture, how can you expect them to maintain the application you developed?  And how can you expect to help them if you have to jump into a problem they have tried to solve without regard to architecture?  Architecture may cause people's eyes to glaze over, but those who disregard it do so at their own peril.  Would we be so cavalier about this in any other field?  Could you imagine handing over a circuit board that you developed to a customer and saying "just go do it!" without first having handed over the design documents?

You assert that applications can benefit from the proper use of patterns. How so? Can you quantify this? If you show me an application written in LabVIEW using patterns, will it be better than just a well-designed LabVIEW program?

When I first started coding in LabVIEW, I read bits and pieces of your book and greatly appreciated it.  In software engineering, there are many ways to solve the same problem.  Regarding patterns, I have given an example above.  I would also put forth the command pattern.  We can develop something similar using variants and strings, but for me, having the execution logic tied up in a single main VI on an object instance is easier to understand and easier to point to if modification is needed.  Similarly, we can look at the strategy pattern for swapping algorithms.  I have found myself applying these patterns over and over again, to my and my customers' benefit.
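For readers less familiar with the command pattern, here is a minimal Python sketch of the idea (the class and message names are hypothetical): each message is its own object carrying its execution logic in one method, so the main loop dispatches without a case structure per message type.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

class Command(ABC):
    """Each message the main loop handles is its own class; the execution
    logic lives in one method instead of a growing case structure."""
    @abstractmethod
    def execute(self, state: dict) -> None: ...

@dataclass
class SetRange(Command):
    volts: float
    def execute(self, state: dict) -> None:
        state["range"] = self.volts

@dataclass
class Acquire(Command):
    samples: int
    def execute(self, state: dict) -> None:
        state["data"] = [0.0] * self.samples  # stand-in for real acquisition

def main_loop(commands) -> dict:
    """The 'main VI' of the sketch: it dispatches without knowing
    anything about the individual command types."""
    state: dict = {}
    for cmd in commands:
        cmd.execute(state)
    return state

result = main_loop([SetRange(10.0), Acquire(4)])
# result -> {"range": 10.0, "data": [0.0, 0.0, 0.0, 0.0]}
```

Adding a new message means adding a new class, with the loop itself left untouched; that is the modification point the text above is gesturing at.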

Can I quantify this?  Hmmmmm... I don't know.  From a reusability standpoint, I find myself repeatedly dropping code encapsulated by these patterns into a variety of different settings.  I also feel that this encapsulation sometimes allows the customer to play with the aesthetics rather than worry about the architecture.  In these cases, it lets them be less concerned about whether they are about to destroy something structural when they attempt a change; ready extension of the design allows them to build their own sandbox, which can be swapped in and out as needed.  But I have never attempted to quantify this, mostly because this is just how I code, and going another route would be highly inefficient for me.

You may disagree and come up with different solutions to the same problems that utilize some of the basic "patterns" defined in the LabVIEW templates (state machine, producer/consumer, etc.), and that's fine.  It's not clear to me that these are necessarily any more understandable than common OOD techniques, or whether they are just another paradigm.  In the end, we all have to keep learning if we hope to keep up.  Even our customers.  All software problems are simple.  Until they get hard...
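As a point of comparison, the producer/consumer template mentioned above boils down to two loops joined by a queue. A rough text-language sketch of it, in Python (the message names are invented for illustration):

```python
import queue
import threading

def producer(q):
    # Acquisition/UI loop: enqueue messages as events occur.
    for msg in ("init", "measure", "measure", "shutdown"):
        q.put(msg)
    q.put(None)  # sentinel: tell the consumer loop to stop

def consumer(q, handled):
    # Processing loop: dequeue and handle one message at a time.
    while True:
        msg = q.get()
        if msg is None:
            break
        handled.append(msg)

q = queue.Queue()
handled = []
t = threading.Thread(target=consumer, args=(q, handled))
t.start()
producer(q)
t.join()
# handled -> ["init", "measure", "measure", "shutdown"]
```

Whether this reads as more or less understandable than an OOD equivalent is exactly the judgment call being debated in this thread.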

Message 13 of 14

Really loved reading this, it's actually a discussion that we need to have.

mtat76 wrote:


We must be in the same trade! But do you find it completely necessary to code such that anyone might be able to understand it?  That sounds to me like a recipe for disaster.

This comes back to writing code to be read (this could be an argument for patterns, too). One of the gifts LabVIEW gives us is a block diagram that can express how we have solved the problem. It can also express how we've architected the solution. The inference seems to be that because I'm not employing GoF patterns, I've not considered architecture or design. I just feel that they are an abstraction too far for most jobs (certainly all I have worked on). Writing code your customer can read makes your code easier to maintain, extend, scale, etc.

mtat76 wrote:

And although sometimes my customers don't necessarily want to spend a lot of time discussing it, they do value good architecture.  Especially when I can point them quickly in the direction of where to find the problem.  And then, if the customer is not interested in architecture, how can you expect them to maintain the application you developed? 

In the ideal world the architecture, like the operating system, should be invisible.

mtat76 wrote:

So, in these cases, it allows them to be less concerned about whether they are about to destroy something structural if they attempt to make a change - ready extension of the design allows them to build their own sandbox which can be swapped in and out as needed.  But, I have never attempted to quantify this mostly because this is just how I code and going another route would be highly inefficient for me.

I'm keen to push people to quantify their claims, because there is a little bit of religion creeping into discussions of methods, processes, and design. I think your qualification above is perfectly sound: you use what is efficient for you; we all do. My company has three engineers all writing code the same way. It might not be the most advanced way in the world, but the fact that we all write this way is very powerful. We have 200+ projects in many different fields and do very little maintenance; it works for us. It may not work for someone working alone in a company in just one problem domain. It's not perfect, but it's pretty good.

mtat76 wrote:

All software problems are simple.  Until they get hard...

Tis True!

Steve



Message 14 of 14