LabVIEW Development Best Practices Discussions


Are patterns good, always?


I thought that I would throw this out there, as there are probably not a lot of users here who are members of the LinkedIn group "Software Design Patterns and Architecture".  There was a recent discussion there that I thought people might be interested in, given how many patterns from the GoF book have been converted into G.  The discussion was entitled "Are patterns good, always?" and it was put out by one of the owners of the site Refactoring for Software Design Smells.  The discussion is duplicated on their site at

One of the interesting comments in the LinkedIn discussion that resonated with me was put out there by Allen Holub:

"To me, you get flexibility, not by trying to second guess every eventuality up front (which is the goal of many of the patterns), but instead by writing the code in such a way that it's easy to change it. Implementing a complex pattern that turns out to be unnecessary (i.e. it gives me flexibility that I didn't need) is waste.

So, I start simple, and add patterns to a bit of code when I find myself refactoring it to handle a new requirement. I figure that if I've modified it once, I'll probably modify it again, so the added complexity of the pattern will be justified by making it easier to accommodate future changes. This is a rule of thumb, so there are, of course exceptions---places where the pattern approach is obviously the best first-pass approach. Those are the exceptions, however.

So, no, Patterns are not good in all (or even most) situations, but they're invaluable where they're needed."

My feeling is that, as newbs, some of us may grab a book such as the one put out by the GoF and go wild with it, trying to see patterns everywhere and bending our code to the patterns, thus eschewing a more natural style that may be a better fit for the problem at hand.

I won't name names, but someone here gave similar (and just as good) advice when many programmers were becoming pattern-aware, and it went something like this: the best way to use the GoF book is to read it and then throw it away.

Any thoughts?

Message 1 of 14

I would suggest skipping the GoF book in favor of Robert C. Martin's "Agile Software Development: Principles, Patterns, and Practices". This approach is sometimes referred to as Continuous Refactoring, and the quote from Allen Holub is well in line with it. Both books are tough reading for LabVIEW developers - all the examples are in C++ and Java.

I did a 1-hour presentation on Agile Design Principles at the CLA Summit in Austin last year. You can find the slides here. It gives a brief overview of the five main Design Principles (a.k.a. the SOLID Principles), touches on Data Flow specifics, provides several LabVIEW examples, and links to the more important (IMHO) pages in R.C. Martin's book ...

I started reading it back in 2010, am still fetching it from the shelf on a regular basis, and, so far, have no intention of throwing it away.

Message 2 of 14

I think patterns are just a formalisation of common scenarios, which in themselves are just extensions of more fundamental good design practices (SOLID, DRY, etc.). Most patterns I have seen documented rely on the same basic principles of abstraction, encapsulation, delegation, etc., simply viewed from a different context.

Leveraging design principles will always tend to increase complexity with the intent of also increasing flexibility. We should only implement this flexibility because we foresee a realistic need for it.

People do tend to get caught up in patterns. I'd much rather people got caught up in the basic design principles instead and only jump to a pattern when it is especially appropriate.
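The claim that a pattern is just the basic principles viewed from a different context can be made concrete with a small sketch (Python stands in for G; the class and names are invented for illustration). The classic Observer "pattern" is little more than encapsulation plus delegation:

```python
class Publisher:
    """A minimal Observer sketch: subscribers are encapsulated state,
    and notification is pure delegation."""

    def __init__(self):
        self._subscribers = []          # encapsulated: callers never touch this

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def publish(self, event):
        for cb in self._subscribers:    # delegation: the publisher knows
            cb(event)                   # nothing about what subscribers do

received = []
pub = Publisher()
pub.subscribe(received.append)
pub.publish("done")
print(received)                         # ['done']
```

Nothing here is new machinery; the "pattern" is just a name for this particular arrangement of the underlying principles.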

Message 3 of 14

+1 for Robert's book. There is a C#-based version (updated by his son, I believe) which may be slightly easier reading for LabVIEW developers.

Message 4 of 14

Hi Dmitry,

I agree that patterns can be generally good; it is always good to leverage previously established solutions. I do find, though, that design patterns are often not very readable: it is difficult to recognize a pattern from the code alone if I don't have prior documentation of which pattern was used.

Another issue I have is code efficiency. I work most of the time with image processing and signal processing algorithms. I have written some code with OO design, and I find it difficult to satisfy basic image processing design considerations and OO design considerations at the same time. I haven't seen design patterns for LabVIEW that take both into account. Signal processing and image processing code can be very inefficient (in memory usage and speed) if I don't build those considerations into the design. I guess I need to develop my own versions of those design patterns to answer both requirements, but that is a demanding effort and requires deep expertise in OO design patterns.

Thanks - Amit

Amit Shachaf
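The efficiency concern above can be illustrated with a toy sketch (Python as a stand-in for G; the functions are hypothetical). A "clean" functional-style stage that returns a new buffer allocates a full-size intermediate at every step of a pipeline, while an in-place stage reuses the caller's buffer - which is exactly the trade-off a large image forces you to think about:

```python
def brighten(pixels, amount):
    # OO/functional style: returns a fresh buffer, original is untouched.
    # On a large image, each pipeline stage allocates a full-size copy.
    return [p + amount for p in pixels]

def brighten_inplace(pixels, amount):
    # Performance style: mutates the caller's buffer, no new allocation,
    # but the original data is gone and the stages are now order-coupled.
    for i in range(len(pixels)):
        pixels[i] += amount
    return pixels

image = [0, 10, 20]
copy_result = brighten(image, 5)    # image still [0, 10, 20]
brighten_inplace(image, 5)          # image now [5, 15, 25]
```

Neither style is wrong; the point is that the pattern-friendly version and the memory-friendly version pull in opposite directions, and a design has to pick deliberately.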
Message 5 of 14

tyk007 wrote:

People do tend to get caught up in patterns. I'd much rather people got caught up in the basic design principles instead and only jump to a pattern when it is especially appropriate.

Agree. That is my preference, also. Having said that, for novices, it is a lot easier to get them caught up in patterns than in the design principles... even after you understand the principles, recognizing how to apply them can take years of experience. In my experience, a novice and even a mid-level developer caught in unneeded layers of patterns is still writing much better code than the developer who eschews the patterns and tries to develop architecture from first principles.

Message 6 of 14

AQ, I am not sure I am following this last statement.  You say agree but then follow that up with a statement that seems to contradict tyk007's.  Are you suggesting that a novice ought to make efforts to mold their code to conform to patterns rather than let their coding style evolve based on how they understand basic design principles?  To be honest, this almost seems like a recipe for maintenance disaster.  Patterns applied improperly can obfuscate simple code thus creating a nightmare down the line, especially if there are "unneeded layers of patterns"...but, maybe I am misunderstanding?

Message 7 of 14

I'm saying that teaching someone the principles and their implications is very hard.

Teaching the patterns is easy by comparison.

Code created from excessive pattern application is frustratingly complex, but it often works. It is far better than code where people try to create their own structures, which is what would happen if you just said, "here are the principles, try to apply them." So an over-reliance on the patterns is something I would encourage. I would prefer something like what you describe, but I think it is something you work toward, not something you start out with.

Message 8 of 14

My preference, I think, is to introduce the basic design principles that we can all agree seem easy to understand but are much trickier to apply well in practice. Then introduce patterns, which are common solutions that happen to exhibit many (if not all) of these design principles. Encourage people to understand why the patterns are so popular and why they are structured the way they are. Also encourage people to understand that a pattern applied blindly is still a maintenance nightmare.

Unfortunately I have discovered that not everyone has the diligence to continue to learn beyond the problem at hand. If good design was so easy then there wouldn't be armies of books, training providers and seminars dedicated to this topic.

Message 9 of 14

Personally, I agree with your opening statement. I see patterns as useful when discussing designs with other people who use patterns, but I see little use in applying the GoF book to LabVIEW: it just pushes the programmer away from the problem domain. For me, the key to removing complexity from software is getting the block diagram to visibly describe the problem it is trying to solve.

This quote sums it up nicely.

The computing scientist’s main challenge is not to get confused by the complexities of his own making.

          — E. W. Dijkstra

So if the pattern matches your problem, or if you are a member of a pattern appreciation club, then go for it - but LabVIEW is a VERY high-level language, and the majority of the time you are adding complexity where it is not needed.

Message 10 of 14