I've become interested in how widely NI "solutions" have penetrated different sectors of industry and R&D. I see NI products used in many places, but of course there are also plenty of exceptions. I'm especially interested in the reasoning: how does a company decide to use something other than an NI approach? I guess senior developers on this forum have plenty of stories and info on this topic, which is why I'm posting! 🙂 I'd love to hear stories, anecdotes, etc. 🙂
I don't want to start a debate about which approach/platform/HW solution/software/language is better; I just want to collect/hear info about how companies nowadays decide on and justify their selections for production and R&D...
I recently started searching more actively for new challenges (a job), from Q3 2018, and I have already received some contact requests from head-hunters and companies. During a recent interview, the interviewer explained their profile, their projects, and what kind of software and hardware solutions they use. He mentioned that they mainly use C#, with hardware solutions based on PIC and ARM microcontrollers.
The projects they showed me sounded fun (physics research was involved), and out of pure curiosity I asked the interviewer: "Have you ever considered solutions from NI at your company? You know, I have a CLD, and it would be fun to develop in LabVIEW if you hire me!"
His actual response was: "Cost is not the problem, but the NI platform/hardware is not reliable, and not flexible enough to fulfill our needs."
Since this topic was not the point of the interview, I couldn't get more detailed info, even though I was very curious about the reasoning 🙂
Besides cost (especially if large parts of a system need to be changed/"migrated"), I have the feeling that many companies are simply not aware of NI's current technology offerings. Maybe 20-30 years ago they made a decision based on the situation/technology level at the time, and now it has become "a tradition": they are stuck with whatever they have used ever since. I'm not saying any approach is better or worse (often they're not even comparable, apples to oranges...); I'm purely interested in how companies make these decisions, and what the "inertia" of a given decision is. For example, I imagine that if I had lots of C# programmers employed, it would also be silly to start developing in LabVIEW 🙂
At another place, the argument was "NI HW is not reliable, so we do not use it." After some more questions and discussion, I managed to "decode" this response: "we put a beginner/student on developing software in LabVIEW, and we realized the result was an unreliable system!" 😄
At my current institute the story is simple: I understand why our "automation and measurement" group is stuck with the Siemens platform. They have experience with it, and replacing the huge amount of already deployed Siemens hardware (mainly PLC-based control/DAQ systems) in this big facility could cost serious money.
But hey, have you ever seen the recently sold (!!) Siemens PCS 7 process visualization software? 🙂 Crazy bad: the GUI looks like it's from the last century, graphing is a disaster, and data export is a joke 🙂 And it costs quite a lot of money! 🙂 But anyway, that's the tradition here... 🙂
EDIT: Aw crap, I wrote this big long post and then realized I had the conversation backwards. I answered "Why don't you use NI for solutions?" as if someone had asked me why I *do* use NI (which is something I've been asked plenty of times).
I might be a bit biased...but I could answer that question a couple of ways.
The rapid-prototyping nature of LabVIEW and NI means you can start taking readings and processing data very quickly. In a benchtop or small lab environment, this means the engineers are the programmers, and separate people aren't needed for programming. Once you want to move to a more industrial solution, you can take the code already written and build sequencing testers, either with TestStand or with a sequencer in LabVIEW. Now full reports can be generated from the simple code that was thrown together just to take readings. That code can then be brought up to HIL modeling or production testers, and integrated with other equipment. The same software that was once written to quickly test a thing on a bench carries through the life of that product's testing. You aren't reinventing the wheel, and these parts of the code have been tested and improved over time.
One programming environment for Windows, FPGA, and RT means code written for an FPGA can be run on Windows in seconds. Code written for one target can in many cases be run on another supported target with little or no change. There are certainly constraints on the different targets, but I don't need to learn a new IDE to do FPGA work. FPGA development is again usually a thing for a dedicated programmer, not an electrical engineer.
I could also say the graphical (G) nature of LabVIEW makes it easier for humans to read and understand, but that could be argued either way.
There is also the heavy emphasis on device connectivity, with drivers and APIs for just about everything out there. Well-designed projects scale well, and adding new devices or equipment can be easy.
Being able to adapt when changes come is important. Changes will happen as users want different things from the software. Being able to adapt quickly is another huge benefit of LabVIEW and NI. On a well-designed system, adding a new sensor or another code module is pretty easy.
Don't mind it, Hooovahh! 😄 Actually my post is a bit too long and not properly structured. I could have formulated my thoughts better 🙂
edit: So actually I'm interested in the replies you've gotten from people: what reasons do they give for not using NI stuff?
edit2: And your reply is still useful, and gives me more insight into the benefits of the NI ecosystem...
One of the biggest (legitimate) complaints about NI hardware that I've seen is that a lot of their analog input cards lack any type of signal conditioning or anti-alias filtering. There are several people here who refused to touch any NI hardware for years because of that.
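To illustrate why missing anti-alias filtering is a legitimate complaint: without an analog low-pass filter in front of the ADC, any signal above the Nyquist frequency folds down and masquerades as a lower frequency, and no amount of software can undo it afterwards. A minimal sketch (frequencies chosen arbitrarily for illustration, not taken from any particular NI card):

```python
# Aliasing demo: a 900 Hz tone sampled at 1 kHz (Nyquist = 500 Hz)
# produces exactly the same samples as a 100 Hz tone, so the two are
# indistinguishable once digitized.
import numpy as np

fs = 1000.0          # sample rate in Hz (hypothetical)
n = np.arange(100)   # sample indices
tone_900 = np.cos(2 * np.pi * 900 * n / fs)  # above Nyquist
tone_100 = np.cos(2 * np.pi * 100 * n / fs)  # its alias below Nyquist

# The sampled sequences match to floating-point precision:
print(np.allclose(tone_900, tone_100))  # True
```

This is why the filtering has to happen in analog hardware before sampling; cards that omit it push that burden onto external signal conditioning.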
See here for the NI Week paper I wrote about a rather unique project I was blessed to be involved in.
An instrument manufacturer was looking to speed up their process development and engaged me to develop a LabVIEW-based version of their instrument in parallel with the existing in-house development team, who were doing their work "the traditional way" using C# and multiple interacting PC boards. With only a little help from two of my peers, I finished the entire project ahead of schedule and under budget. The in-house team was still having trouble when I moved on to my next project.
Why did they walk away from the approach we delivered using NI hardware?
From what I was told, the move from C# to LabVIEW was too drastic and would have required telling hot-dog C# developers to forget what they knew and loved and get good at LabVIEW. The ramp-up time for the team to convert was seen as an issue. There was also the risk that the experienced engineering staff would abandon ship and find another employer that would let them code in C#. And then there was the glaring problem that ten in-house C# developers would be replaced by two LabVIEW developers, which meant laying off eight people. In many companies, a manager's status is judged by how many people answer to them and how much money they manage.
So that is one case...
I have been working with some big (as in production lines measured in miles) steel companies lately, and I have found that they are still running VMS on VAXes, because the process of making steel has not changed and the Fortran code running under VMS works just fine.
Hoping I have hit your mark,