
LabVIEW vs. Python for Test Automation


@daphilli wrote:

Prior to LabVIEW 8.6, I favored C++. But after NI added OOP, VI Server and recursion, LabVIEW became viable for my purposes.


I'm pretty sure VI Server and recursion have been in LabVIEW since LV4.

 

You are probably referring to the VI Server extension with control references (allowing property nodes and method nodes, replacing the attribute nodes that didn't have references) and the improved recursion (where a VI can call itself directly, without the need to go through VI Server).
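As a loose analogy in text form (Python here, not LabVIEW, purely illustrative), the old pattern was a bit like a routine having to look itself up by name at run time, whereas the new recursion is just a routine calling itself directly:

def factorial_via_lookup(n, registry):
    # "Old" style: the routine finds itself through a name registry,
    # roughly like opening a VI reference by name and calling it.
    if n <= 1:
        return 1
    callee = registry["factorial"]
    return n * callee(n - 1, registry)

def factorial_direct(n):
    # "New" style: the routine simply calls itself.
    if n <= 1:
        return 1
    return n * factorial_direct(n - 1)

registry = {"factorial": factorial_via_lookup}
print(factorial_via_lookup(5, registry), factorial_direct(5))  # 120 120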

 

I agree things got much easier.

Message 11 of 20
(1,730 Views)

@rolfk wrote:

@Jay14159265 wrote:

I think that if LabVIEW does not get AI code generation it will soon (if not already) fall behind text-based languages in speed of development.


Ever heard of Fuzzy logic? It was the BIG thing some 20 to 25 years ago, and it seemed that you could not sell a coffee machine or washing machine anymore if there wasn't a "uses Fuzzy logic for a better cleaning result (or better-tasting coffee)" label on it. The argument also frequently came up that it would save energy and detergent, as it used only just the right amount of energy and time.

 

Fuzzy logic was in principle just a bit of an adaptive algorithm, and a pretty crude one at that. More adaptable than pure on/off control, but everything of course depended on the sensors, if they were even present. Quite frequently that fuzzy logic was just a little control loop somewhere in the firmware, there so the manufacturer could claim the device used this great new superpower technology, and sometimes it was just a label put on the device anyway, freely according to the motto "paper is very patient". Who in the world would ever go to the effort of taking a device apart to make sure it actually contained this super tech?

 

I'm not saying that AI is without any merit. But much of it is hype, and self-sustaining hype at that. Someone discovered this marvellous new invisible cloth, everybody wants to jump on the bandwagon to gain a bit of profit too, and nobody so far has dared to say: "But he is not wearing any clothes!"


IIRC, Fuzzy logic had one big disadvantage: it was hard to prove that it worked or worked correctly.
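To make that concrete: a crude fuzzy-style controller is really just a couple of informal rules plus a weighted average. A minimal Python sketch (the membership functions and rule outputs below are made up for illustration) shows how little there is to formally verify:

def tri(x, a, b, c):
    # Triangular membership function peaking at b.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_heater(error):
    # error = setpoint minus measured temperature, in degrees C.
    # Rules (invented): small error -> low power, large error -> high power.
    small = tri(error, -1.0, 0.0, 5.0)
    large = tri(error, 2.0, 10.0, 30.0)
    low_power, high_power = 10.0, 90.0       # rule outputs, in percent
    total = small + large
    if total == 0.0:
        return 0.0 if error <= 0 else 100.0  # outside all rules: plain on/off fallback
    # Defuzzify with a weighted average of the rule outputs.
    return (small * low_power + large * high_power) / total

for e in (0.5, 3.0, 8.0, 20.0):
    print(e, round(fuzzy_heater(e), 1))

The rules are easy to tweak by trial and error, but there is no model you can analyze the way you can with a PID loop, which is exactly the verification problem.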

 

Multiply that uncertainty by 100, then square the result. That's where AI's at.

 

If this really helps development in text-based languages, NI should be able to use it to speed up LabVIEW development, maybe even to implement AI in LabVIEW!

 

"Hey ChatGPC, add yourself to the LabVIEW source" 😂

0 Kudos
Message 12 of 20
(1,727 Views)

wiebe@CARYA wrote:

@daphilli wrote:

Prior to LabVIEW 8.6, I favored C++. But after NI added OOP, VI Server and recursion, LabVIEW became viable for my purposes.


I'm pretty sure VI Server and recursion have been in LabVIEW since LV4.


That was not really VI Server as we know it. LabVIEW 4 added some CIN-based VI libraries that allowed you to open VIs programmatically and even set and read control values by the control label. That is indeed where the name "VI Server" came from, but the actual VI Server technology as we know it nowadays was introduced in LabVIEW 5.

Rolf Kalbermatter
My Blog
0 Kudos
Message 13 of 20
(1,713 Views)

@rolfk wrote:

wiebe@CARYA wrote:

@daphilli wrote:

Prior to LabVIEW 8.6, I favored C++. But after NI added OOP, VI Server and recursion, LabVIEW became viable for my purposes.


I'm pretty sure VI Server and recursion have been in LabVIEW since LV4.


That was not really VI Server as we know it. LabVIEW 4 added some CIN-based VI libraries that allowed you to open VIs programmatically and even set and read control values by the control label. That is indeed where the name "VI Server" came from, but the actual VI Server technology as we know it nowadays was introduced in LabVIEW 5.


Fair enough. My memory of these things is a bit fuzzy. Still, long before 8.5.

0 Kudos
Message 14 of 20
(1,708 Views)

@rolfk wrote:

@Jay14159265 wrote:

I think that if LabVIEW does not get AI code generation it will soon (if not already) fall behind text-based languages in speed of development.


Ever heard of Fuzzy logic? It was the BIG thing some 20 to 25 years ago, and it seemed that you could not sell a coffee machine or washing machine anymore if there wasn't a "uses Fuzzy logic for a better cleaning result (or better-tasting coffee)" label on it. The argument also frequently came up that it would save energy and detergent, as it used only just the right amount of energy and time.

 

Fuzzy logic was in principle just a bit of an adaptive algorithm, and a pretty crude one at that. More adaptable than pure on/off control, but everything of course depended on the sensors, if they were even present. Its main advantage was that you could "tune" it with some simple trial and error, while for PID controllers you needed a deeper understanding of the entire control loop and its parameters to make it work reliably.

And quite frequently that fuzzy logic was just a little control loop somewhere in the firmware, there so the manufacturer could claim the device used this great new superpower technology, and sometimes it was just a label put on the device anyway, freely according to the motto "paper is very patient". Who in the world would ever go to the effort of taking a device apart to make sure it actually contained this super tech?

 

I'm not saying that AI is without any merit. But much of it is hype, and self-sustaining hype at that. Someone discovered this marvellous new invisible cloth, everybody wants to jump on the bandwagon to gain a bit of profit too, and nobody so far has dared to say: "But he is not wearing any clothes!"

 

Almost nobody seems to worry about the fact that the hardware on which ChatGPT and similar systems run is heading into the next energy fiasco, after many of the blockchain technologies have already set a new "standard" in energy consumption. Isn't it crazy that a Bitcoin transaction consumes about 707 kWh? That's a quarter of a year of electricity consumption for an average household here in the Netherlands! And ChatGPT and similar technologies seem to be quite large energy consumers too. It may be less than a tenth of a Wh per query, but add that up over the millions of queries that come in and they must already be using an electric power plant just for themselves.


Well, I'm not selling Bitcoin or fuzzy logic, so I don't know where that is coming from. I am talking about speed of code development. What I can say from personal experience is that Copilot makes developing in text-based languages much faster and easier than it has ever been. I'm talking about an experienced programmer using GPT-3 to develop text-based code much faster than a human can type. You can do that today; here is the link, give it a try, it's a free trial:

https://github.com/features/copilot
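For example, from little more than the leading comment it will fill in the usual instrument-query boilerplate (Python with the pyvisa package; the instrument address below is made up):

# Query an instrument's identification string over VISA.
import pyvisa

rm = pyvisa.ResourceManager()
with rm.open_resource("TCPIP0::192.168.1.50::INSTR") as inst:
    inst.timeout = 5000  # ms
    print(inst.query("*IDN?").strip())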

______________________________________________________________
Have a pleasant day and be sure to learn Python for success and prosperity.
Message 15 of 20
(1,662 Views)

I used a tool from Rohde & Schwarz not too long ago which constructed the sequence of driver actions in LabVIEW code based on a manually configured sequence in a GUI tool. Not quite "AI" generated, but sort of along the same idea.

 

"Here's a set of actions I can do": list them out in a tree structure and let the machine glue it all together for you with an algorithm.

 

I forget what that was called, but at a basic level "AI-generated" LabVIEW would probably be easier to implement than AI-generated text code.

 

The structures required in test automation are highly standardized, right? Describe the required VI hierarchy, then just wire it up in accordance with the defined specification.
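Something like this toy sketch (Python; the step names and actions are invented for illustration): the "specification" is just an ordered list of named steps, and the glue is a trivial loop.

# A declared sequence of named steps plus the glue code that runs them.
ACTIONS = {
    "init_dmm":    lambda ctx: ctx.update(dmm="ready"),
    "set_source":  lambda ctx: ctx.update(source_v=3.3),
    "measure":     lambda ctx: ctx.update(reading=3.299),
    "check_limit": lambda ctx: ctx.update(passed=abs(ctx["reading"] - ctx["source_v"]) < 0.01),
}

SEQUENCE = ["init_dmm", "set_source", "measure", "check_limit"]  # the "specification"

def run(sequence, actions):
    # Glue: execute the named steps in order, threading a shared context through them.
    ctx = {}
    for step in sequence:
        actions[step](ctx)
    return ctx

print(run(SEQUENCE, ACTIONS))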

 

The issue I'm concerned about with ChatGPT is that it may lean into bad coding practices for inexperienced programmers. Yes, it will write the code for you, but you still have to communicate your logic to the interpreter. If the machine can recommend better ideas, that's neat, but I have to wonder how the machine "knows" which implementation is best. It sounds like a recipe for baking in errors across the board, and then we have a fleet of developers who have no idea how to analyze code.

0 Kudos
Message 16 of 20
(1,655 Views)


@abenevides wrote:

...

 

The issue I'm concerned about with ChatGPT is that it may lean into bad coding practices for inexperienced programmers. Yes, it will write the code for you, but you still have to communicate your logic to the interpreter. If the machine can recommend better ideas, that's neat, but I have to wonder how the machine "knows" which implementation is best. It sounds like a recipe for baking in errors across the board, and then we have a fleet of developers who have no idea how to analyze code.


That is an interesting thought. ChatGPT is a conversational AI model, so it only cares about the 'correctness' of having a conversation, not necessarily the 'correctness' of the code. But ChatGPT is not the only AI out there. There are AIs trained on codebases that write code matching the codebase they were trained on. Presently, companies can have an AI trained on their own existing codebase that generates code similar to it. There is some potential for an AI trained only on bad code that generates bad code suggestions, which is funny to think about. In general though, the way OpenAI's GPT-3.5 is trained, it presently gives very good to excellent code suggestions. I think that for new developers, writing with AI suggestions would lead to better code patterns rather than worse ones.
______________________________________________________________
Have a pleasant day and be sure to learn Python for success and prosperity.
0 Kudos
Message 17 of 20
(1,624 Views)

I have only been coding in LabVIEW for the last year and a half, and while I absolutely hated it starting out, I have found that there are a lot of really nice things that can be done.

 

I do some relatively complex coding, and particularly for bouncing back and forth and trying to pull different items for different customers, a text-based language would be much faster. That makes the Formula Node a major go-to.

 

I am planning to use some combination of the two to accomplish tasks in the most efficient way possible.

Message 18 of 20
(1,618 Views)

@abenevides wrote:

The issue I'm concerned about with ChatGPT is that it may lean into bad coding practices for inexperienced programmers. Yes, it will write the code for you, but you still have to communicate your logic to the interpreter. If the machine can recommend better ideas, that's neat, but I have to wonder how the machine "knows" which implementation is best. It sounds like a recipe for baking in errors across the board, and then we have a fleet of developers who have no idea how to analyze code.


Decades ago (besides fuzzy logic), genetic algorithms were all the rage.

 

These might be a lot easier to apply to LabVIEW, although GPT (I'm told) should work on any training set that can be arranged in a tree.

 

The idea is that there's a population (gene pool) of individuals, each with a genotype (the DNA). A fitness function selects the best specimens from the population, and these 'procreate'. The fitness function can also be user input, or partially human input.

 

I've seen, for instance, image generators where the user selects a few images they like from a set. The next generation of images will be combinations and mutations of the selected images. Repeat until it's good enough. The genotype is a list of functions, shapes, filters, etc. It always made me think VIs could (potentially) be constructed in a similar way.
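A minimal sketch of that loop in Python (a numeric fitness function stands in for the user picking their favourites; all the numbers are made up):

import random

TARGET = [3.0, -1.5, 0.25, 7.0]  # stand-in for "what the user would like"

def fitness(genotype):
    # Higher is better; in the image example this would be the user's choice.
    return -sum((g - t) ** 2 for g, t in zip(genotype, TARGET))

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(genotype, rate=0.2):
    return [g + random.gauss(0, 0.5) if random.random() < rate else g for g in genotype]

population = [[random.uniform(-10, 10) for _ in TARGET] for _ in range(30)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                        # selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(20)]                  # procreation
    population = parents + children

print([round(g, 2) for g in max(population, key=fitness)])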

 

This could run on even the most modest PC, and wouldn't require a cloud service or selling your soul to some big corporation.

 

Of course fuzzy logic, genetic algorithms and neural networks can be combined.

0 Kudos
Message 19 of 20
(1,567 Views)

@carq wrote:

I have only been coding in LabVIEW for the last year and a half, and while I absolutely hated it starting out, I have found that there are a lot of really nice things that can be done.


👍

 


@carq wrote:

I do some relatively complex coding, and particularly for bouncing back and forth and trying to pull different items for different customers, a text-based language would be much faster. That makes the Formula Node a major go-to.


That's a bit puzzling, and raises many questions. But then, I had 25 years to get answers to those questions. 😉

 


@carq wrote:

I am planning to use some combination of the two to accomplish tasks in the most efficient way possible.


Sure.

 

It's not a bad idea (at all), but its success will depend a lot on the execution.

0 Kudos
Message 20 of 20
(1,564 Views)