LabVIEW


Would Biden administration approve of LabVIEW?

OK, the title was a bit of clickbait.  But I saw this news today:

 

White House urges developers to dump C and C++

 

It mentions how the White House Office of the National Cyber Director is urging developers to switch to "memory safe" programming languages like C#, Go, Java, Ruby, and Swift.

 

I have been coding in LabVIEW for 25 years and I couldn't tell you whether LabVIEW is "memory safe".  Does anyone know?

http://www.medicollector.com
0 Kudos
Message 1 of 13
(995 Views)

I would never take any advice from any politician.  If one tells you that water runs downhill, go check that for yourself.

"If you weren't supposed to push it, it wouldn't be a button."
Message 2 of 13
(976 Views)

@paul_cardinale wrote:

I would never take any advice from any politician.  If one tells you that water runs downhill...


Another one will tell you that it runs up-hill.

---------------------------------------------
Certified LabVIEW Developer (CLD)
There are two ways to tell somebody thanks: Kudos and Marked Solutions
0 Kudos
Message 3 of 13
(952 Views)

Based off of the description here:

 

https://www.memorysafety.org/docs/memory-safety/

 

I would say that LabVIEW, by itself, probably is.  However, LabVIEW runs on the LabVIEW run-time engine which might not be, and the versions of it that run on FPGAs also might not be.

 

It also has something else going for it, and that's its specialization.  A LabVIEW application is generally never something that's open to the public Internet like a web server or a database, and it's not the sort of thing that would ever store data like banking info or social security numbers.

Message 4 of 13
(944 Views)

@josborne wrote:

...

I have been coding LabVIEW for 25 years and I couldn't tell you if LabVIEW is "memory safe".  Does anyone know?


No. If you want me to, I can crash your computer using LabVIEW in a very "memory is not safe from my code" way. 

______________________________________________________________
Have a pleasant day and be sure to learn Python for success and prosperity.
0 Kudos
Message 5 of 13
(943 Views)

@Jay14159265 wrote:

@josborne wrote:

...

I have been coding LabVIEW for 25 years and I couldn't tell you if LabVIEW is "memory safe".  Does anyone know?


No. If you want me to, I can crash your computer using LabVIEW in a very "memory is not safe from my code" way. 


By using badly configured Call Library Nodes or similar. However, every language that wants to interact with hardware or non-standard devices has some form of FFI (Foreign Function Interface), and that can be used to do the same. A language can make its FFI hard to get at for normal users, but it somehow needs such interfaces itself to access the underlying OS and other resources.
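Since LabVIEW code is graphical, the point is easier to show in a textual language. Here is a minimal Rust sketch of an FFI declaration (calling the C library's `abs`): the compiler trusts the declared signature completely, which is the same trust a Call Library Node places in whatever you type into its configuration dialog.

```rust
// Declaring a C-library function through Rust's FFI. If the signature
// written here did not match the real C function, nothing would catch
// the mismatch at compile time -- the call could then corrupt memory,
// just like a badly configured Call Library Node.
use std::os::raw::c_int;

extern "C" {
    fn abs(input: c_int) -> c_int;
}

fn main() {
    // Every FFI call must sit inside `unsafe`, because the language's
    // memory-safety guarantees stop at this boundary.
    let result = unsafe { abs(-42) };
    println!("{result}");
}
```

The `unsafe` keyword is Rust's way of marking exactly the spots where the programmer, not the compiler, is responsible for memory safety.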

 

As long as you stay in the LabVIEW diagram and don't use Call Library Nodes or similar unless you are 200% sure you know what you are doing, it is in fact pretty hard to crash LabVIEW (assuming it was properly installed). I only crash LabVIEW when developing with self-designed DLLs, but then about 60 times per hour during the development and testing of those. 😁

 

The problem with C/C++ is that it was designed not for memory safety but to be as performant as possible on any of the zillion CPU architectures out there. And it pretty much is. But that comes with enormous power over what you can do with memory, and if you are not very, very careful, you create buffer overflows, invalid address accesses and the like in the blink of an eye. The compiler won't complain, since there is no mountain guide holding your hand or slapping your face when you try to do such things.

 

That doesn't mean you can't write safe software in C/C++, but it is pretty hard for most human programmers. (And no, AI won't help with that for a long time. AI as we have it now can only suggest code that has already been written before, and since humans are on average pretty bad at this, most of what AI has been trained on is equally bad.)

Rolf Kalbermatter
My Blog
Message 6 of 13
(909 Views)

"Biden administration calls for developers to embrace memory-safe programing languages and move away from those that cause buffer overflows and other memory access vulnerabilities."

 

Now if only our elected leaders were memory-safe and did not have buffer overflows and other memory access vulnerabilities..... 🙂

 

-AK2DM

~~~~~~~~~~~~~~~~~~~~~~~~~~
"It’s the questions that drive us.”
~~~~~~~~~~~~~~~~~~~~~~~~~~
Message 7 of 13
(798 Views)

This has probably been fixed, but a few years ago a post on lavag.org demonstrated a way to exploit a VISA buffer overflow. I can't say I understand it, but I assume this means it is not memory safe.

 

https://lavag.org/topic/19527-smash-call/

0 Kudos
Message 8 of 13
(768 Views)

Well, even in Rust applications you can have bugs, especially if you interface with external libraries such as VISA or anything else. It's hard to create memory bugs in Rust as long as you program in Rust, but not impossible, and any interface to another library, including OS APIs, can introduce new bugs that Rust can do nothing to detect or prevent.
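To illustrate where that boundary sits: Rust's guarantees only hold for safe code. An `unsafe` block tells the compiler to trust the programmer, and that is exactly where FFI and OS calls live. A minimal sketch (not tied to VISA; the pointer here happens to be valid):

```rust
fn main() {
    let value: u64 = 7;
    let ptr: *const u64 = &value;

    // Dereferencing a raw pointer requires `unsafe`: the compiler
    // cannot prove the pointer is valid, so it takes our word for it.
    // This one is valid, but a pointer handed back by a C library
    // (a driver, an OS API, ...) might not be, and Rust could not
    // detect that.
    let read_back = unsafe { *ptr };
    assert_eq!(read_back, 7);
}
```

So the safety guarantee is really "safe Rust cannot corrupt memory"; every `unsafe` block and every foreign library is a hole the programmer must vouch for personally.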

Rolf Kalbermatter
My Blog
Message 9 of 13
(749 Views)

Thanks for all the comments!  I think this issue of "memory safety" is still a bit hard to grasp.  But this ONCD report is about cybersecurity, not about bugs.  So the ONCD isn't worried that you create buggy code; it's worried that hackers can take advantage of "non-memory-safe" code to cause damage.... right?  

 

And don't get me started on how LabVIEW isn't vulnerable because it's not widely used or connected to Internet-facing servers, etc.  That is what Siemens thought before Stuxnet.  My code is deployed in hospitals, where nation-states are actively attacking us right now.

http://www.medicollector.com
0 Kudos
Message 10 of 13
(734 Views)