
LabVIEW Roadmap (2022)

I don’t want to sound ungrateful, but that is rather underwhelming.

When talking about NXG, NI was pretty straightforward in describing LabVIEW's faults and its future (and end).

With the end of NXG, we are looking for a longer-term view of LabVIEW, its future (or lack of it), and what NI sees as the future of software (a 20-year view).

Stu
Message 41 of 105
(2,818 Views)

wiebe@CARYA wrote:

BTW. This post Re: Code Interface Nodes - NI Community (your first on the account you're using here) is dated 09-01-1999.


I even found times when Rolf used to love CINs rather than DLLs

 


@rolfk wrote:

Steven Harrison wrote:
CINs existed in LabVIEW long before DLLs were in vogue, and we've maintained support for them through the current version with no plans to stop doing so. But I'm interested in hearing from the LabVIEW community now. Speak up if you prefer CINs, and tell us why. (Also speak up if you prefer DLLs).

CINs give me a much more intimate control over the data passed from and to LabVIEW and unless I use native data types (and maybe even then) in LabVIEW, DLL parameters seem always to be copied for the actual call, whereas CIN parameters CAN operate in place.
The Win32 LibMain function also gives some control over the initialization of DLL data space but not in the way the CIN functions do. And last but not least I can easily write, and have done so many times, one CIN source for multiple platforms. If you ever drop CIN support you will have one disappointed guy here.

Amazing! 😄

Message 42 of 105
(2,654 Views)

@dadreamer wrote:

wiebe@CARYA wrote:

BTW. This post Re: Code Interface Nodes - NI Community (your first on the account you're using here) is dated 09-01-1999.


I even found times when Rolf used to love CINs rather than DLLs

 


@rolfk wrote:

Steven Harrison wrote:
CINs existed in LabVIEW long before DLLs were in vogue, and we've maintained support for them through the current version with no plans to stop doing so. But I'm interested in hearing from the LabVIEW community now. Speak up if you prefer CINs, and tell us why. (Also speak up if you prefer DLLs).

CINs give me a much more intimate control over the data passed from and to LabVIEW and unless I use native data types (and maybe even then) in LabVIEW, DLL parameters seem always to be copied for the actual call, whereas CIN parameters CAN operate in place.
The Win32 LibMain function also gives some control over the initialization of DLL data space but not in the way the CIN functions do. And last but not least I can easily write, and have done so many times, one CIN source for multiple platforms. If you ever drop CIN support you will have one disappointed guy here.

Amazing! 😄


There probably was a lot less control over DLL parameters back then.

Message 43 of 105
(2,639 Views)

@dadreamer wrote:

wiebe@CARYA wrote:

BTW. This post Re: Code Interface Nodes - NI Community (your first on the account you're using here) is dated 09-01-1999.


I even found times when Rolf used to love CINs rather than DLLs

 


@rolfk wrote:

Steven Harrison wrote:
CINs existed in LabVIEW long before DLLs were in vogue, and we've maintained support for them through the current version with no plans to stop doing so. But I'm interested in hearing from the LabVIEW community now. Speak up if you prefer CINs, and tell us why. (Also speak up if you prefer DLLs).

CINs give me a much more intimate control over the data passed from and to LabVIEW and unless I use native data types (and maybe even then) in LabVIEW, DLL parameters seem always to be copied for the actual call, whereas CIN parameters CAN operate in place.
The Win32 LibMain function also gives some control over the initialization of DLL data space but not in the way the CIN functions do. And last but not least I can easily write, and have done so many times, one CIN source for multiple platforms. If you ever drop CIN support you will have one disappointed guy here.

Amazing! 😄


Wiebe nailed it. This was in 2000, with LabVIEW 6.0 just barely released.

 

LabVIEW 5.1 was the first version in which NI dropped support for Windows 3.1 and MacOS 68k, and that finally allowed the Call Library Node to gain support for datatypes that had previously been pretty much impossible to pass through the DLL interface, since you had to do marshalling for every memory pointer you wanted to pass that way. And that marshalling was a lot hairier than what a .NET library nowadays has to do when calling unmanaged code: it involved actually calling assembly code to do all kinds of nasty things to make the memory pointer valid for the segmented protected memory model that the DLL was really operating in, and after the DLL call it all had to be properly reversed.

 

LabVIEW 6.0 was the first version where calling a DLL slowly got on par with calling CINs if you wanted to work with native LabVIEW memory buffers.
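As a concrete illustration of what "on par" looks like from the C side: with the Call Library Function Node you can pass a LabVIEW array as a plain data pointer and let the library work on the caller's buffer in place. The function below is a hypothetical example of that pattern (the name and signature are mine, not from any real library):

```c
#include <stddef.h>
#include <stdint.h>

/* Hypothetical example of a function a Call Library Function Node could
   call with an "Array Data Pointer": the library scales the caller's
   buffer in place, so LabVIEW need not copy the array for the call. */
int32_t scale_array(double *data, int32_t len, double factor)
{
    if (data == NULL || len < 0)
        return -1;                 /* error code back to the diagram */
    for (int32_t i = 0; i < len; i++)
        data[i] *= factor;
    return 0;                      /* success */
}
```

On the diagram you would typically configure the CLFN parameters as an array data pointer, a signed 32-bit integer, and an 8-byte double, with the return value wired out as an error code.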

 

It was not until around 2011 that I completely abandoned CINs, when I took over maintenance of Lua for LabVIEW, which had been developed by a colleague of mine using CINs, and I was pondering the pros and cons of continuing with CINs versus taking the plunge and converting it all to a shared library instead. It had become apparent that CINs were by then definitely a legacy technology, and official support for them was no longer an option for the new 64-bit version of LabVIEW. After doing that conversion I have never looked back at CINs other than as a curiosity. 😀

 

If you really want to know more about the useless historical details in that respect, you may be interested in having a look here:

 

https://blog.kalbermatter.nl

Rolf Kalbermatter
My Blog
Message 44 of 105
(2,626 Views)

I already read through your articles years ago, back when they were on expressionflow. After seeing your numerous posts since 2002 or so about forgetting CINs and switching to DLLs (CLFNs) instead, I could hardly believe that in former times things were the other way round and you had (almost) nothing against CINs. 🙂 Lua for LabVIEW has caught my eye a couple of times, and from what I can remember the developers were using the FatCIN method for cross-platform support, which seemed interesting to me. But I never had a chance to apply it in my own projects. Now I could try it all, having many LabVIEW distributions on many virtual machines, but I don't have much desire for that anymore, given that nobody uses CINs these days, only a small part of LabVIEW users know about CLFNs and use them, and an even smaller part of those use their own DLLs and know how to compile them.

Message 45 of 105
(2,576 Views)

The main reason CINs were a lot better in the early days was that they operated in the same memory model as LabVIEW itself. Windows 3.0 and 3.1 used a 16:16 segment:offset pointer model, which was pretty much dictated by the fact that Windows was just a "pretty" graphical shell on top of good ol' MS-DOS. When NI decided they wanted to port LabVIEW to Windows, they quickly determined that the segment:offset memory model was not going to cut it for them. Back then you had so-called DOS extenders: software drivers that plugged into DOS so that an application aware of them could use the according interrupts to get a flat 32-bit memory model with protected-mode memory. There was (and still is) even a 16:32 segment:offset memory model supported in Intel CPUs. Some of the DOS extenders supported that too, but it never gained any traction. Flat 32-bit memory proved to be so much easier to deal with that by the time the limitations of a 32-bit address space started to be noticeable, nobody had any interest in going back to the atrocities of a segment:offset memory model, and the industry went directly to 64-bit instead.
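For the curious: in plain real-mode DOS (before protected mode turned segments into selectors), a 16:16 far pointer resolved to a 20-bit linear address as segment * 16 + offset. A small sketch of that arithmetic, just to illustrate why the model was so awkward:

```c
#include <stdint.h>

/* In real-mode DOS, a 16:16 far pointer resolves to a 20-bit linear
   address: segment * 16 + offset. Each segment is a 64 KiB window
   starting at a 16-byte "paragraph" boundary. */
uint32_t linear_address(uint16_t segment, uint16_t offset)
{
    return ((uint32_t)segment << 4) + offset;
}

/* Many different segment:offset pairs alias the same linear address,
   one of the quirks that made the model painful to program against. */
int same_linear(uint16_t s1, uint16_t o1, uint16_t s2, uint16_t o2)
{
    return linear_address(s1, o1) == linear_address(s2, o2);
}
```

(Protected-mode Windows 3.x instead treated the 16-bit segment value as a selector into a descriptor table, which is exactly the "segmented protected memory model" Rolf mentions above.)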

 

Watcom acquired the rights to distribute one of those DOS extenders with their C compiler (actually they got the rights to distribute a lightweight version of it, and you could then upgrade to the real thing from the original developer if you were so inclined). NI decided to do their entire porting work against the Watcom-supported flat 32-bit memory model, just as they had already been doing for the 32-bit Motorola 68k Macs. Working with a flat 32-bit memory model was so much more comfortable than having to deal with segments and offsets. Arrays bigger than a segment could not be addressed as a single entity; you had to observe the segment boundaries and operate on such arrays in chunks to avoid crossing them, which the CPU did not like at all.

CINs worked in that memory model too and could receive, modify, resize and deallocate memory buffers directly, in the same way LabVIEW itself did. This came with a cost, though: only the Watcom C compiler knew how to create object code directly compatible with this memory model, so CINs for LabVIEW for Windows could only be created with Watcom C.

Watcom C also provided special versions of LoadLibrary() and GetProcAddress() that were hand-assembled to load a 16-bit Windows DLL in such a way that it could operate in its proper 16-bit environment while you could still pass parameters to and from it from the 32-bit memory process. Except that for arrays and strings you couldn't just pass the pointer down: you first had to call special Watcom functions to map the memory pointer into the 16-bit memory space, call the function with this mapped address, and after the function returned, remove the memory mapping so that the resources could be reused for other things. And those arrays had better not exceed the size of a segment, or you needed to do even more work to make it function. It was hairy, and it prevented passing native LabVIEW handles to a DLL. That also made it considerably less performant than calling a CIN.
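To make the "in chunks" point concrete, here is a hypothetical sketch (purely illustrative, not NI's actual code) of the bookkeeping needed to walk a buffer without ever crossing a 64 KiB segment boundary:

```c
#include <stddef.h>
#include <stdint.h>

#define SEGMENT_SIZE 0x10000u  /* 64 KiB per segment */

/* Hypothetical sketch: process a large buffer in chunks that never
   cross a 64 KiB segment boundary -- the kind of bookkeeping that
   segmented code needed for arrays larger than one segment. `linear`
   is the buffer's linear start address; `process`, if non-NULL, is
   called once per chunk. Returns the number of chunks needed. */
int process_in_chunks(uint32_t linear, size_t total,
                      void (*process)(uint32_t start, size_t len))
{
    int chunks = 0;
    while (total > 0) {
        /* bytes left before the next segment boundary */
        size_t to_boundary = SEGMENT_SIZE - (linear % SEGMENT_SIZE);
        size_t len = total < to_boundary ? total : to_boundary;
        if (process)
            process(linear, len);
        linear += (uint32_t)len;
        total -= len;
        chunks++;
    }
    return chunks;
}
```

A buffer that happens to straddle a boundary needs two calls even if it is tiny, while a flat 32-bit model needs none of this.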

 

Once Windows 3.1 was out of the way (and the 68k Macs with it), things got a lot friendlier. Now LabVIEW was working in the same memory model as the underlying OS and any DLL you might ever want to call. A memory pointer in LabVIEW was the same memory pointer in a DLL! No more complicated mapping, conversion and whatnot.

Rolf Kalbermatter
My Blog
Message 46 of 105
(2,544 Views)

wiebe@CARYA wrote:

@rolfk wrote:

@Data-MetaData wrote:

I guess min. for the last 15++ years, right? 


It depends what you want to count. I've been working with LabVIEW since 1992; I started with LabVIEW 2.2.1 when I began as an application engineer for National Instruments in Switzerland.

 

If you only count online forums such as this one and LavaG, it's been pretty much 20 years. If you also allow for Info-LabVIEW, which was a mailing list and still exists but isn't very active anymore, then it approaches more like 28 years. I preferred the mailing list for quite some time, as an always-connected Internet wasn't a standard thing back then. 😁


pica.army.mil?


info-tabview.org (for the last many years)

LabVIEW Champion | LabVIEW Channel Wires

Message 47 of 105
(1,817 Views)

@sth wrote:

wiebe@CARYA wrote:

@rolfk wrote:

@Data-MetaData wrote:

I guess min. for the last 15++ years, right? 


It depends what you want to count. I've been working with LabVIEW since 1992; I started with LabVIEW 2.2.1 when I began as an application engineer for National Instruments in Switzerland.

 

If you only count online forums such as this one and LavaG, it's been pretty much 20 years. If you also allow for Info-LabVIEW, which was a mailing list and still exists but isn't very active anymore, then it approaches more like 28 years. I preferred the mailing list for quite some time, as an always-connected Internet wasn't a standard thing back then. 😁


pica.army.mil?


info-tabview.org (for the last many years)


https://info-labview.org (dang autocorrect)

LabVIEW Champion | LabVIEW Channel Wires

Message 48 of 105
(1,811 Views)

Hi,

 

I have the same question as Stu.

There was a reason NXG was started: the foundations of LabVIEW were too old and too messy to keep LabVIEW evolving into a modern IDE.

Now that NXG is dead, I am surprised that they have chosen to keep developing on top of that 40-year-old code base...

It seems like we'll only get a few new bells and whistles.

(However, I am glad that NXG in its current form did not make it. I wasn't a fan.)

Message 49 of 105
(1,583 Views)

@ThomasV wrote:

Now that NXG is dead, I am surprised that they have chosen to keep developing on top of that 40-year-old code base...

LabVIEW NXG is cancelled. NXG isn't.

 

NXG is still used (and developed?) for at least the G Web Module.

Message 50 of 105
(1,566 Views)