

Call Library Function - DLL with specific data types defined in .h files

Hello,
 
I've been trying to use a DLL provided to me within a LabVIEW program. The DLL requires specific data types defined in .h files that have also been provided to me. These data types are generally structures with a variety of words, floats, and chars making up their components. Is there a way to set things up so I can use a Call Library Function Node that is actually informed about the data types from the .h files? I noted this post ( http://digital.ni.com/public.nsf/allkb/44F329CAC8052A0386256F0F0050F196 ) speaking to this on a somewhat simpler basis, but could not derive my solution from it. Does it involve including the .h files in \cintools\ or actually within extcode.h itself? I'm imagining that this may be a no-go anyway; I just wanted to know for sure. And no, re-programming the original DLL so that the structures are decomposed within the functions themselves won't be an option at this point.

Anyway, many thanks if anyone can help, even if it's simply directing me to the right help documentation (or telling me I'm trying to do something that is currently impossible).
 
 
0 Kudos
Message 1 of 12
(4,010 Views)

Hi Peter,

     It's possible to pack a U8 array in LabVIEW and pass it to a DLL (using the Call Library Function Node) so that the DLL gets the structure it needs - assuming the structure doesn't have pointer or class elements.  If you're very lucky it's simple, but more likely it will be a bit tricky.  The .h file describes the order and number of bytes to pack into the U8 array.

The number of bytes required for each structure-element is determined by the DLL's compiler.  Rarely, but possibly, the DLL will require "byte padding" for structure elements.

Usually:

"char" is packed as one byte (U8 or I8).

[unsigned] "short" is packed as two bytes (U16 or I16).

[unsigned] "int" and "long" are usually packed as four bytes (U32 or I32) by 32-bit Windows compilers.

"float" and "double" are (usually) IEEE-754 compatible, just like LabVIEW's floats: "float" corresponds to single-precision (SGL, four bytes) and "double" to double-precision (DBL, eight bytes).

NOTE: AS YOU PACK THE ARRAY, ALL NUMERIC TYPES MAY NEED TO BE BYTE-SWAPPED because of "big endian"/"little endian" considerations (you should research these terms).  LabVIEW uses "Big Endian" format.

Use LabVIEW's "Type Cast" function to get the bytes of any basic type.
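Just to illustrate what the .h file tells you (the struct below is made up for the example, it's not from your header), a small C program shows the layout - the order, offsets, and total byte count - you would have to reproduce in the U8 array:

    /* Hypothetical struct - NOT from the actual .h file - just to show the idea. */
    #include <stdio.h>
    #include <stddef.h>
    #include <stdint.h>

    typedef struct {
        char     tag;      /* 1 byte                                        */
        int16_t  channel;  /* 2 bytes - the compiler may pad 1 byte before  */
        int32_t  count;    /* 4 bytes                                       */
        float    scale;    /* 4 bytes, IEEE-754 single (LabVIEW SGL)        */
    } Example;

    int main(void)
    {
        /* offsetof() reveals any padding the compiler inserted between fields;
           sizeof() is the total number of bytes the U8 array must contain.   */
        printf("tag     at offset %zu\n", offsetof(Example, tag));
        printf("channel at offset %zu\n", offsetof(Example, channel));
        printf("count   at offset %zu\n", offsetof(Example, count));
        printf("scale   at offset %zu\n", offsetof(Example, scale));
        printf("total size: %zu bytes\n", sizeof(Example));
        return 0;
    }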

There are lots of posts on this subject - do your homework, give it a try, and if (when ;-)) you post back, please provide the .h file!

Cheers.

"Inside every large program is a small program struggling to get out." (attributed to Tony Hoare)
0 Kudos
Message 2 of 12
(3,994 Views)
peter-biomed,

I agree that it would be very helpful to see the .h file; however, depending on the data types, you should be able to do this within LabVIEW.  Assuming your DLL and header are C code, to pass a struct you will pass a cluster.  The cluster will contain the same types as the struct, in the same order.

Where you may run into a problem is if your DLL uses data types other than the primitive types.  If this is the case, you could write a 'wrapper DLL': a DLL that calls your DLL, breaks the data types down into primitives that LV can understand, and then builds them back into the types used by your original DLL.
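As a rough sketch of that idea (the struct and function names below are made up, not from your library - in a real wrapper they would come from the vendor's .h file):

    /* wrapper.c - hypothetical sketch of a "wrapper DLL" for Windows. */
    typedef struct {
        double scale;
        long   sampleCount;
    } ChannelInfo;

    /* Stand-in for a function exported by the original DLL. */
    extern int OriginalGetChannelInfo(int channel, ChannelInfo *info);

    /* Export a flat function that the Call Library Function Node can call
       with plain numeric parameters instead of a struct. */
    __declspec(dllexport) int WrapperGetChannelInfo(int channel,
                                                    double *scale,
                                                    long   *sampleCount)
    {
        ChannelInfo info;
        int err = OriginalGetChannelInfo(channel, &info);  /* original DLL fills the struct */
        if (err == 0) {
            *scale       = info.scale;          /* hand each field back as a primitive */
            *sampleCount = info.sampleCount;
        }
        return err;
    }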
Regards,

Jared Boothe
Staff Hardware Engineer
National Instruments
0 Kudos
Message 3 of 12
(3,975 Views)


@peter-biomed wrote:
Hello,
 
I've been trying to use a DLL provided to me within a LabVIEW program. The DLL requires specific data types defined in .h files that have also been provided to me. These data types are generally structures with a variety of words, floats, and chars making up their components. Is there a way to set things up so I can use a Call Library Function Node that is actually informed about the data types from the .h files? I noted this post ( http://digital.ni.com/public.nsf/allkb/44F329CAC8052A0386256F0F0050F196 ) speaking to this on a somewhat simpler basis, but could not derive my solution from it. Does it involve including the .h files in \cintools\ or actually within extcode.h itself? I'm imagining that this may be a no-go anyway; I just wanted to know for sure. And no, re-programming the original DLL so that the structures are decomposed within the functions themselves won't be an option at this point.

Anyway, many thanks if anyone can help, even if it's simply directing me to the right help documentation (or telling me I'm trying to do something that is currently impossible).


You do not say which LabVIEW version you have, but in 8.2 and 8.5 there is an Import Library Wizard which can possibly help here. It's not a full-blown C compiler or anything like that, so it can hiccup on very complicated header files, but for quite a lot of header files it will usually work.

This wizard analyses the header file, detects which functions it declares, and creates VIs to import those functions from the DLL in question. This should already get you quite a bit further, even though it might not give you a complete VI interface to your DLL. Also note that LabVIEW 8.5 has an improved Import Library Wizard that deals quite a bit better with more complex parameters such as structures.

The post you refer to is about complex return values for functions in LabVIEW-generated DLLs and has nothing to do with calling external DLLs from LabVIEW.

Rolf Kalbermatter


Message Edited by rolfk on 05-28-2008 08:26 AM
Rolf Kalbermatter
My Blog
0 Kudos
Message 4 of 12
(3,961 Views)

Hi Jared,

      In my experience, it's often not possible to simply pass a LabVIEW cluster for interpretation as a struct.  Classic Windows DLLs (such as USER32.dll) expect Little Endian format for numeric types, while LabVIEW uses Big Endian (I doubt the Call Library Function can be made "smart" enough to automatically "fix" this in calls to arbitrary DLLs; as of LV 7.1 it wasn't).  Some DLLs (compilers) need to have data stored on even or quad-byte boundaries - so where LabVIEW will pass a U8 immediately followed by the second element in a cluster, the DLL may EXPECT byte padding after a single char, to "align" the struct elements! :O
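To show the alignment issue with a made-up struct (not one from the poster's header), compare a packed and a default-aligned version of the same two fields in C:

    /* Alignment illustration - these structs are made up, not from any real header. */
    #include <stdio.h>

    #pragma pack(push, 1)
    typedef struct { char tag; int value; } PackedExample;   /* 1 + 4 = 5 bytes       */
    #pragma pack(pop)

    typedef struct { char tag; int value; } AlignedExample;  /* typically 8 bytes:    */
                                                              /* 3 pad bytes after tag */
    int main(void)
    {
        printf("packed : %zu bytes\n", sizeof(PackedExample));   /* usually prints 5 */
        printf("aligned: %zu bytes\n", sizeof(AlignedExample));  /* usually prints 8 */
        return 0;
    }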

Cheers!

"Inside every large program is a small program struggling to get out." (attributed to Tony Hoare)
0 Kudos
Message 5 of 12
(3,924 Views)


@tbd wrote:

Hi Jared,

      In my experience, it's often not possible to simply pass a LabVIEW cluster for interpretation as a struct.  Classic Windows DLLs (such as USER32.dll) expect Little Endian format for numeric types, while LabVIEW uses Big Endian (I doubt the Call Library Function can be made "smart" enough to automatically "fix" this in calls to arbitrary DLLs; as of LV 7.1 it wasn't).  Some DLLs (compilers) need to have data stored on even or quad-byte boundaries - so where LabVIEW will pass a U8 immediately followed by the second element in a cluster, the DLL may EXPECT byte padding after a single char, to "align" the struct elements! :O



In memory, LabVIEW uses whatever endianness is preferred by the current platform. The problem you describe only happens if you Flatten or Typecast between stream data (a byte array) and 2-, 4-, or 8-byte integers, since the LabVIEW Flatten and Typecast functions by default assume Big Endian on the byte stream side and perform byte and word swapping for integers of 2 bytes and larger.

The Flatten/Unflatten functions now have an endianness selector input, but it is often not so useful in this respect since an additional int32 length parameter is prepended on the byte stream side.

And as long as you can avoid the use of the Typecast function, there is no endianness problem at all in conjunction with the Call Library Node. Consequently, it's not the task of the Call Library Node to be smart about endianness, since the problem is not its own at all.

Byte padding is a problem, yes, but simply one you need to be aware of. It's always possible to emulate a higher byte alignment than the one your own system uses, but never a lower one. Since LabVIEW uses packed (byte-aligned) clusters, you can emulate any other alignment by adding dummy fillers, whereas with the opposite (if LabVIEW, for instance, always aligned to 32 bits) it would not be possible to interface to packed, byte-aligned DLLs.
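In C terms, the "dummy filler" trick looks like this (made-up struct, not from the poster's header): the DLL's compiler lays out its struct with padding, and the packed description LabVIEW builds has to spell that padding out explicitly so the field offsets match:

    /* What the DLL's compiler sees - default alignment inserts 3 pad bytes after 'tag'. */
    typedef struct {
        char tag;
        int  value;
    } DllStruct;                      /* typically 8 bytes */

    /* The packed (byte-aligned) equivalent, as a LabVIEW cluster would be laid out,
       with the padding added as an explicit dummy filler element. */
    #pragma pack(push, 1)
    typedef struct {
        char          tag;
        unsigned char pad[3];         /* dummy filler to reach the DLL's 4-byte alignment */
        int           value;
    } PackedEquivalent;               /* also 8 bytes - field offsets now match DllStruct */
    #pragma pack(pop)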

Rolf Kalbermatter


Message Edited by rolfk on 05-29-2008 11:52 PM
Rolf Kalbermatter
My Blog
Message 6 of 12
(3,908 Views)
 

rolfk wrote:

In memory, LabVIEW uses whatever endianness is preferred by the current platform. The problem you describe only happens if you Flatten or Typecast between stream data (a byte array) and 2-, 4-, or 8-byte integers, since the LabVIEW Flatten and Typecast functions by default assume Big Endian on the byte stream side and perform byte and word swapping for integers of 2 bytes and larger.

The Flatten/Unflatten functions now have an endianness selector input, but it is often not so useful in this respect since an additional int32 length parameter is prepended on the byte stream side.

And as long as you can avoid the use of the Typecast function, there is no endianness problem at all in conjunction with the Call Library Node. Consequently, it's not the task of the Call Library Node to be smart about endianness, since the problem is not its own at all.

Byte padding is a problem, yes, but simply one you need to be aware of. It's always possible to emulate a higher byte alignment than the one your own system uses, but never a lower one. Since LabVIEW uses packed (byte-aligned) clusters, you can emulate any other alignment by adding dummy fillers, whereas with the opposite (if LabVIEW, for instance, always aligned to 32 bits) it would not be possible to interface to packed, byte-aligned DLLs.

Rolf Kalbermatter


Re: Endianness, thanks for straightening me out (embarrassing as it may be) :P
      Your reply refers to integers explicitly, though - aren't flattened and cast floats also byte-swapped?
 
Cheers.
"Inside every large program is a small program struggling to get out." (attributed to Tony Hoare)
0 Kudos
Message 7 of 12
(3,899 Views)


@tbd wrote:

Your reply refers to integers explicitly, though - aren't flattened and cast floats also byte-swapped?


Ahh, well! Their byte order in memory is normally also dependent on the native endianness of the CPU/OS, but LabVIEW's Flatten/Unflatten and Typecast seem to leave them alone for some strange reason.
If byte swapping is required for them during flattening, you first have to typecast them into an integer of the same size and then Flatten/Typecast so that they get byte-swapped. The same applies, in reverse order, to Unflatten/Typecast when going from the byte stream to the numeric format.
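In C, that trick amounts to reinterpreting the float's bits as a same-sized integer and swapping that (just an illustration of the idea, not LabVIEW code):

    /* Reinterpret an IEEE-754 single as a U32, then reverse its byte order. */
    #include <stdint.h>
    #include <string.h>

    static uint32_t swap32(uint32_t v)
    {
        return (v >> 24) | ((v >> 8) & 0x0000FF00u) |
               ((v << 8) & 0x00FF0000u) | (v << 24);
    }

    uint32_t float_bytes_swapped(float f)
    {
        uint32_t bits;
        memcpy(&bits, &f, sizeof bits);   /* "typecast" the float to an integer of the same size */
        return swap32(bits);              /* byte-swap, e.g. little-endian to big-endian         */
    }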

Rolf Kalbermatter
Rolf Kalbermatter
My Blog
0 Kudos
Message 8 of 12
(3,887 Views)

Thanks everyone for the useful information.

Having looked a little deeper at the header files (attached here with the DLL), I don't think I will attempt this method for this application in the end. Though I was only looking to use a subset of the functions, I'm not sure it will be worth the bother. Essentially I wanted to be able to load a specific file format (from Cambridge Electronic Design) in LabVIEW, but I think I will use another approach and export the data in a simpler format first.

Nonetheless, I did try a few things. I found that the Import Library Wizard just completely barfs, as it doesn't seem to get what it needs from the .h files. Some of the structures are structures within structures, and the files don't seem to have function declarations that the wizard is able to use. Also, trying to use the Call Library Function returns the simple message "The configured return type is not a valid return type. The type is being changed to void." I wasn't quite sure how I could get it to return a simple byte array that I could parse myself, but anyway, as I said, I may not pursue this any further. Nonetheless, the discussion is good for future use by me and by others who may find it. And just to note, I do try to do my homework, but I must admit that finding things of interest in the LabVIEW help is often like trying to find a needle in a haystack, given how extensive it is. :)   ... I suppose I should've been able to find the Import Library Wizard, though (I think my initial attempt at this was prior to upgrading my LabVIEW version).

Thanks for all your comments!

0 Kudos
Message 9 of 12
(3,873 Views)
peter-biomed,

Looking at your .h files, I don't think Son.h has any chance of being called from LabVIEW without a wrapper DLL of some sort.  The other header file is also going to have issues because of the nonstandard data types (TSTime, TSONTimeDate, TFileComment, etc.).

Your best bet for these would indeed be a wrapper DLL, as you will most likely not be able to get them to work in LabVIEW otherwise.  Or, as you mentioned, another method.
Regards,

Jared Boothe
Staff Hardware Engineer
National Instruments
0 Kudos
Message 10 of 12
(3,820 Views)