
Calling DLL pointers from C++: input parameter configuration

In C++ the function is declared as external:

extern XAPI int FCCRead(int channelId, char *buf, int nbytes, FCCDataType *);

// Calling it in c++ 
int n;
char line[FCC_MAX_LINE_LENGTH+1]; 
FCCDataType dataType;

n = FCCRead(mChannelId, line, n, &dataType);

I'd like to call the same DLL function from LabVIEW, but I have difficulty configuring the buf, nbytes and FCCDataType parameters in the Call Library Function Node.

 

I have created a type def for FCCDataType in LabVIEW. It's just an enum.

 

What parameters should be set for buf, nbytes and FCCDataType?

 

 

[Attachment b.jpg: screenshot of the Call Library Function Node parameter configuration]

 

 

Message 1 of 6

You don't show the typedef for FCCDataType, so there is no way to guess how it should be configured.

 

Basically the configuration of the other three parameters is correct, except that you will want to set the numeric type of the array to an (unsigned) 8-bit integer to match the char datatype. The Adapt to Type setting for FCCDataType is most likely wrong, as that passes the LabVIEW datatype to the DLL. For simple datatypes you are better off configuring the explicit datatype, and for complex datatypes that contain anything like strings and/or arrays the LabVIEW datatype does NOT match what a C program would expect.

Also, do not forget to preallocate the buf array on the diagram with an Initialize Array function to the size (+ 1) that you pass in the nbytes parameter, or alternatively set the minimum size in the parameter configuration equal to nbytes if the +1 in the buffer allocation isn't really necessary. I can't say whether that would be OK without reading the documentation of that function.
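
As a rough C-side sketch of what the suggested configuration corresponds to (the CLFN settings in the comments are my reading of the advice above, the pointer types reflect how LabVIEW passes the data rather than the original header, and FCC_MAX_LINE_LENGTH is given a placeholder value only to make the snippet compile):

#include <stdint.h>

#define FCC_MAX_LINE_LENGTH 256     /* placeholder; the real value comes from the vendor header */

/* Mirror of the DLL prototype, annotated with the corresponding CLFN settings. */
extern int32_t FCCRead(int32_t channelId,   /* Numeric, signed 32-bit, pass by Value        */
                       uint8_t *buf,        /* Array of U8, pass as Array Data Pointer      */
                       int32_t nbytes,      /* Numeric, signed 32-bit, pass by Value        */
                       uint32_t *dataType); /* Numeric, unsigned 32-bit, Pointer to Value   */

void example(int32_t channelId)
{
    uint8_t  buf[FCC_MAX_LINE_LENGTH + 1];  /* Initialize Array on the diagram does this    */
    uint32_t dataType = 0;
    int32_t  n = FCCRead(channelId, buf, FCC_MAX_LINE_LENGTH, &dataType);
    (void)n;                                /* meaning of the return value depends on the DLL's documentation */
}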

 

And your code example has an error:

extern XAPI int FCCRead(int channelId, char *buf, int nbytes, FCCDataType *);

// Calling it in c++ 
int n = FCC_MAX_LINE_LENGTH;   // n must be initialized before it is passed as nbytes
char line[FCC_MAX_LINE_LENGTH+1]; 
FCCDataType dataType;

n = FCCRead(mChannelId, line, n, &dataType);

 

Rolf Kalbermatter
My Blog
Message 2 of 6

The code is wrong here, but not in the real application; I didn't include the surrounding while loop.

extern XAPI int FCCRead(int channelId, char *buf, int nbytes, FCCDataType *);

// Calling it in c++ 
int n;
char line[FCC_MAX_LINE_LENGTH+1]; 
FCCDataType dataType;
while ((n = FCCNRead(mChannelId)) > 0)
{
    n = FCCRead(mChannelId, line, n, &dataType);
    ...
}

 

FCCDataType is an enum 

typedef enum {
    FCC_STATE_IDLE,
    FCC_DATA_BLOCK,
    FCC_DATA_TUNNEL,
    ....
} FCCDataType;

I created this in LabVIEW as a type def enum control.

 

 

Message 3 of 6

Honestly, for integrating LabVIEW and C++ code, it's sometimes just easier to write a LabVIEW-friendly wrapper DLL that makes the calls into your other DLL.

These days I find I spend less time monkeying with data types if I handle the struct typedefs entirely in C/C++. Everything that gets passed into my wrapper DLLs from LabVIEW is either a string, an integer, or an array of unsigned 8-bit ints (that last one is particularly handy).

You can still use the LabVIEW type def for the FCC state on the UI side; just have it exchange a string with your LabVIEW-friendly wrapper DLL, where the correct typedef is set based on that string. It's an additional hop for the data, and it does waste some cycles, but it may save you a bunch of development time.
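
As a hedged illustration of that approach (the name FCCReadWrapper and the string mapping are made up for the example; only FCCRead and FCCDataType come from the original post, and since FCCDataType is an output of FCCRead the conversion here runs from the enum to a string on the way out), a wrapper exported from a small C DLL might look like this:

#include <stdint.h>
#include <string.h>

/* Subset of the vendor definitions from the post; the real header has more
   enum values and an XAPI export/calling-convention macro that is omitted here. */
typedef enum { FCC_STATE_IDLE, FCC_DATA_BLOCK, FCC_DATA_TUNNEL } FCCDataType;
extern int FCCRead(int channelId, char *buf, int nbytes, FCCDataType *dataType);

/* Hypothetical LabVIEW-friendly export (MSVC-style): only integers, a U8 buffer
   and a caller-allocated string cross the boundary, so the CLFN setup stays simple. */
__declspec(dllexport) int32_t FCCReadWrapper(int32_t channelId,
                                             uint8_t *buf, int32_t nbytes,
                                             char *typeName, int32_t typeNameLen)
{
    FCCDataType dataType = FCC_STATE_IDLE;
    int n = FCCRead(channelId, (char *)buf, nbytes, &dataType);

    /* Translate the enum into a string LabVIEW can match against its type def. */
    const char *name = (dataType == FCC_DATA_BLOCK)  ? "FCC_DATA_BLOCK"  :
                       (dataType == FCC_DATA_TUNNEL) ? "FCC_DATA_TUNNEL" :
                                                       "FCC_STATE_IDLE";
    strncpy(typeName, name, (size_t)typeNameLen - 1);
    typeName[typeNameLen - 1] = '\0';
    return n;
}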

Message 4 of 6

@aan928 wrote:

 

 

FCCDataType is an enum 

typedef enum {
FCC_STATE_IDLE, 
FCC_DATA_BLOCK,
FCC_DATA_TUNNEL,
.... 
} FCCDataType;

I created this in LabVIEW using TypeDef in a Control. 


While I fully advocate creating an intermediate DLL to ease the adaptation between LabVIEW and a C DLL, it isn't really necessary in this case.

 

Basically, an enum is an integer. There is a chance that Adapt to Type works for this case too, but I find it clearer to use the actual basic integer datatype. It should work if you configure the parameter as a Numeric, unsigned 32-bit integer, passed as Pointer to Value (because of the pointer in the prototype). Even more correct would be to configure it with the smallest unsigned integer size that can hold the maximum enum value: if the maximum numeric enum value is not bigger than 255, make it an 8-bit unsigned int, and if it is <= 65535, make it a 16-bit unsigned int.
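
A minimal sketch of why that works, assuming sizeof(FCCDataType) is 4 on the platform in question (report_type below is a stand-in for the DLL, not part of the FCC API, and the enum shows only the values quoted above):

#include <stdint.h>
#include <stdio.h>

typedef enum { FCC_STATE_IDLE, FCC_DATA_BLOCK, FCC_DATA_TUNNEL } FCCDataType;

/* Stand-in for the DLL: it simply writes an integer through the pointer. */
static void report_type(FCCDataType *out)
{
    *out = FCC_DATA_BLOCK;
}

int main(void)
{
    /* LabVIEW-side view of the same parameter: an unsigned 32-bit integer
       passed as Pointer to Value. The cast mirrors that reinterpretation
       and is only valid when the enum really is 32 bits wide. */
    uint32_t received = 0;
    report_type((FCCDataType *)&received);
    printf("enum value received as U32: %u\n", (unsigned)received);
    return 0;
}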

 

At least Visual C tends to pack enums into as small an integer as possible (but then applies alignment rules that assume an int32_t when the enum sits inside a struct). The exact size at machine level really is compiler specific; the C standard says nothing about it other than that the individual enumeration constants shall be integer constants, while the enum type itself shall be compatible with char, a signed integer type, or an unsigned integer type. Note that this only holds for integer types up to 32 bits. There is no provision for 64-bit enums in the C standard, since the 64-bit integer is strictly speaking an extension rather than a basic integer type. Some C compilers might support 64-bit enums, but relying on that is asking for trouble if the code ever gets compiled by a different compiler.
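
If in doubt, a quick way to see what a particular compiler actually does is to print the size; this little check is my addition, and on typical desktop compilers it prints 4:

#include <stdio.h>

typedef enum { FCC_STATE_IDLE, FCC_DATA_BLOCK, FCC_DATA_TUNNEL } FCCDataType;  /* subset of the real enum */

int main(void)
{
    /* The printed size tells you which integer type to pick in the
       Call Library Function Node configuration. */
    printf("sizeof(FCCDataType) = %zu\n", sizeof(FCCDataType));
    return 0;
}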

Rolf Kalbermatter
My Blog
Message 5 of 6

Thank you for your great explanation and helpful comments. I will test the code at the end of this week after some off-site development.

Message 6 of 6