
Passing two hex numbers as a byte variable in dll calls

Solved!

I imported a DLL that worked well when called from C++ code.  However, a couple of the converted VIs did not function properly.  The original functions were supposed to write a set of two hex numbers (one byte in size) to a specified address in the form of two characters.  Somehow, after these functions were converted to VIs, they only wrote the first hex number and ignored the rest.  My speculation is that since this variable is declared as a byte type, the VIs might only take the first hex number as a character (also one byte in size) and ignore the following hex number.  Has anyone encountered similar issues?

Message 1 of 12

It's impossible for the DLL import wizard to always do the right thing, because a C function prototype does not carry enough information to determine the correct configuration automatically. For example, when one of the parameters to a function is a pointer to a char, LabVIEW has no way to know whether that's a pointer to a single character or a string. You need to go through the imported VIs, match them up with the DLL's documentation, and make sure that every call is configured properly.
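To illustrate the ambiguity, here are two hypothetical C functions (the names are made up, not from your DLL) that share the exact same prototype but mean completely different things by their `char *` parameter:

```c
#include <string.h>
#include <assert.h>

/* Both functions have the prototype "void f(char *p)".
   A wrapper generator looking only at the prototype cannot
   tell which semantics applies. */

/* Here p points to a SINGLE character (an out-parameter). */
void get_status_char(char *p) { *p = 'K'; }

/* Here p points to a NUL-terminated STRING buffer. */
void get_status_string(char *p) { strcpy(p, "OK"); }
```

This is exactly why the wizard's guess for a `char *` parameter sometimes has to be corrected by hand against the DLL's documentation.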

 

As for your specific issue, if you post the header file containing the function prototype, and any related documentation, we can help you configure a Call Library Function Node correctly for that function. It would be helpful to post the LabVIEW VI too, although if it's just the VI that the wizard created with no modifications, then that's less important.

Message 2 of 12

If the variable is a byte, then it is only going to take one byte.

 

You don't have two hexadecimal numbers; you actually have two hexadecimal digits.  You just need to convert them to a single byte.  Without seeing specific code as to how you have them called out in your VI (string, U8 integer), I can only give suggestions.

 

If they are string characters, convert them to a number using Hexadecimal String To Number.  Be sure to set the data type to U8.
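In C terms, the conversion amounts to the following sketch (the function name is mine, for illustration; LabVIEW's Hexadecimal String To Number does the equivalent):

```c
#include <stdlib.h>
#include <assert.h>

/* Combine two hex-digit characters, e.g. '3' and 'F', into one
   byte with the value 0x3F.  Accepts upper- or lower-case digits. */
unsigned char hex_pair_to_byte(const char digits[2]) {
    char tmp[3] = { digits[0], digits[1], '\0' };  /* NUL-terminate */
    return (unsigned char)strtoul(tmp, NULL, 16);  /* parse base 16 */
}
```

The point is that the two characters (two bytes of string data) collapse into one numeric byte, which is what a byte-sized parameter expects.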

 

If you were going to send this byte out to a VISA serial port, you could use Type Cast to send it out as a single-byte character.  Since you are sending it to a DLL, leaving it as a U8 might be fine.  But I can't be sure without seeing your code.

Message 3 of 12

Thank you so much for the suggestions.  I understand this is an unusual case and every automated process has its limitations.  And yes, it should be two hex digits, not two hex numbers.  Unfortunately, I don't have the expertise to make modifications at that level.  Please find the VI, DLL, and header file in the attachment.  I really appreciate your help.  I'd hate to go back to C programming, which would take me much longer to accomplish what LabVIEW can do.

Message 4 of 12

Sorry, here is the attachment.

Message 5 of 12

Do you have any documentation for this function? Your description of writing 2 "hex numbers" isn't sufficient to determine what the function is supposed to do, nor how you've determined that it isn't working.

Message 6 of 12

My apologies; I should have included the description.  This function basically writes a two-digit hex number to a device that is part of a larger device.  Therefore, it needs a slave address to specify this device and an address within this device to write the value to.  The manual does not explain all the details.  Here is what I have figured out.

 

slAddr: Slave address; there is some manipulation involved in getting this number, which is probably not important for this issue.

nAddrLen: Slave address length (size); 1 for 8 bit, 2 for 16 bit; It would be 8 bit for this device

nAddr: Address on this device the hex number is written to

nCnt: Hex number length (size); 1 for 8 bit, 2 for 16 bit; It would be 8 bit for this device

buf: Hex number input
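Pieced together from the parameter descriptions above, the function's prototype plausibly looks something like the sketch below.  The function name, exact types, and the stand-in body are my guesses for illustration only (the real DLL performs the actual hardware write):

```c
#include <assert.h>

static unsigned char last_written;  /* records what the stub "wrote" */

/* Hypothetical reconstruction of the write function's prototype. */
int WriteDevice(unsigned char slAddr,   /* slave address              */
                int nAddrLen,           /* 1 = 8-bit, 2 = 16-bit addr */
                unsigned int nAddr,     /* register address on device */
                int nCnt,               /* 1 = 8-bit, 2 = 16-bit data */
                unsigned char *buf)     /* pointer to the data byte(s)*/
{
    (void)slAddr; (void)nAddrLen; (void)nAddr; (void)nCnt;
    last_written = buf[0];  /* 8-bit write: one byte taken from buf */
    return 1;               /* the thread reports it always returns 1 (no error) */
}
```

Note that `buf` is a pointer parameter: whatever the CLFN passes there must be the *address* of the data byte, not the byte's value.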

 

I can only look at the response of this device to determine whether the function is working.  From the software side, it always returns 1, meaning no error.  I notice the device response is very small and only correlates to the first hex digit.  That is how I determine whether this function is working.  Given this situation, I will do all the testing and provide feedback.

 

Message 7 of 12
Solution
Accepted by topic author grasshopperbo

Configure the parameter in the CLFN to be a U8 integer instead of a string.

Message 8 of 12
Solution
Accepted by topic author grasshopperbo

@RavensFan wrote:

Configure the parameter in the CLFN to be a U8 integer instead of a string.


Sounds like the right solution to me.

 

It sounds like @grasshopperbo is trying to pass two characters, each of which is a byte. Since the function only expects one byte, it's ignoring the second byte. As RavensFan explained, the input should instead be an unsigned 8-bit integer. For the input, use a numeric control (or constant), make the radix visible, change it to hex, and enter the hex values that way. That way, the two hex digits will be combined into a single byte.

Message 9 of 12

Thank you all so much for your help.  After I set the input type to a U8 passed by pointer (this library function expects a pointer; passing by value will not work), it seems the library function is taking the whole input value.
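In C terms, the reason pass-by-value failed is that the function reads its data *through* the pointer, so the CLFN must be configured to pass the U8's address ("Pointer to Value").  A minimal sketch, with a hypothetical function name:

```c
#include <assert.h>

/* A function declared as taking "const unsigned char *buf"
   dereferences the pointer; handing it the byte's value instead
   of its address would make it dereference a garbage pointer. */
unsigned char first_data_byte(const unsigned char *buf) {
    return buf[0];  /* reads through the pointer the caller supplied */
}
```

This mirrors the CLFN setting: the parameter is a Numeric of type U8, passed as Pointer to Value.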

Message 10 of 12