06-06-2016 03:37 PM
I imported a DLL that worked well when called from C++ code. However, a couple of the converted VIs did not function properly. The original functions were supposed to write a set of two hex numbers (one byte in size) to a specified address, in the form of two characters. Somehow, after these functions were converted to VIs, they only wrote the first hex number and ignored the rest. My speculation is that since this variable is declared as a byte type, the VIs may take only the first hex number as a character (also one byte in size) and ignore the following one. Has anyone encountered similar issues?
Solved! Go to Solution.
06-06-2016 04:00 PM
It's impossible for the DLL import wizard to always do the right thing, because a C function prototype does not carry enough information to determine every configuration detail automatically. For example, when one of the parameters to a function is a pointer to a char, LabVIEW has no way to know whether that's a pointer to a single character or to a string. You need to go through the imported VIs, match them up with the DLL's documentation, and make sure that every call is configured properly.
As for your specific issue, if you post the header file containing the function prototype, and any related documentation, we can help you configure a Call Library Function Node correctly for that function. It would be helpful to post the LabVIEW VI too, although if it's just the VI that the wizard created with no modifications, then that's less important.
06-06-2016 04:04 PM
If the variable is a byte, then it is only going to take one byte.
You don't have 2 hexadecimal numbers, you actually have two hexadecimal digits. You just need to convert them to a single byte. Without seeing exactly how you have them declared in your VI (string, U8 integer), I can only offer suggestions.
If they are string characters, convert them to a number using Hexadecimal String To Number. Be sure to set the data type to U8.
If you were going to send this byte out to a VISA serial port, you could use Type Cast to send it out as a single-byte character. Since you are sending it to a DLL, leaving it as a U8 might be fine. But I can't be sure without seeing your code.
06-06-2016 05:08 PM
Thank you so much for the suggestions. I understand this is an unusual case and every automation process has its limitations. And yes, it should be two hex digits, not two hex numbers. Unfortunately, I don't have the expertise to make modifications at that level. Please find the VI, DLL, and header file in the attachment. I really appreciate your help. I'd hate to go back to C programming, which would take me much longer to accomplish what LabVIEW can do.
06-06-2016 05:09 PM
Sorry, here is the attachment.
06-06-2016 05:19 PM
Do you have any documentation for this function? Your description of writing 2 "hex numbers" isn't sufficient to determine what the function is supposed to do, nor how you've determined that it isn't working.
06-06-2016 06:42 PM
My apologies, I should have included the description. This function basically writes a two-digit hex number to a device that is part of a larger device. Therefore, it needs a slave address to specify the device and an address within that device to write the value to. The manual does not explain all the details. Here is what I have figured out:
slAddr: Slave address. Some manipulation is required to get this number, which is probably not important for this issue.
nAddrLen: Slave address length (size); 1 for 8-bit, 2 for 16-bit. It would be 8-bit for this device.
nAddr: Address on the device that the hex number is written to.
nCnt: Hex number length (size); 1 for 8-bit, 2 for 16-bit. It would be 8-bit for this device.
buf: Hex number input.
I can only determine whether it is working by looking at the device's response. On the software side, the function always returns 1, meaning no error. I noticed the device's response is very small and correlates only with the first hex digit. That is how I determined the function is not working correctly. Given this situation, I will do all the testing and provide you feedback.
06-06-2016 07:58 PM
Configure the parameter in the CLFN to be a U8 integer instead of a string.
06-07-2016 02:59 PM
@RavensFan wrote:Configure the parameter in the CLFN to be a U8 integer instead of a string.
Sounds like the right solution to me.
It sounds like @grasshopperbo is trying to pass two characters, each of which is a byte. Since the function only expects one byte, it ignores the second byte. As RavensFan explained, the input should instead be an unsigned 8-bit integer. For the input, use a numeric control (or constant), make the radix visible, change it to hex, and enter the hex value that way. The two hex digits will then be combined into a single byte.
06-07-2016 04:34 PM
Thank you all so much for your help. After I set the input type to a U8 passed by pointer (this library function expects a pointer; passing by value will not work), the library function appears to take the whole input value.