I found this issue baffling:
UUT Readback: Failed
Comparison Type: EQ (==)
Module Time: 0.0414499
I did a numeric limit test on a UUT readback...then realized that even though the hex number is right, the test always fails...what gives?
So what I do is: I read information from another part of the UUT at 4 memory locations, and it spits that out in 8-bit chunks (let's call these 4 values A, B, C, D). My Numeric Limit Test compares all the information at once against a 32-bit word from yet another memory location (value Z), so I convert each UUT memory-location readback to a hex string, then use the "+" operator so the strings are appended instead of arithmetically added.
At the end I get "99" + "9f" + "ff" + "df" = "999fffdf", i.e. A+B+C+D = "999fffdf". Then I do Val("0x" + "999fffdf") and get a hex number out. However, I have noticed in my investigation of this failure that the converted A+B+C+D is always an unsigned number. So when TestStand does the compare against the value read back from Z, even though the hex digits are the same, the test always fails. I have tried setting the values to unsigned numeric format and it does nothing; the test still fails.
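To illustrate the mismatch, here is a Python sketch. The byte values A–D are taken from the example above; that Z is read back as a *signed* 32-bit value is my assumption about what the UUT returns:

```python
# Byte values A, B, C, D from the example readback above.
a, b, c, d = 0x99, 0x9F, 0xFF, 0xDF

# Concatenate the hex strings and parse, mirroring the "+" string
# append and Val("0x" + ...) in the TestStand expression.
hex_str = "{:02x}{:02x}{:02x}{:02x}".format(a, b, c, d)
unsigned = int(hex_str, 16)   # 0x999fffdf, always non-negative

# If Z is read back as a signed 32-bit value, the same bit pattern
# is negative, so an EQ compare against `unsigned` fails.
signed_z = unsigned - 0x100000000 if unsigned > 0x7FFFFFFF else unsigned

print(hex_str)     # 999fffdf
print(unsigned)    # 2577399775
print(signed_z)    # -1717567521
```

Same 32 bits, two different numeric values, and an EQ comparison sees only the values.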
Is there a way for me to convert a hex string to a hex number while keeping the sign?
I think you may need to try the following:
Locals.TestNumber = Int64(Val("0x"+ "999fffdf"))
Just be sure TestNumber is represented as a Signed 64-bit integer.
Good Morning Jared,
I should have mentioned I am on TestStand 4.2.
I am guessing the Int64() function is in TS 2010 or later? I cannot find any support for this call in 4.2.
After looking into this some more, I determined how to configure TestStand to successfully compare the hexadecimal values.
Under the Data Source Expression of the Numeric Limit Test, you must wrap the returned value in the Val() function.
This is necessary if you are returning an integer from LabVIEW.
The problem is that TestStand number variables are really floating-point numbers, so when you convert "0xFFFFFFFF", for example, it doesn't know you want a signed value. It assumes you want an unsigned value and converts it to the unsigned equivalent.
I do not think Val() will help you in this case. I think Jared was correctly getting a signed integer because that is the data type of his VI parameter.
I think the easiest thing for you to do is to write a code module that converts the string to a signed integer (or converts the unsigned value to its signed equivalent) rather than doing so directly in TestStand. If you return the signed value back to TestStand as a signed-integer parameter of a code module, TestStand will correctly store it in a TestStand Number variable. You can write a pretty simple VI or C or C# function to do the string conversion or the unsigned-to-signed conversion.
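As a sketch of what such a code module would do (shown here in Python for brevity; the function name is mine, and a real module would be a VI or a C/C# function returning a signed 32-bit parameter to TestStand):

```python
def hex_to_signed32(hex_str):
    """Parse a hex string and reinterpret the bits as a signed 32-bit value."""
    value = int(hex_str, 16) & 0xFFFFFFFF
    # If the sign bit (bit 31) is set, subtract 2^32 to get the negative value.
    return value - 0x100000000 if value & 0x80000000 else value

print(hex_to_signed32("999fffdf"))   # -1717567521
print(hex_to_signed32("7fffffff"))   # 2147483647
```

With this, the value stored back in the TestStand Number variable would match a signed 32-bit readback from Z.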
You might want to write an idea exchange post to recommend TestStand add a feature to support converting hex strings into signed-integers directly.
Hope this helps,
I know that this topic was started many years ago, but I ran into the same issue recently and was not able to find any solution.
Finally I was able to figure out a solution that works for me.
I present it here in case others find this topic in the future and it helps them.
My task was to convert a hex string which stores a value in 16-bit signed format.
The value stored in string is -265 (0xFEF8).
Step.Result.Numeric = Val("0x" + "FEF8"),
Step.Result.Numeric > 0x7FFF ? ( Float64(0xFFFFFFFFFFFF0000i64 + Int64(Step.Result.Numeric)) ) : Step.Result.Numeric
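The two expression steps above amount to a 16-bit sign extension, which can be sketched in Python like this (note that, as pointed out at the end of the thread, 0xFEF8 interpreted as signed 16-bit is actually -264, not -265):

```python
raw = int("FEF8", 16)   # Val("0x" + "FEF8") -> 65272

# Mirror the conditional: values above 0x7FFF have the 16-bit sign bit
# set, so subtracting 0x10000 yields the signed equivalent.
signed = raw - 0x10000 if raw > 0x7FFF else raw

print(signed)   # -264
```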
Here is my solution in a single line, converting a 2-byte hex value to a signed integer:
256*Locals.Byte_hi + Locals.Byte_lo - (Locals.Byte_hi > 0x7f ? 0x10000 : 0)
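A Python sketch of this one-liner, where byte_hi and byte_lo stand in for Locals.Byte_hi and Locals.Byte_lo:

```python
def bytes_to_signed16(byte_hi, byte_lo):
    """Combine two bytes into a signed 16-bit value, as the expression does."""
    # If the high byte has its top bit set, the 16-bit value is negative,
    # so subtract 0x10000.
    return 256 * byte_hi + byte_lo - (0x10000 if byte_hi > 0x7F else 0)

print(bytes_to_signed16(0xFE, 0xF8))   # -264
print(bytes_to_signed16(0x00, 0xFF))   # 255
```

The advantage over the two-step version is that it works directly on the two byte readbacks, with no intermediate string conversion.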
By the way, -265 is not 0xFEF8: as a signed 16-bit value, 0xFEF8 is -264 (and -265 would be 0xFEF7).