12-27-2011 01:39 AM
I have a string of arbitrary length - say, 1AB1C0 - in normal format. I would like to have a function which takes that string in, and outputs the same exact characters, but in hex format. So the input will read 1AB1C0 in normal string format, and the output will read 1AB1C0 in hex string format.
I am doing this because I have found the end device works better if I send the bytes as a hex string rather than as an array of U8. Furthermore, I manipulate a lot of byte streams in my program, and I find the string parsing tools less cumbersome than the byte array parsing tools - however, they only work on strings as they appear in normal mode. So I have reasons in different parts of my program to have the string in both normal and hex formats, but I cannot figure out any remotely elegant way to cast from one to the other.
12-27-2011 01:52 AM
Hi,
You can just right-click on the string indicator and select 'Hex Display', or you can use a property node of the string to change it programmatically 🙂 see this.
12-27-2011 03:47 AM - edited 12-27-2011 03:47 AM
@CraigRem wrote:
So the input will read 1AB1C0 in normal string format, and the output will read 1AB1C0 in hex string format.
Well, you need to scan the hex-formatted string two characters at a time into a U8 numeric and then cast it back to a string. There are many ways to do that; here are two possibilities. (Make sure to use the correct representations as labeled.)
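Since LabVIEW diagrams can't be pasted as text, here is a rough text-language sketch of the same idea in Python (the function names `hex_text_to_raw` and `raw_to_hex_text` are just illustrative): take two hex characters at a time, convert each pair to a byte value (the U8 step), and assemble those bytes back into a raw string. The raw result is what a hex-mode string display would then show as `1AB1C0`.

```python
def hex_text_to_raw(s):
    """Turn hex-as-printable-text, e.g. '1AB1C0', into the raw bytes
    b'\x1a\xb1\xc0' by scanning two characters at a time."""
    return bytes(int(s[i:i + 2], 16) for i in range(0, len(s), 2))

def raw_to_hex_text(b):
    """Inverse direction: raw bytes back to printable uppercase hex text."""
    return b.hex().upper()
```

For example, `hex_text_to_raw("1AB1C0")` yields the three bytes `0x1A`, `0xB1`, `0xC0`, and `raw_to_hex_text` recovers the original `"1AB1C0"`.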
(I still don't quite understand your reasoning why you need this).
12-27-2011 10:30 AM
I was missing the U16-to-U8 down-conversion. Thanks!
03-14-2012 11:07 AM - edited 03-14-2012 11:09 AM
Thanks a lot!