07-23-2009 01:14 PM
So far none of the answers posted (to my eyes) have fulfilled the original poster's request. DanB83 simply said he had a string of hex values. He wanted a string of ASCII characters "that those values represent". The only assumption I made, from the sample input and output data, is that each pair of chars in his input string was the hex representation of one char in the output.
Where did the examples get these input arrays of strings set to hex display mode in LabVIEW? That's not the same thing at all.
So, here's my submission, feel free to critique it. I did verify it produces the output the OP used in his example:
07-23-2009 01:20 PM
David,
Before things got off track, GerdW posted a very nice solution, hex2hex.vi. It looks a bit like yours. Here's a .PNG:
07-23-2009 01:58 PM
Oops, my mistake. I did download GerdW's second posting, but missed seeing his first one. And yes, his solution looks a bit like mine, though reforming the input string into an intermediate array of strings feels less efficient.
I always prefer to use Scan From String over all those type-specific string conversion primitives on the String/Number Conversion palette. I need those error in/out terminals, and I think the C-style format specifier documents better visually. Same goes for Format Into String. But that's just my preference. And the other primitives do have polymorphic terminals, so that could be advantageous in the right circumstances.
Sorry for the rambling - and I really don't mean to come off as grumpy or picky.
Dave