Converting a string representing hex into a string of the corresponding ASCII characters

So far none of the answers posted (to my eyes) have fulfilled the original poster's request.  DanB83 simply said he had a string of hex values.  He wanted a string of ASCII characters "that those values represent".  The only assumption I made, from the sample input and output data, is that each pair of chars in his input string was the hex representation of a char in the output.

 

Where did the examples get these input arrays of strings, set in LabVIEW to hex display mode?  That's not the same thing at all.

 

So, here's my submission; feel free to critique it.  I did verify it produces the output the OP used in his example:

(attached snippet: parse.PNG)
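
For anyone reading along without the attachment, the pairwise idea boils down to something like this C sketch (my own illustration with made-up sample data, not the VI itself):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Every two hex characters in the input become one ASCII character in
   the output.  "48656C6C6F" is just an example input of my own. */
int main(void)
{
    const char *hex = "48656C6C6F";
    size_t len = strlen(hex);
    char out[32] = {0};

    for (size_t i = 0; i + 1 < len; i += 2) {
        char pair[3] = { hex[i], hex[i + 1], '\0' };
        out[i / 2] = (char)strtol(pair, NULL, 16);   /* one pair -> one byte */
    }

    printf("%s\n", out);   /* prints "Hello" */
    return 0;
}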

David Boyd
Sr. Test Engineer
Abbott Labs
(lapsed) Certified LabVIEW Developer
Message 11 of 13

David,

 

Before things got off track, GerdW posted a very nice solution, hex2hex.vi.  It looks a bit like yours.  Here's a PNG of the block diagram:

 

(attached image: hex2hex[1]_BD.png)
Message 12 of 13

Oops, my mistake.  I did download GerdW's second posting, but missed seeing his first one.  And yes, his solution looks a bit like mine, though reforming the input string into an intermediate array of strings feels less efficient.

 

I always prefer to use Scan From String over all those type-specific string conversion primitives on the String/Number Conversion palette.  I need those error in/out terminals, and I think the C-style format specifier documents better visually.  Same goes for Format Into String.  But that's just my preference.  And the other primitives do have polymorphic terminals, so that could be advantageous in the right circumstances.
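
In C terms, the comparison I have in mind is roughly the following sketch (the primitives obviously aren't sscanf/sprintf, but the format strings read the same way):

#include <stdio.h>

/* The format string documents the conversion at a glance, and the
   return-value check stands in for the error out terminal. */
int main(void)
{
    unsigned int value = 0;

    /* Like Scan From String with a "%2x" specifier. */
    if (sscanf("48", "%2x", &value) != 1) {
        fprintf(stderr, "scan failed\n");
        return 1;
    }

    /* Like Format Into String with a "%02X" specifier. */
    char formatted[8];
    sprintf(formatted, "%02X", value);
    printf("%s -> '%c'\n", formatted, (char)value);   /* 48 -> 'H' */
    return 0;
}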

 

Sorry for the rambling - and I really don't mean to come off as grumpy or picky.

 

Dave

David Boyd
Sr. Test Engineer
Abbott Labs
(lapsed) Certified LabVIEW Developer
Message 13 of 13