
Error when converting binary string

Hi there,

 

In the attached screenshot, the VI works normally with a manually entered binary string and converts it to ASCII text without any issue. However, when using the myDAQ to receive data from a transmitter, the conversion fails with an error. How can I fix this and get my ASCII text output correctly?

 

Regards,

Christopher

Message 1 of 11

Hi lad,

 

it really would help to make the display style indicator visible in ALL string controls, indicators and constants!

(Or to attach your VI with some useful data set to default…)

 

The error message is quite clear: what have you tried to solve the issue so far?

 

(I guess there is a wild mix of "binary string" and "ASCII string" in your VI. Display style really matters for strings…)

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 11

Hi there,

 

I've made a system which functions as a receiver connected to a myDAQ transmitter. It receives a repeating binary string of 0s and 1s, for example:

011111101100011000010110010011101001011011001110000001000111111011000110000101100100111010010110110011100000010001111110

Each frame begins with a start character, ~, whose binary representation is 01111110, and this start flag is always detected first. After it is found, a Match Pattern function reads everything following the start flag, and that part works fine. However, once the start flag has been detected, the conversion of the remaining bits to text does not work. I have attached the VI image.
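LabVIEW is graphical, so the framing described above can only be sketched here in a text language. The following Python sketch assumes the transmitter sends each byte LSB-first, which matches the sample string: reversing each 8-bit group after the 01111110 flag yields 'c', 'h', 'r', 'i', 's'. The function name and the stop-at-next-flag behavior are illustrative assumptions, not the actual VI.

```python
START_FLAG = "01111110"  # '~' (0x7E); a palindrome, so bit order does not change it

def decode_frame(bits: str) -> str:
    """Skip to the start flag, then turn each following 8-bit group
    into a character, assuming LSB-first bytes. Stops at the next flag."""
    start = bits.find(START_FLAG)
    if start < 0:
        raise ValueError("start flag not found")
    payload = bits[start + 8:]
    chars = []
    for i in range(0, len(payload) - len(payload) % 8, 8):
        byte = payload[i:i + 8]
        if byte == START_FLAG:                 # next frame begins here
            break
        chars.append(chr(int(byte[::-1], 2)))  # reverse: LSB was sent first
    return "".join(chars)

sample = ("01111110110001100001011001001110"
          "10010110110011100000010001111110")
print(decode_frame(sample).strip())
```

Running this on the first frame of the sample string recovers the word "chris" (plus one trailing byte before the next flag, which this sketch keeps as-is).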

Message 3 of 11

Hi lad,

 

why do you start a new thread instead of answering my questions and following the suggestions?

Still no display style indicators for all strings…

Best regards,
GerdW


Message 4 of 11

Oh, okay, sorry. I think I've narrowed the issue down to a small section of the VI, and I have no idea what's wrong with it. It's attached; every time I run it I get an error, even though it should be receiving what it expects.

Message 5 of 11

Hi lad,

 

would you please attach a downconverted version of your VI? (LV2017 would be fine.)

Best regards,
GerdW


Message 6 of 11

It is attached.

Message 7 of 11

Hi lad,

 

just handle the error out of ScanFromString and you're fine…

check.png
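The error wiring lives in the attached VI image, but the idea can be sketched in a text language. This Python analogue (an assumption-laden sketch, not the actual block diagram) shows the equivalent of handling Scan From String's "error out": a chunk that fails to parse ends the loop cleanly instead of aborting the whole conversion.

```python
def bits_to_text(bits: str) -> str:
    """Convert a string of '0'/'1' characters to ASCII text, 8 bits at a
    time (MSB-first here). A chunk that fails to parse - e.g. it contains
    a non-binary character - ends the loop instead of raising an error,
    mirroring the "handle the error out" advice above."""
    chars = []
    for i in range(0, len(bits) - len(bits) % 8, 8):  # skip trailing partial byte
        chunk = bits[i:i + 8]
        try:
            chars.append(chr(int(chunk, 2)))  # like Scan From String with a binary format
        except ValueError:                    # invalid data: handle, don't propagate
            break
    return "".join(chars)
```

For example, `bits_to_text("0110001101101000")` returns "ch", while a stray non-binary character simply ends the conversion early instead of producing an error dialog.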

 

Or simplify your code:

check.png

 

Best regards,
GerdW


Message 8 of 11

Thank you very much. But now, when I implement this in the program which reads 1000 samples of whatever word you input (in this case, "chris"), it doesn't read the word properly. The program you simplified works perfectly for a manually entered string of the correct length, but with my 1000-sample transmitter I just end up getting garbage. The screenshot is attached, with the output I receive.

Message 9 of 11

Hi lad,

 

for the next time: please make the "display style" indicator visible for ALL string controls, indicators and constants!

(Right-click the string -> Visible Items -> Display Style. For each of them!)

 

Where does that "Binary output" come from?

Best regards,
GerdW


Message 10 of 11