
RS-232 Data Dropping

Solved!

Hello,

 

I am trying to collect data from an oxygen sensor that sends data over RS-232.  My computer is connected to the sensor through a USB-to-RS-232 cable, and I am using the attached subVI.  The VI sampling period and the sensor sampling period are both 200 ms.  However, at times I get a glitch of some sort that makes the incoming data drop to 0, as seen in the plot below.  I probed "Bytes at Port" and noticed that it is normally a consistent 11, but it varies wildly (from 7 to 13) at the same time that the data drops to 0.  How can I programmatically ensure this doesn't happen?

 

Things I have tried:

1. Changing the number of bytes read by "Serial Read" to a constant value of 11.  The same error still occurs.

2. Changing the baud rates and sampling rates of the VI and the controller.  Even after slowing the sampling period from 200 ms to 500 ms, the error still occurs.

3. Using the program Termite to view the raw RS-232 data being sent; it shows as the image with green text below.  I don't believe it is a hardware issue, because the data is sent correctly when viewed from this program (the "drops to zero" glitch only occurs in my VI).

[Image 1: plot of the incoming sensor data dropping to 0]

[Image 2: Termite capture of the raw RS-232 data (green text)]

 

Thank you

Message 1 of 3
Solution
Accepted by LeroiVinncent

DO NOT USE THE BYTES AT PORT!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! (Never enough emphasis)

 

Your instrument has a protocol.  USE THE PROTOCOL.    Based on the data screenshot, it looks like the message format is an ASCII format with a 0x02 to start the message and a 0x03 to end the message.  So let's use that!  Enable the termination character and set the termination character to be 0x03.  Now just tell the VISA Read to read more than you ever expect to get in a message (I like to use 50).
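
The same idea in text form, sketched with PyVISA in Python (the resource name and baud rate below are placeholders, not values from this thread):

import pyvisa

rm = pyvisa.ResourceManager()
o2 = rm.open_resource('ASRL3::INSTR')  # placeholder VISA resource name for the USB-to-RS-232 port
o2.baud_rate = 9600                    # placeholder; use whatever the sensor is actually configured for
o2.read_termination = '\x03'           # enable the termination character: ETX (0x03) ends each message
o2.timeout = 2000                      # ms, comfortably longer than the 200 ms message period

In LabVIEW terms this corresponds to wiring the termination character (0x03) and Enable Termination Char inputs on VISA Configure Serial Port, and giving VISA Read a byte count larger than any message you expect (e.g. 50); the read returns as soon as the 0x03 arrives.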

 

You do not need a Timed Loop or even a wait in your loop.  The VISA Read will limit your loop rate.

 

And finally, you need to add a way to stop your loop and close your port afterwards.
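
Putting those pieces together, a minimal end-to-end sketch (again PyVISA in Python; the resource name is a placeholder and a fixed iteration count stands in for a front-panel stop button):

import pyvisa

rm = pyvisa.ResourceManager()
o2 = rm.open_resource('ASRL3::INSTR')  # placeholder resource name, same setup as the sketch above
o2.read_termination = '\x03'           # terminate each read on ETX (0x03)
o2.timeout = 2000                      # ms

try:
    for _ in range(50):                # stand-in for checking a front-panel stop button
        msg = o2.read()                # blocks until the 0x03 terminator arrives, so it also paces the loop
        print(msg.lstrip('\x02'))      # drop the leading STX; the ETX was already stripped by the read
finally:
    o2.close()                         # close the port once the loop stops
    rm.close()

The blocking read does the waiting, and the close in the finally block plays the role of VISA Close after the loop.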


Message 2 of 3

@crossrulz wrote:

DO NOT USE THE BYTES AT PORT!!! [...] Enable the termination character and set the termination character to be 0x03.  Now just tell the VISA Read to read more than you ever expect to get in a message (I like to use 50). [...]


That handles only half of the "protocol".

 

The byte stripped from the start of the message should be checked to make sure it is a hex 02 (STX) before the data is converted.  Otherwise the "frame" could be incomplete or corrupted. 🙂
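
Continuing the PyVISA sketches above, that check could look something like this (assuming, purely for illustration, that the payload between the STX and ETX is a single ASCII number):

def parse_frame(msg: str) -> float:
    # A valid frame starts with STX (0x02); anything else means the
    # frame is incomplete or corrupted and should be rejected.
    if not msg.startswith('\x02'):
        raise ValueError('incomplete or corrupted frame')
    # The ETX (0x03) was already removed by the termination-character
    # handling, so only the leading STX needs stripping before conversion.
    return float(msg[1:])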

 

The STX and ETX (0x02 and 0x03) are often used when there is a binary protocol involved that may also include text.

 

For what it's worth...

 

Ben

Message 3 of 3