01-14-2019 01:54 PM - edited 01-14-2019 02:01 PM
Hello,
I am trying to collect data from an oxygen sensor that sends data over RS-232. My computer is connected to the sensor through a USB-to-RS-232 cable, and I am using the attached subVI. The VI sampling period and the sensor sampling period are both 200 ms. However, at times I get a glitch of some sort that makes the incoming data drop to 0, as seen in the plot below. I probed "Bytes at Port" and noticed that it is normally a consistent 11, but it varies wildly (from 7 to 13) at the same time that the data drops to 0. How can I programmatically ensure this doesn't happen?
Things I have tried:
1. Setting the number of bytes read by "Serial Read" to a constant 11. The same error occurs.
2. Changing the baud rates and sampling rates of the VI and controller. Even when slowing the sampling period from 200 ms to 500 ms, the error still occurs.
3. Using the program "Termite" to view the raw RS-232 data being sent; it shows up as in the image with green text below. I don't believe it is a hardware issue, because the data is sent correctly when viewed in this program (the "drops to zero" glitch occurs only in my VI).
Thank you
01-14-2019 02:22 PM - edited 01-14-2019 02:23 PM
DO NOT USE THE BYTES AT PORT!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! (Never enough emphasis)
Your instrument has a protocol. USE THE PROTOCOL. Based on the data screenshot, it looks like the message format is an ASCII format with a 0x02 to start the message and a 0x03 to end the message. So let's use that! Enable the termination character and set the termination character to be 0x03. Now just tell the VISA Read to read more than you ever expect to get in a message (I like to use 50).
You do not need a Timed Loop or even a wait in your loop. The VISA Read will limit your loop rate.
And finally, you need to add a way to stop your loop and close your port afterwards.
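The termination-character approach above can be sketched in text form. Since the original is a LabVIEW VI, this is a hedged Python illustration (not the poster's actual code) of what "enable the termination character, set it to 0x03, request more bytes than any message" does: the read stops early at the terminator, so each read returns exactly one complete frame regardless of how many bytes happen to be sitting in the buffer. The sensor message format (STX ... ETX) is inferred from the thread's screenshots.

```python
import io

STX = 0x02  # start-of-message byte seen in the raw data
ETX = 0x03  # end-of-message byte, used as the termination character

def read_message(stream, max_bytes=50):
    """Read up to max_bytes, stopping early at the termination character,
    the way a VISA Read behaves with the term char enabled."""
    out = bytearray()
    for _ in range(max_bytes):
        b = stream.read(1)
        if not b:          # stream exhausted
            break
        out += b
        if b[0] == ETX:    # termination character -> message complete
            break
    return bytes(out)

if __name__ == "__main__":
    # Two simulated sensor messages back to back in one buffer
    raw = bytes([STX]) + b"20.95" + bytes([ETX]) + bytes([STX]) + b"20.91" + bytes([ETX])
    stream = io.BytesIO(raw)
    print(read_message(stream))  # first complete STX...ETX frame
    print(read_message(stream))  # second frame, even though both were buffered
```

Note how the `max_bytes=50` request never over-reads into the next message: the terminator, not the byte count, decides where each read ends. That is why polling "Bytes at Port" is unnecessary here.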
01-14-2019 02:44 PM
@crossrulz wrote:
DO NOT USE THE BYTES AT PORT!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! (Never enough emphasis)
Your instrument has a protocol. USE THE PROTOCOL. Based on the data screenshot, it looks like the message format is an ASCII format with a 0x02 to start the message and a 0x03 to end the message. So let's use that! Enable the termination character and set the termination character to be 0x03. Now just tell the VISA Read to read more than you ever expect to get in a message (I like to use 50).
You do not need a Timed Loop or even a wait in your loop. The VISA Read will limit your loop rate.
And finally, you need to add a way to stop your loop and close your port afterwards.
That handles only half of the "protocol".
The received message should also be checked to ensure it starts with a hex 02 (STX) before the data is converted. Otherwise it could be an incomplete or corrupted "frame".
The STX and ETX control characters are often used when a binary protocol is involved that may include text.
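The frame check described here can be sketched as follows. This is a hedged Python illustration, not the poster's VI: before converting the payload, verify that the message both starts with STX (0x02) and ends with ETX (0x03), and discard anything else as an incomplete or corrupted frame. The numeric payload format is an assumption for illustration.

```python
STX, ETX = 0x02, 0x03  # ASCII start-of-text / end-of-text control bytes

def parse_frame(msg: bytes):
    """Return the payload as a float, or None for a bad frame."""
    if len(msg) < 2 or msg[0] != STX or msg[-1] != ETX:
        return None                # incomplete or corrupted frame
    try:
        return float(msg[1:-1])    # strip STX/ETX, convert the payload
    except ValueError:
        return None                # non-numeric payload

if __name__ == "__main__":
    good = bytes([STX]) + b"20.95" + bytes([ETX])
    bad = b"0.95" + bytes([ETX])   # STX lost mid-stream
    print(parse_frame(good))       # valid frame -> numeric value
    print(parse_frame(bad))        # bad frame -> rejected, not converted
```

Rejecting bad frames instead of converting them is what prevents the "drops to 0" glitch: a truncated message never reaches the string-to-number conversion.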
For what it's worth...
Ben