05-10-2018 12:49 PM
I am working with a custom control board that I am communicating with over RS232. It has been used in my company for some time, so I have some existing LabVIEW code to communicate with it. I am not very knowledgeable about serial communications, but it seems like we are not reading from it in the most efficient way: we read 5 bytes at a time in a loop until we hit a timeout.
We have a command set of ~40 ASCII commands. The responses vary in length quite a bit. Many of them end with "\r>\r>" (as shown in LabVIEW's '\' Codes display), but not all of them. Some of the commands put the board into a state, such as waiting for a trigger or waiting for USB data to be sent. Those responses end with "\r>\r>[MNGR/EXPOSE]\r" and "\r>\r[MNGR/RECEIVING]\r" respectively.
When the return messages are very long, it can take up to 800 ms to get the entire response back, so we set the timeout to something like 1000 ms. But then the read function adds an additional second on top of that, because it waits for the timeout to expire. Pictures of the VIs are shown below; please let me know if you need any extra information about the device.
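For readers without the VI pictures, the read pattern described above amounts to something like the following. This is a rough Python sketch, not the actual LabVIEW code: a mock byte source stands in for the real port, and everything except the 5-byte chunk size is assumed for illustration.

```python
def read_until_timeout(read_chunk, chunk_size=5):
    """Accumulate fixed-size chunks until a short/empty read signals a timeout.

    `read_chunk(n)` stands in for a serial/VISA read of up to n bytes; on a
    real port, a short or empty return means the read timed out first.
    """
    response = bytearray()
    while True:
        chunk = read_chunk(chunk_size)
        response.extend(chunk)
        if len(chunk) < chunk_size:   # timed out: assume the message is done
            break
    return bytes(response)

def make_mock_reader(message):
    """Fake port: hands out the message in pieces, then returns empty reads."""
    buf = bytearray(message)
    def read_chunk(n):
        out = bytes(buf[:n])
        del buf[:n]
        return out
    return read_chunk

reader = make_mock_reader(b"OK\r>\r>")
print(read_until_timeout(reader))   # b'OK\r>\r>'
```

Note where the extra second comes from: the loop only knows the message is finished when the final read comes up short, and on a real port that last read costs a full timeout period every time.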
05-10-2018 01:14 PM - edited 05-10-2018 01:15 PM
Initialize
Write
Read
05-10-2018 01:32 PM
Crossrulz,
The controller board comes from a parent company that uses this board with our devices, so I doubt my request for a change to the protocol would get anywhere, but I can try. By the way, when I look at the carriage return constant in hex view it is "0D", and linefeed is "0A". Is LabVIEW's definition different from the standard?
Assuming I can't get the protocol changed, is waiting for timeouts the best (or only?) way to make sure I have the entire response?
05-10-2018 08:27 PM
@Gregory wrote:
By the way, when I look at the carriage return constant in hex view it is "0D", and linefeed is "0A". Is LabVIEW's definition different than standard?
No, that was my mistake. I was hurrying to get out the door and stated it backwards. I'll probably mull over this in my sleep and come up with something better. Though, that might require knowing the entire protocol.
05-11-2018 07:29 AM
I would make the Termination Character 0x0A (Line Feed), as that is not only the more common "standard", but also one you are already using yourself: you append the "End-of-Line, CR/LF" string to your command line, and that is 0x0D, 0x0A. But you will need to experiment with your device and see if that works ...
Have you tried reading 1000 bytes instead of 5, and letting a Termination Character end the string for you? Are there instances where this doesn't work? [I haven't read the Protocol, but crossrulz seems to indicate that this might not be a feasible idea ...].
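In text form, the termination-character approach looks roughly like this. A hedged Python sketch where the loop plays the role VISA performs internally when the termination character is enabled; the stream here is a mock, and the names are illustrative:

```python
from io import BytesIO

def read_with_termchar(read_bytes, termchar=b"\n", max_bytes=1000):
    """Read up to max_bytes, but stop as soon as termchar is seen.

    This is what asking VISA Read for a large byte count does when the
    termination character is enabled: the read ends at the terminator,
    not at the requested count.
    """
    response = bytearray()
    while len(response) < max_bytes:
        b = read_bytes(1)
        if not b:                     # empty read: a timeout on a real port
            break
        response.extend(b)
        if response.endswith(termchar):
            break
    return bytes(response)

stream = BytesIO(b"VALUE 1.23\nNEXT")
print(read_with_termchar(stream.read))   # b'VALUE 1.23\n'  ("NEXT" stays buffered)
```

The key property is that a long read request returns immediately at the terminator, so the requested count (1000 here) is just a safety ceiling, not a wait.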
Bob Schor
05-11-2018 10:24 AM
I know this is heresy, and I will be scolded and yelled at, but may I suggest using the Bytes at Port property? I know that Bytes at Port is not accepted for current-generation LabVIEW, and is not accepted for next-generation LabVIEW; maybe it will be accepted for future generations of LabVIEW, such as Voyager LabVIEW or Deep Space 9 LabVIEW, but maybe I am just dreaming that we can all get along in the future. Anyway, below is an idea to try:
It works like this, adjust as needed.
Yes, I know there might be problems with this VI, and you are welcome to improve it. However, it has worked for me without any hiccups for devices that do NOT have termination characters, or that do not follow normal conventions.
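For readers who can't open the attachment, the idea as described (exit once consecutive Bytes at Port checks return the same nonzero count) could be sketched in Python like this. All names are illustrative, and a fake port stands in for the instrument:

```python
import time

class FakePort:
    """Stand-in port whose buffer grows a little on each poll, then stops."""
    def __init__(self, chunks):
        self._incoming = list(chunks)
        self._buf = bytearray()
    @property
    def in_waiting(self):
        if self._incoming:                       # data "arrives" between polls
            self._buf.extend(self._incoming.pop(0))
        return len(self._buf)
    def read(self, n):
        out = bytes(self._buf[:n])
        del self._buf[:n]
        return out

def read_when_stable(port, poll_s=0.05, max_polls=100):
    """Poll the byte count; read once it stops growing.

    The message is assumed complete when two consecutive polls,
    poll_s apart, report the same nonzero count.
    """
    last = -1
    for _ in range(max_polls):
        waiting = port.in_waiting
        if waiting == last and waiting > 0:
            return port.read(waiting)
        last = waiting
        time.sleep(poll_s)
    return port.read(port.in_waiting)            # give up, return what arrived

port = FakePort([b"MNGR", b"/OK\r>", b""])
print(read_when_stable(port, poll_s=0.001))   # b'MNGR/OK\r>'
```

The trade-off, noted further down in the thread, is the poll interval: if the device pauses mid-message for longer than one interval, the count looks stable and the loop exits early.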
2014 version attached.
mcduff
05-11-2018 10:38 AM - edited 05-11-2018 10:41 AM
There's nothing wrong with using Bytes at Port if you really just want to know how many bytes have been received. What gets pounced on here is feeding Bytes at Port directly into a VISA Read and just assuming that the whole message has been received, instead of using a termination character to wait until it definitely has been.
If there's no termination character, no fixed-width message, and no other way of knowing for sure that the entire response has been received, then sometimes there's no choice but to do it your way: receive byte by byte until a complete message can be identified.
05-11-2018 10:42 AM
@arteitle
Correct, that is why the VI I posted requires consecutive reads where the number of bytes does not change before the loop exits. However, in the above configuration it will miss data if it takes longer than 50 ms for the number of bytes to change.
mcduff
05-11-2018 10:59 AM
Hi all, thank you all for the suggestions.
First a question: is it standard practice to put a small wait after a Write function like in mcduff's example? Or does it just depend on the device?
And an update: I have heard back today from the firmware programmer that the termination character is indeed ">". If I set the termination character to 0x3E, everything appears to work pretty smoothly.
For some commands, the board will acknowledge the command, go into a certain state, and acknowledge when it leaves that state. I know ahead of time which commands will require multiple reads though, so that is ok...
I think the reason I was getting "\r>\r>" earlier is that I don't need CRLF when I write a command, just CR.
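With the termination character set to 0x3E ('>'), a multi-part acknowledgement such as "\r>\r>" arrives as one read per '>'. A small Python sketch (mock data only, no real port) of how a raw stream splits under that setting:

```python
def split_on_terminator(data, term=b">"):
    """Split raw bytes into the pieces a '>'-terminated read would return."""
    parts, piece = [], bytearray()
    for value in data:
        piece.append(value)
        if piece.endswith(term):
            parts.append(bytes(piece))
            piece.clear()
    if piece:
        parts.append(bytes(piece))   # trailing unterminated piece, if any
    return parts

print(split_on_terminator(b"\r>\r>[MNGR/EXPOSE]\r"))
# [b'\r>', b'\r>', b'[MNGR/EXPOSE]\r']
```

Note the trailing "[MNGR/EXPOSE]\r" piece has no '>' of its own, which is consistent with those state-entering commands needing an extra read that only completes later (or on a timeout).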
05-11-2018 11:05 AM
First a question: is it standard practice to put a small wait after a Write function like in mcduff's example? Or does it just depend on the device?
For the VI I posted, it is best to have a delay after the write. Think causality: a serial device gets the command, processes it, then sends a response, and this cannot happen instantaneously. If you have a read with a timeout, then a wait is not necessary; but in the VI above I am checking the number of bytes at the port, and no timeout is possible for that check.
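The causality point can be shown with a toy model: a device whose reply only exists after some processing time. A bytes-at-port style check returns immediately (it has no timeout of its own), so skipping the post-write delay means the check can land before the reply arrives. Python sketch, all names hypothetical:

```python
import time

class SlowDevice:
    """Toy device: the reply only 'exists' after process_s has elapsed."""
    def __init__(self, reply, process_s=0.02):
        self._reply = reply
        self._process_s = process_s
        self._ready_at = None
    def write(self, command):
        self._ready_at = time.monotonic() + self._process_s
    @property
    def in_waiting(self):
        if self._ready_at is not None and time.monotonic() >= self._ready_at:
            return len(self._reply)
        return 0
    def read(self, n):
        return self._reply[:n]

def query_with_poll(port, command, settle_s=0.05):
    """Write, wait for the device to process, then read whatever is waiting."""
    port.write(command)
    time.sleep(settle_s)                 # causality: give the device time
    return port.read(port.in_waiting)

print(query_with_poll(SlowDevice(b"OK\r>"), b"STATUS\r"))               # b'OK\r>'
print(query_with_poll(SlowDevice(b"OK\r>"), b"STATUS\r", settle_s=0))   # b''
```

A read with a timeout does this waiting for you, which is why the delay is only needed in the bytes-at-port variant.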
mcduff