This is similar to another case, but I can't quite get my head around that solution, and my situation is a bit different.
I have code that reads RS-232 data on a serial port.
Different sensor configurations result in different amounts of data coming through.
I have attached the code and the key screenshots.
The main while loop is in a stacked sequence, with the other (subsequent) frame containing a delay function. If the delay is set to 0 ms, only part of the VISA string is read and I miss data. If the delay is set to 100 ms, with the current code configuration the code runs for a while and then throws error -1073807253 from VISA Read. Basically I think it ends up waiting too long and the data doesn't come through correctly.
If I set the delay to 80 ms, it seems about right, but that is a bit of a hack.
How do I make the code adjust, or wait just long enough to read in the full string?
I suspect this data doesn't have a break (termination character) at the end of it, but it might. The incoming data is hex.
Let me know your thoughts.
You have the port configured to stop on a termination character, so get rid of "Bytes at Port" and instead tell your VISA Read to try to read more bytes than you actually expect to read (any big number is fine, e.g. 100 bytes). Then you'll never read just part of a message, because the VISA Read won't finish until the termination character is reached.
If you're not sure there actually is a termination character at the end of the received data, then show us documentation of the data format and we'll see whether there is or not.
Can you give details on how the data is sent to you? An example would be useful. I imagine there is a set protocol. It looks like you have data being sent in an ASCII format, and you explicitly enable the termination character. So I am left to assume that the data is being sent in a normal ASCII format that includes an end of line of some sort (carriage return and/or line feed). So based on that...
GET RID OF THE BYTES AT PORT!!!
(Still not enough emphasis.)
The Bytes at Port just causes all kinds of weird race conditions with serial ports. Use the protocol to your advantage: if the message ends with a termination character, then just tell the VISA Read to read more characters than you ever expect in a message. If you want to go extreme, make it 4092. This works because the VISA Read will stop reading on the first of the following conditions: 1) it reads the number of bytes you specified, 2) it reads the termination character, 3) it times out.
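Those three stop conditions can be sketched in Python (not LabVIEW) against an in-memory byte stream standing in for the serial port. The function name `read_message`, the 4092-byte default, and the 1-second timeout are illustrative assumptions, not VISA's actual implementation:

```python
import io
import time

def read_message(stream, max_bytes=4092, term=b'\n', timeout_s=1.0):
    """Read until terminator, byte count, or timeout -- VISA Read's stop logic."""
    buf = bytearray()
    deadline = time.monotonic() + timeout_s
    # stop condition 1: byte count reached; stop condition 3: timeout
    while len(buf) < max_bytes and time.monotonic() < deadline:
        b = stream.read(1)          # one byte at a time, as the driver would
        if not b:
            continue                # nothing arrived yet; keep waiting
        buf += b
        if b == term:               # stop condition 2: termination character
            break
    return bytes(buf)

# A full message comes back intact even though we asked for up to 4092 bytes,
# because the terminator ends the read early -- no partial reads, no Bytes at Port.
msg = read_message(io.BytesIO(b'12.34,56.78\nPARTIAL'))
# msg == b'12.34,56.78\n'
```

The point of the oversized byte count is that it never actually fires on a well-formed message; it is only a safety cap.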
Another note: Please learn to use a State Machine. It will make your diagrams A LOT cleaner and A LOT simpler to maintain by eliminating those sequence structures and local variables.
I found an end-of-line character, got rid of Bytes at Port, and wired in 4092 as the byte count. That seems to work some of the time. I am finding that other things I do on the machine (moving large files, or trying to zip large files) cause the same error, though. Not totally sure how to deal with those, other than not doing them while collecting data. I did have a previous working version of the code with a while loop that kept reading and concatenating the text until an end-of-line character was encountered, and then exited. I might see if that is a more stable approach.
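That previous approach, reading and concatenating until an end-of-line sequence appears, can be sketched like this (Python rather than LabVIEW; `read_until_eol` and the 64-byte chunk size are invented for illustration). Note that it amounts to re-implementing by hand what the termination character already does:

```python
import io

def read_until_eol(read_chunk, eol=b'\r\n'):
    """Concatenate fixed-size reads until the end-of-line sequence appears."""
    buf = bytearray()
    while eol not in buf:
        chunk = read_chunk(64)       # grab whatever has arrived, up to 64 bytes
        if not chunk:                # stream closed / no more data coming
            break
        buf += chunk
    line, _, rest = bytes(buf).partition(eol)
    return line, rest                # leftover bytes belong to the next message

stream = io.BytesIO(b'12.34,56.78\r\nnext-me')
line, leftover = read_until_eol(stream.read)
# line == b'12.34,56.78', leftover == b'next-me'
```

One subtlety this version has to handle that the terminator-based read does not: the EOL can straddle a chunk boundary, and leftover bytes must be carried over to the next message.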
Thanks for the info. At some point I'll check out state machines. Maybe I can migrate the code to them. Some of the code has been around for 10 years so migrating architecture is fun (at least now I use stacked sequences...)
I hadn't checked before, but I see now that error -1073807253 is actually a framing error, which means your RS-232 port received a byte that wasn't formatted correctly, e.g. it arrived at the wrong baud rate or with the wrong parity setting. Is that still the error you're seeing? It really shouldn't be possible for any activity you do on your computer to cause a framing error; that occurs just as the data is being received by the hardware. It would usually point to a configuration of your RS-232 port that doesn't match the configuration of your peripheral, or (much less likely) hardware that's not working to spec or a poor connection. Can you check that you're configuring your port to the correct settings for your peripheral?
Some equipment that I have communicated with used DOS termination characters to terminate commands (\r\n). If this is the case, then you need to have \n as the termination character, then use Trim Whitespace.vi to remove the \r.
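As a rough Python analogue of that step (assuming the read stopped on `\n` and left the `\r` in place), `strip()` plays the role of Trim Whitespace.vi; the sample bytes are made up:

```python
raw = b'23.7,1013.2\r\n'             # as returned by a read terminated on \n
value = raw.decode('ascii').strip()  # drops the \r and \n, like Trim Whitespace.vi
# value == '23.7,1013.2'
```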
So far I haven't seen any indication that you have even attempted to formally understand the format of commands and responses. If this is the case, stop guessing at what it's supposed to be, crack open the manual and do the legwork. You will then be well on your way to developing a rock-solid instrument interface.
I managed to find a carriage return in the data, and getting rid of Bytes at Port solved things. The framing error occurred randomly during code development and was very confusing; adding Bytes at Port seemed to resolve the framing error most of the time, which is why it was added in the first place.
The device transmits on one COM port and receives on another, so it is a bit unusual in that way. I don't understand why I would get a framing error once every few days when most of the time none occur.
Nevertheless, it seems to be fixed now.
Turns out the description in this post pretty much sums up my problem: because the device is continuously spitting out data, I tend to get framing issues.
Adding a bit of code that simply clears the framing errors from the error cluster (in a loop) makes things work so, so much better.
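A rough Python sketch of that clear-and-retry idea (the `(code, data)` tuple shape, `read_with_retry`, and the retry count are invented for illustration; this is not the VISA API). The key point is clearing only the framing error and letting every other error propagate:

```python
FRAMING_ERROR = -1073807253  # the VISA framing error code from this thread

def read_with_retry(do_read, retries=3):
    """Retry a read, clearing only framing errors; re-raise anything else."""
    for attempt in range(retries):
        code, data = do_read()
        if code == 0:
            return data                          # clean read
        if code != FRAMING_ERROR:
            raise IOError(f'VISA error {code}')  # not ours to swallow
        # framing error: discard the corrupted byte(s) and try again
    raise IOError('framing errors persisted after retries')

# Mock reads: the first attempt hits a framing error, the second succeeds.
results = iter([(FRAMING_ERROR, b''), (0, b'12.34\r\n')])
data = read_with_retry(lambda: next(results))
# data == b'12.34\r\n'
```

Capping the retries keeps a persistent hardware problem (bad cabling, wrong baud rate) from being silently swallowed forever.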
I would only expect a framing error if the receiver got data before (or during) configuration of the serial port at the initialization of your program, because the receiving port might not yet be configured for the proper baud/parity/stop bits. Once the port is configured, the frequency or timing of your VISA Read and Write calls shouldn't be able to cause a framing error, since that occurs at a hardware level below the one they operate at. Calling VISA Open and Close repeatedly around your other operations might conceivably cause one, but I'm not sure those functions can actually enable and disable the hardware receiver, which is what would be required. If you're not doing that, then I think any framing errors you see beyond the first one are genuine, e.g. incorrect port configuration, noisy lines, or out-of-spec hardware.