I've searched this topic and haven't found anything meaningful online (but would be happy to read anything if someone has a resource). I'm dealing with an issue right now where an instrument I'm using is pretty consistently dropping the first 5-20 bytes of a 700-byte transmission. The serial driver I'm using for the read is here:
I've checked and double-checked the baud rate, timeout, termination character, flow control, etc., and it all checks out. If any of those were wrong, I wouldn't expect to get any data at all, but the dropped bytes only occur about 47% of the time.
How do you all do robust serial comms?
It is strange that the drop happens at the beginning of the data. Could it be that the buffer is filling up too quickly, so the oldest data is lost?
Are you running this read VI immediately after writing something to the device?
Maybe move your "milliseconds to wait" to the end of the loop so that it reads data right away.
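In text form, that suggestion looks roughly like this (a Python sketch of the loop ordering only, since the original is a LabVIEW VI; `read_port` is a hypothetical stand-in for the VISA Read):

```python
import time

def poll_loop(read_port, period_s=0.1, iterations=5):
    """Read first, wait second: data already queued at the serial port
    is collected immediately instead of aging in the buffer while a
    fixed delay runs before the read."""
    chunks = []
    for _ in range(iterations):
        chunks.append(read_port())  # read right away
        time.sleep(period_s)        # "milliseconds to wait" moved to the end
    return chunks
```

The point is purely the ordering: because the delay runs after the read, a fresh transmission never sits through the wait before being serviced.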
That's an interesting idea. The device is at 9600 baud, so the transfer time is almost 700 ms, which may lend some credence to that theory. I have confirmed with a logic analyzer that the device itself does not drop data. The wait before the read is there so that the data has time to arrive at the port. I'm exploring another option right now that doesn't involve error clearing.
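The ~700 ms figure checks out under the usual 8-N-1 framing assumption (1 start bit + 8 data bits + 1 stop bit = 10 bits on the wire per byte). A quick back-of-the-envelope in Python:

```python
# 10 bits on the wire per byte, assuming 8-N-1 framing
# (1 start bit + 8 data bits + 1 stop bit).
BITS_PER_BYTE_ON_WIRE = 10

def transfer_time_s(num_bytes, baud_rate):
    """Seconds needed to clock num_bytes through a UART at baud_rate."""
    return num_bytes * BITS_PER_BYTE_ON_WIRE / baud_rate

print(transfer_time_s(700, 9600))  # ~0.73 s, i.e., roughly 700 ms
```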
As has been said many, many times on this forum:
If the device you are communicating with uses a termination character (i.e., its messages always end with the same character), USE IT!
Use the VISA serial config to enable the "term char", and set it to the proper character.
Then set the VISA Read to read way more bytes than you expect to receive.
Now the VISA read will read until it sees the termination character or times out.
BTW: your "milliseconds to wait" is probably what is causing you to lose that first chunk of data.
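The read-until-termchar behavior described above can be sketched in plain code (Python here, since the original is LabVIEW; this only models the VISA Read semantics against an in-memory buffer and is not the driver itself):

```python
def read_until_termchar(buffer, termchar=b"\n", max_bytes=4096):
    """Mimic a VISA Read with the term char enabled: consume bytes from
    `buffer` until the termination character is seen or `max_bytes`
    have been read; return (data, saw_termchar)."""
    out = bytearray()
    for byte in buffer:
        out.append(byte)
        if bytes([byte]) == termchar:
            return bytes(out), True   # stopped on the termination character
        if len(out) >= max_bytes:
            break                     # hit the byte-count cap first
    return bytes(out), False          # a real VISA Read would time out here

data, ok = read_until_termchar(b"HELLO\nWORLD")
# data == b"HELLO\n", ok == True
```

Because the read stops on the termination character, the requested byte count just needs to be comfortably larger than any expected message; it no longer has to match the message length exactly, and no fixed pre-read delay is needed.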