What about the timeout? I can't remember what the default value is (no LV on this PC).
I will try to describe what I mean...
When you do a serial read, you need to find out the number of bytes available in the buffer to be read. If, for some reason, the number of bytes you ask to read is higher than the number actually available, then the VI simply waits until one of two conditions is met: 1. the amount of data in the buffer reaches the number of bytes to be read, or 2. it times out and completes the read with whatever data was available in the buffer.
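Just to illustrate that behavior in code (Python here, since I can't paste a block diagram; this is a made-up simulation of the semantics, not the actual VISA Read, which is configured through VISA properties):

```python
import time

def visa_style_read(buffer: bytearray, bytes_to_read: int, timeout_s: float,
                    poll_s: float = 0.01) -> bytes:
    """Simulated VISA-style read: block until `bytes_to_read` bytes are in
    `buffer` or `timeout_s` elapses, then return whatever is available.
    (Hypothetical sketch; name and signature are invented for illustration.)"""
    deadline = time.monotonic() + timeout_s
    while len(buffer) < bytes_to_read and time.monotonic() < deadline:
        time.sleep(poll_s)  # in a real driver, new bytes could arrive here
    n = min(bytes_to_read, len(buffer))
    data = bytes(buffer[:n])
    del buffer[:n]          # consume what we returned
    return data

# Ask for more bytes than are available and nothing new arrives:
# the call waits out the timeout, then returns the partial data.
buf = bytearray(b"OK")                                   # only 2 bytes waiting
print(visa_style_read(buf, bytes_to_read=10, timeout_s=0.05))  # b'OK'
```

So condition 2 is exactly where an unexpected delay sneaks in: you pay the full timeout every time the requested count exceeds what the buffer holds.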
However, let's look at how you implemented your VI to find out how to reduce / eliminate unwanted delays.
1. Write to the serial port (send command, etc.).
2. Wait 200 ms.
3. Either wait for timeout or have data at the serial buffer.
4. Read until the serial buffer is empty.
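Those four steps, sketched in Python against a fake port so it runs anywhere (with pyserial you would use serial.Serial and its in_waiting property instead; FakePort and query are invented names for illustration):

```python
import time

class FakePort:
    """Stand-in for a serial port (hypothetical; echoes a reply per write)."""
    def __init__(self):
        self._rx = bytearray()
    def write(self, data: bytes) -> None:
        # pretend the instrument answers every command immediately
        self._rx += b"RESPONSE:" + data
    @property
    def bytes_at_port(self) -> int:     # plays the role of "Bytes at Port"
        return len(self._rx)
    def read(self, n: int) -> bytes:
        out = bytes(self._rx[:n])
        del self._rx[:n]
        return out

def query(port: FakePort, command: bytes, settle_s: float = 0.2) -> bytes:
    port.write(command)          # 1. write the command
    time.sleep(settle_s)         # 2. fixed wait (the 200 ms in the steps above)
    reply = b""                  # 3./4. read until the buffer reports empty,
    while port.bytes_at_port > 0:    # appending each chunk to one string
        reply += port.read(port.bytes_at_port)
    return reply

print(query(FakePort(), b"*IDN?", settle_s=0.0))  # b'RESPONSE:*IDN?'
```

Note that the fixed 200 ms wait in step 2 is itself a candidate for the delay you see: it is paid on every query, even when the reply arrived much sooner.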
I assume there are no delays inside the True case, simply a serial port read and an append to string, and that you use the value from "Bytes at Port" to set how many bytes to read. Seems okay.
Have you ever noticed loss of data? i.e., not all messages are received? Granted, you are reading at 57600; I am used to much slower speeds, such as 9600.
I don't see why you would have noticeable delays as compared with HyperTerminal. The only concern I would have is whether some of the messages are lost due to not having a delay.
The thing to consider is that HyperTerminal (or other such software) reads the port continuously and never stops. In LV, you read until -- something --... then stop reading to process the data (make a decision, etc.), and there is a bit of overhead because of that. It is this overhead that adds up with every iteration of the loop. Maybe someone can correct me, but I seem to remember it is approximately 4 ms per action, so every iteration of your loop probably takes approximately 20 ms. Which is not that much... but it adds up quickly.
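The arithmetic behind that estimate (the ~4 ms/action figure is my rough recollection, not a measured number):

```python
# Rough back-of-envelope for per-iteration overhead in the read loop.
PER_ACTION_S = 0.004        # assumed ~4 ms of overhead per action (rough)
ACTIONS_PER_ITERATION = 5   # e.g. check bytes, read, append, compare, branch
iterations = 100

per_iteration = PER_ACTION_S * ACTIONS_PER_ITERATION
total = per_iteration * iterations
print(f"{per_iteration * 1000:.0f} ms per iteration, "
      f"{total:.1f} s after {iterations} loops")
# 20 ms per iteration, 2.0 s after 100 loops
```

Small per-iteration cost, but run the loop a few hundred times and it becomes very visible next to a tool that never stops reading.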
In terms of delays, what is the difference? Is it consistent? Does it slow down?
To trap the delay, you may need to be clever. The "Highlight Execution" is not your friend in this case 😉
However, if for some reason or other the number of bytes reported by the buffer is not accurate, the "timeout" of the VISA Read may kick in and cause excessive delay. The only way to confirm this would be to compare the two values (the number from "Bytes at Port" and the actual number of bytes read).
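That comparison is a one-liner you could drop into the loop; here is the idea in Python (bytes_match is an invented helper name, the numbers below are made-up examples):

```python
def bytes_match(reported: int, data: bytes) -> bool:
    """True when 'Bytes at Port' agrees with the count actually read;
    a mismatch suggests the read waited out its timeout."""
    if len(data) != reported:
        print(f"mismatch: reported {reported}, actually read {len(data)}")
        return False
    return True

print(bytes_match(16, b"0123456789ABCDEF"))  # True: the count was accurate
print(bytes_match(32, b"0123456789ABCDEF"))  # False: read likely hit timeout
```

Log the two values on every iteration and the guilty reads will stand out immediately.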
I am currently not experiencing any loss of data. The program seems to be working quite well, except for the delay. The delay is consistent and visible on the scope when a write is taking place. In my program, the write and read sections of the "serial read.vi" are split because I don't always need to read after I write.