Hi,
I developed an application under LabVIEW 6.1 on Win98 to read serial data and plot it on a graph. The data arrives in chunks of 9 bytes, and I used the Serial Port Read VI with a requested byte count of 9.
Under Win98 this worked fine, i.e. the VI waited until 9 bytes were available and then returned them as a string.
Under Win2K it returns however many bytes are currently available, up to a maximum of 9, i.e. if only 3 bytes have arrived it returns 3 bytes.
I changed the application under Win2K to loop until at least 9 bytes are available at the port and then read 9 bytes. This mostly works, but every so often it locks up: the Bytes at Serial Port VI (which controls the loop) returns a value less than 9 and sticks at that value, even though I am absolutely certain that more bytes have arrived at the port.
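For anyone following the logic outside LabVIEW: a pattern that avoids polling the bytes-at-port count is to read whatever is available each iteration and accumulate it until the full frame is present. Here is a rough Python sketch of that idea; `read_available` is a hypothetical stand-in for a non-blocking serial read, not a real LabVIEW or driver call:

```python
import time

def read_exact(read_available, n, timeout_s=2.0):
    """Accumulate bytes from read_available() until n bytes have been
    collected, then return exactly n bytes.

    read_available is assumed to return whatever bytes have arrived
    at the port so far (possibly b""), without blocking.
    """
    buf = bytearray()
    deadline = time.monotonic() + timeout_s
    while len(buf) < n:
        chunk = read_available()
        if chunk:
            buf.extend(chunk)
        elif time.monotonic() > deadline:
            # Give up instead of hanging forever like the stuck loop
            raise TimeoutError(f"got only {len(buf)} of {n} bytes")
        else:
            time.sleep(0.01)  # yield briefly instead of busy-polling
    return bytes(buf[:n])
```

In a real application you would also keep any bytes beyond the first `n` in the buffer for the next frame rather than discarding them as this sketch does.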
While I would like to know why there is a difference, I would prefer a solution. Does anyone know what's going on?