VISA Read continuous serial data drops packets...

I am streaming continuous serial data from an FTDI adapter into LabVIEW via VISA reads. However, I am noticing that I am dropping a lot of the data.

 

There is a chain of sync bytes in the stream alerting LabVIEW to the beginning of the data. Right now my scheme is to read one byte at a time in a loop, and once each of these sync bytes is detected in sequence, I VISA read each of the data bytes one byte at a time. Reading one byte at a time makes it easy to build the array of data.
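In rough pseudocode (Python standing in for my LabVIEW G; read_byte() stands for a single 1-byte VISA Read, and the sync values are the ones from my code), each iteration does something like this:

```python
SYNC = (226, 95, 37)                 # the three sync bytes, in order

def try_read_packet(read_byte, n_data):
    # Check for the three sync bytes, one VISA read each.
    if read_byte() == SYNC[0] and read_byte() == SYNC[1] and read_byte() == SYNC[2]:
        # Sync detected: read each data byte individually, building the array as we go.
        return bytes(read_byte() for _ in range(n_data))
    return None                      # no sync this iteration; the outer loop retries
```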

 

After reading other forums, it seems that LabVIEW does better reading in many bytes at once (i.e., instead of reading 1 byte at a time), and that it is much less likely to drop data in that case. Is this correct?

 

My next question is: how could I properly align the data of a continuous stream if I read in excess bytes all in one VISA read? Especially if I do not want to accidentally lose data from the subsequent packet by reading too many excess bytes for the current packet.

 

Is the solution to use a termination char? (Which I am not currently doing, because again I only read one byte at a time.) But if I use a termination char, am I guaranteed not to "over-read" into the next packet of data?

 

Thank you all in advance.

Message 1 of 14

I have a couple of questions for you:

 

1. Do you have control of the format for the incoming packets?

2. Can you show us an example of incoming data and an example of the format you're looking for?

 

If you use a term char you will not "over-read": you will read either until the termination character, until the read times out, or until you've read the max bytes you specify.
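To illustrate in PyVISA terms (a Python sketch only, not your LabVIEW code; the resource name is made up):

```python
import pyvisa

rm = pyvisa.ResourceManager()
port = rm.open_resource("ASRL3::INSTR")  # hypothetical serial resource name
port.read_termination = "\n"             # enable a termination character
port.timeout = 2000                      # ms; the read gives up after this

# Returns at the first term char or after 64 bytes, whichever comes first;
# if neither happens in time, a timeout error is raised instead.
chunk = port.read_bytes(64, break_on_termchar=True)
```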



-Matt
Message 2 of 14

We need a lot more information on the protocol of the data. Is the data in an ASCII format? If so, does the instrument send a termination character at the end of the message? If it is a binary format, how many bytes are in a message?

 

These are just opening questions. If you have any information on what is sending the data (assuming it is not a microcontroller), we can look up the manual.


Message 3 of 14

The problem is with your subVI and the series of subVIs that detect the sync bytes. I don't understand the purpose of the "proceed" boolean. Also, I don't see where you configure your serial port with baud rate, data/stop bits, parity, and whether you use a termination character.

 

Every loop iteration you read in 3 bytes, 1 at a time. On the first read you are looking for 226, on the next read 95, and on the next read 37. Suppose you have a stray byte at the beginning when you are looking for 226? Then when you are looking for 95, you actually get 226. When you look for 37, you actually get 95. Then you go into a loop reading 1 byte at a time, getting the 37 and then all of your additional data. You will never synchronize with this scheme.

 

I'm assuming that the only thing that tells you when a packet starts is this group of 3 bytes, and that the data after it is always the same length, defined by your Number of Bytes control. With this method, you should check Bytes at Port (one of the very few times I would use that node) and read what is in the port. Do it in a loop and keep appending the bytes to a shift register. Every time you append, search the shift register for the sequence of 3 bytes. If you find them, stop the loop and pass out the remaining bytes. Then do a read for N minus however many bytes you got from the first loop. Concatenate those together and you now have a full data packet. Repeat all of this the next time the big loop iterates. I don't think you need the post-sync word; just read the next 3 bytes and see if they match. But what if they don't? All you can do is throw an error that the packet was malformed.
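In rough Python pseudocode (just to show the logic, not LabVIEW; `port` is a hypothetical object where bytes_waiting plays the role of Bytes at Port and read(n) plays the role of VISA Read):

```python
SYNC = bytes([226, 95, 37])

def read_one_packet(port, n_data):
    buf = b""                                         # plays the role of the shift register
    while True:
        buf += port.read(max(port.bytes_waiting, 1))  # read whatever is at the port
        i = buf.find(SYNC)                            # search for the 3-byte sync sequence
        if i >= 0:
            data = buf[i + len(SYNC):]                # keep the bytes already past the sync
            break
    missing = n_data - len(data)
    if missing > 0:
        data += port.read(missing)                    # one read for N minus what we have
    return data[:n_data]                              # a full data packet
```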

 

A better situation is if the device only sends data when it receives a command/request to do so. Then if you are sending ASCII data, use a termination character. If you are sending binary, start each message with 1 or 2 bytes that tell the length of the data to follow. Read that, then read however many bytes the first read told you.
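For the binary case, the read side then becomes just two reads (same hypothetical port.read(n) as above; this sketch assumes a 2-byte big-endian length header):

```python
def read_message(port):
    header = port.read(2)                    # the length bytes come first
    length = int.from_bytes(header, "big")   # decode how many payload bytes follow
    return port.read(length)                 # one read for the whole payload
```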

Message 4 of 14

RavensFan wrote:

With this method, you should check Bytes at Port (one of the very few times I would use that node) and read what is in the port.


I still would not use Bytes At Port. I would use a state machine to read the message. Have a single state that reads 1 byte at a time until you get the first sync byte. Then the next state reads two bytes and verifies them. If they fail, go back to the first state. If they pass, read the rest of the entire message with a single read (assuming the same number of bytes in every message). Then go back to the first state.
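Sketched in Python (a stand-in for the LabVIEW state machine; read(n) represents a VISA Read of n bytes, and N_DATA is an assumed fixed message length):

```python
SYNC = bytes([226, 95, 37])
N_DATA = 32               # assumed fixed number of data bytes per message

def next_message(read):
    while True:
        # State 1: read 1 byte at a time until the first sync byte appears.
        while read(1) != SYNC[:1]:
            pass
        # State 2: the next two bytes must complete the sync sequence.
        if read(2) == SYNC[1:]:
            # State 3: sync verified; read the entire message with a single read.
            return read(N_DATA)
        # Verification failed: fall back to State 1 and hunt again.
```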


Message 5 of 14

Thank you all for the variety of responses. Let me try to address as many of your suggestions as I can. I should start by saying that I can stream in data fairly successfully with the current scheme; the only problem is that several packets are dropped. So this whole thread is about making the program better rather than making it work at all (because it already works, just not that well).

 

Matt:

1. The format of the packets is binary. I don't know what the data will look like, but I do know the number of bytes that I should receive.

2. The data is coming over a UART, so each of the bytes would be transmitted with a minor delay (~2 ms) between them. An example would be something like:

SYNC_0 SYNC_1 SYNC_2 DATA_0 DATA_1 ... DATA_N-1 DATA_N PSYNC_0 PSYNC_1 PSYNC_2 SYNC_0 SYNC_1 ...

 

3. With regard to your suggestion: if I do use a term char and read the max bytes, will I drop the next packet while I process the current packet, or is there sufficient buffering in the VISA driver?

Message 6 of 14

@ce_2015 wrote:

1. The format of the packets is binary. I don't know what the data will look like, but I do know the number of bytes that I should receive.

3. With regard to your suggestion: if I do use a term char and read the max bytes, will I drop the next packet while I process the current packet, or is there sufficient buffering in the VISA driver?


Termination Characters do not work well with binary formats because they can cause your read to end prematurely. So since you have a binary format, make sure the Termination Character is turned off in Configure Serial Port.
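In PyVISA terms (again only a Python illustration, with a made-up resource name and byte count):

```python
import pyvisa

rm = pyvisa.ResourceManager()
port = rm.open_resource("ASRL3::INSTR")  # hypothetical serial resource name
port.read_termination = None             # binary data: no termination character
payload = port.read_bytes(35)            # read an exact byte count instead
```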


Message 7 of 14

I should also add that in the code that calls these modules, there is a "Wait Until Next ms Multiple" function. This may be the source of my troubles, as I do not understand this function very well other than that it paces the loop. Anyway, I've noticed that I drop more packets the smaller the ms value I use.

 

How do I determine what ms value to use? Or is there a better way to pace the loop that calls the VISA read?

 

Thanks!

Message 8 of 14

Attach the latest version of all your subVIs as well.

Message 9 of 14

Here they are.

Message 10 of 14