

Serial Port packet detection based on Interbyte delay

I want to develop an application that communicates serially and detects packets. If the delay between two bytes is greater than 3.5 ms, the bytes received before the delay should be treated as one packet, and the bytes received after the delay start a new packet.

 

0 Kudos
Message 1 of 7
(4,741 Views)

Hmm... that is a tough one.  I think the only way to reliably do this would be with LabVIEW Real-Time.  I don't think NI-VISA can provide timestamps with each byte, so you'd have to set up a really fast loop that reads each byte individually and calculates the time difference... which I think would be very unreliable on a Windows OS.
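
For illustration, a rough text-language equivalent of that byte-by-byte timing loop might look like the sketch below (Python with pyserial rather than G; the port name, baud rate, and short read timeout are placeholders). On a desktop OS the loop timing is not guaranteed, which is exactly the problem described above.

import time
import serial  # pyserial; "COM1" and 9600 baud are placeholders

GAP_S = 0.0035                                    # the 3.5 ms interbyte gap from the question

ser = serial.Serial('COM1', 9600, timeout=0.001)  # short timeout so the loop spins quickly
packet = bytearray()
last_rx = time.perf_counter()

while True:
    b = ser.read(1)                               # one byte, or b'' on timeout
    now = time.perf_counter()
    if b:
        if packet and (now - last_rx) > GAP_S:    # gap detected: previous bytes form a packet
            print('packet:', bytes(packet))
            packet.clear()
        packet.extend(b)
        last_rx = now
    elif packet and (now - last_rx) > GAP_S:      # idle line: flush the pending packet
        print('packet:', bytes(packet))
        packet.clear()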

 

However:  Is this really necessary?  Does the application really require it?  Or could you just throw all the data into the buffer and then read and parse it afterwards?  Usually a serial interface will indicate how the data should be split up using some kind of delimiter that marks the end of a "packet" (e.g. an LF or CR byte).
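
If a termination character is available, the buffer-and-parse idea could look something like this rough Python/pyserial sketch (the LF terminator and the port settings are assumptions, not part of the original post):

import serial  # pyserial; port settings and the LF terminator are assumptions

ser = serial.Serial('COM1', 9600, timeout=0.1)
buffer = bytearray()

while True:
    buffer.extend(ser.read(ser.in_waiting or 1))   # grab whatever has accumulated
    while b'\n' in buffer:                          # a complete, LF-terminated packet is buffered
        packet, _, buffer = buffer.partition(b'\n')
        print('packet:', bytes(packet))             # parse the packet here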

http://www.medicollector.com
0 Kudos
Message 2 of 7
(4,718 Views)

I have done something similar, looking for the "break" in serial communication by setting the port timeout to some small value, reading "number of bytes at port" + 1 bytes, and checking for a timeout. If the port timed out, a break had occurred, and the data received up to that point was a packet, which I queued up. Then I quickly looped around for the next packet. I still checked the data on the other end of the queue to verify that full packets were received, and sometimes had to adjust the packets at the beginning until my code got synced up.

 

BUT my timeout was on the order of 20 ms, not 3.5 ms, and the total length of the packets was on the order of 200 ms. So, as stated by josborne, I don't think your case will work with Windows.
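
For reference, the timeout-as-break-detector idea reads "bytes at port" + 1 and treats a short read as the end of a packet. A minimal sketch of that logic, assuming Python with pyserial instead of the original VI (the port, baud rate, and 20 ms timeout are illustrative):

import serial  # pyserial; port, baud, and the 20 ms timeout are illustrative

ser = serial.Serial('COM1', 9600, timeout=0.020)

def read_packet(ser):
    packet = bytearray()
    while True:
        want = ser.in_waiting + 1      # everything buffered plus one more byte
        data = ser.read(want)
        packet.extend(data)
        if len(data) < want:           # the read timed out: a break in the stream
            return bytes(packet)

while True:
    pkt = read_packet(ser)
    if pkt:
        print('packet:', pkt)          # e.g. enqueue it for the parsing loop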

 

Omar
0 Kudos
Message 3 of 7
(4,696 Views)

Thanks...

Can you share your code? Will a 10 ms timeout work?

0 Kudos
Message 4 of 7
(4,681 Views)

This really would be best handled either in hardware (LabVIEW FPGA might be applicable) or possibly in LabVIEW Real-Time. The Windows OSes do not have the determinism to do this reliably; you would be at the mercy of any higher-level interrupt delaying your code longer than the 3.5 ms. As the earlier poster asked, what are you trying to do? If you can change the protocol, could you instead add a termination character and then parse blocks of received data for those characters?

Putnam
Certified LabVIEW Developer

Senior Test Engineer North Shore Technology, Inc.
Currently using LV 2012-LabVIEW 2018, RT8.5


LabVIEW Champion



0 Kudos
Message 5 of 7
(4,666 Views)

I think you are asking for trouble with this approach. A quick interruption in your loop due to Windows switching threads could easily cause your program to miss an interbyte delay and mash adjacent packets together.

 

If you have control over the protocol used to communicate, I would try to either use hardware handshaking, which would eliminate the need for you to split the messages yourself, or use some form of packet delimiter to mark the breaks between packets.

 

Depending on the complexity of the packets coming across, a simple CRLF could delimit individual packets.

 

If you expect that there might be CRLF sequences in the actual packet payload, you could base64-encode the message and then append a nonprintable delimiter character to the end of it before sending.

On the receiving side, you would look for the delimiter, then base64 decode the characters before it.
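
A minimal sketch of that framing scheme, assuming a NUL byte as the nonprintable delimiter (base64 output never contains it):

import base64

DELIM = b'\x00'                        # example nonprintable delimiter; base64 never produces it

def frame(payload: bytes) -> bytes:
    # Sender side: base64-encode the payload, then append the delimiter
    return base64.b64encode(payload) + DELIM

def deframe(stream: bytes):
    # Receiver side: split on the delimiter and decode each complete packet
    for chunk in stream.split(DELIM):
        if chunk:
            yield base64.b64decode(chunk)

# CRLF inside a payload no longer collides with the framing:
wire = frame(b'line1\r\nline2') + frame(b'second packet')
print(list(deframe(wire)))             # [b'line1\r\nline2', b'second packet']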

Greg Sussman
Sr Business Manager A/D/G BU
0 Kudos
Message 6 of 7
(4,664 Views)

Please take the advice offered above and try something else to break out your packets, like a termination character.

 

I see my timeout was actually longer, at 50 ms. This worked in my case. I doubt it will work for 3.5 ms.

Here is the VI

USE AT YOUR OWN RISK

 

Omar
0 Kudos
Message 7 of 7
(4,653 Views)