UDP Packet Size vs Latency

I'm doing some experiments with UDP and CompactRIOs, characterizing the average/probable latency of UDP packet streams on a network with different amounts of artificial congestion (generated with iperf3). I'm running several tests with different payload sizes (20, 50, 100, 200, 500, and 1000 bytes). My question is:

 

From 20 to 100 bytes, average latency increased as a function of both packet size and link congestion (as expected). However, at UDP payloads of 200 bytes and above, the arrival latency of the packets suddenly drops sharply. Is there an expected reason for this? I haven't been able to find anything about this behavior via Google or these forums.
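
For context, the measurement is essentially a request/echo round trip per payload size. My actual tests run in LabVIEW on the cRIO targets, but this rough Python sketch shows the idea (the host, port, and sample count are placeholders, not my real setup; it assumes the remote side echoes each datagram back):

```python
import socket
import time

# Placeholder target; the real tests run against a cRIO on the bench.
HOST, PORT = "192.168.1.10", 61557
PAYLOAD_SIZES = [20, 50, 100, 200, 500, 1000]
SAMPLES = 1000

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(1.0)

for size in PAYLOAD_SIZES:
    payload = b"\x00" * size
    rtts = []
    for _ in range(SAMPLES):
        t0 = time.perf_counter()
        sock.sendto(payload, (HOST, PORT))
        try:
            sock.recvfrom(2048)  # remote side echoes the datagram back
        except socket.timeout:
            continue  # count as a drop, not a latency sample
        rtts.append(time.perf_counter() - t0)
    if rtts:
        avg_ms = 1000 * sum(rtts) / len(rtts)
        # RTT/2 approximates one-way latency if the paths are symmetric
        print(f"{size:5d} B payload: avg RTT {avg_ms:.3f} ms over {len(rtts)} samples")
```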

Message 1 of 2

This is really interesting, so thanks for bringing it up! I did some internal research and found that if the payload is larger than a certain amount, the UDP datagram gets fragmented into more than one IP packet. In the document I read, this happened for payloads above 1472 bytes (a standard 1500-byte Ethernet MTU minus 20 bytes of IP header and 8 bytes of UDP header), but my guess is that an analogous threshold is being hit around 200 bytes in your specific setup.
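
If you want to see where the fragmentation threshold actually sits on your link, one way to probe it (a rough, Linux-only Python sketch with a placeholder target address) is to set the Don't Fragment flag on the socket and watch where sends start failing with EMSGSIZE instead of silently fragmenting:

```python
import errno
import socket

# Linux-only: fall back to the numeric values if the constants
# aren't exposed by this Python build.
IP_MTU_DISCOVER = getattr(socket, "IP_MTU_DISCOVER", 10)
IP_PMTUDISC_DO = getattr(socket, "IP_PMTUDISC_DO", 2)

HOST, PORT = "192.168.1.10", 61557  # placeholder target

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# Set the Don't Fragment flag so the kernel refuses oversized
# datagrams (EMSGSIZE) rather than fragmenting them.
sock.setsockopt(socket.IPPROTO_IP, IP_MTU_DISCOVER, IP_PMTUDISC_DO)

for size in (1400, 1472, 1473, 1500, 2000):
    try:
        sock.sendto(b"\x00" * size, (HOST, PORT))
        print(f"{size} B payload: fits in one datagram")
    except OSError as e:
        if e.errno == errno.EMSGSIZE:
            print(f"{size} B payload: exceeds path MTU, would fragment")
        else:
            raise
```

On a standard Ethernet path this flips from "fits" to "would fragment" between 1472 and 1473 bytes; if something in your setup flips much lower, that would point at the cause.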

 

Francine P.
Applications Engineering
National Instruments
Message 2 of 2