How do I choose a timeout value for a TCP read function?
I'm receiving data from a server at a rate that varies between 40 and 140 Hz (normally 120 Hz). There is no handshaking, and at any given transmission rate the network load seems to make the rate fluctuate slightly.
All I have been able to do so far is plot a frame number received in my data and fiddle with the buffer mode and timeout until I get the fewest dropouts. At 120 Hz, a 5 ms timeout works best with the standard buffer mode; surprisingly, 8 ms is worse.
I am using a producer-consumer loop with a queue, plus some interpolation to make up for the lost data (roughly like the sketch below).
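To make the structure clearer, here is a rough Python equivalent of what I'm doing. This is only a sketch, not my actual code: the host, port, frame size, and the interpolation step are placeholders.

```python
import socket
import threading
import queue

HOST = "192.168.0.10"   # placeholder server address
PORT = 5000             # placeholder port
FRAME_SIZE = 64         # assumed fixed frame size in bytes
READ_TIMEOUT = 0.005    # the 5 ms timeout I settled on by trial and error

frames = queue.Queue()

def producer():
    """Read with a short timeout and queue whatever arrives."""
    sock = socket.create_connection((HOST, PORT))
    sock.settimeout(READ_TIMEOUT)
    while True:
        try:
            data = sock.recv(FRAME_SIZE)
            if not data:
                break            # server closed the connection
            frames.put(data)
        except socket.timeout:
            frames.put(None)     # mark a dropout so the consumer can interpolate

def consumer():
    """Consume frames; None means the read timed out and the sample
    has to be made up by interpolation."""
    last_frame = None
    while True:
        frame = frames.get()
        if frame is None:
            frame = last_frame   # stand-in for the real interpolation step
        last_frame = frame
        # ... plot the frame number / process the frame here ...

if __name__ == "__main__":
    threading.Thread(target=producer, daemon=True).start()
    consumer()
```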
This is very hit-and-miss, so I'm sure there's a proper/better way to do this!