

How to calculate delay between a client and server using TCP in LabVIEW

Hi,

I'm working in LabVIEW 8.2 and have developed code that establishes a client and a server on two different computers over a LAN using TCP. I've attached the client and server VIs for reference.

Now I want to measure the delay (which should be in milliseconds) between the client and server computers. Is it possible to do this in LabVIEW using these VIs?

 

Looking forward to hearing from you soon,

Thanks & Regards,

Message 1 of 6

Assuming you want the time from before the client VI executes to the time after the server VI executes, the following should work. Feel free to move the sequence structures that contain "Get Current Time" around to get the delay you want.

 

Let me know if this isn't what you are looking for. Good luck!

 

 

Message 2 of 6

Hey,

Thanks for the reply,

The file you have uploaded is in LabVIEW 8.6, but I'm working in LabVIEW 8.2.

Secondly, I'm looking to measure the time gap from when the server VI sends a data packet to when the client VI receives the corresponding data packet.

 

Thanks in advance

Message 3 of 6

Maybe this will work better. If it didn't save for 8.2, let me know, because it was acting funny when I tried to save for a previous version.

 

What you need to do is wire error clusters in and out of both of your VIs. The example code I provided shows two Get Current Time VIs in two sequence structures, with the error cluster running through each in series. The output of each Get Current Time VI is converted to a double. You need to place each of these after the operation you want to time, so one goes after the server write and one goes after the client read. You'll also need to pipe these values out of the respective VIs. Then subtract the two double values; the difference is your delay.
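
Since a block diagram can't be pasted as text, here is a rough Python sketch of the same pattern. The port number, payload, and use of the loopback address are assumptions for illustration only; the time.time() calls stand in for Get Current Time placed after TCP Write and TCP Read.

```python
import socket
import threading
import time

PORT = 6340  # hypothetical port for this sketch

def server():
    # Listen, accept, send one packet, then take a timestamp
    # right after the write (like Get Current Time after TCP Write).
    with socket.socket() as lsock:
        lsock.bind(("127.0.0.1", PORT))
        lsock.listen(1)
        conn, _ = lsock.accept()
        with conn:
            conn.sendall(b"temperature=23.5")
            t_after_write = time.time()
            print(f"server: wrote at {t_after_write:.6f}")

def client():
    # Connect, read the packet, then take a timestamp
    # right after the read (like Get Current Time after TCP Read).
    with socket.create_connection(("127.0.0.1", PORT)) as s:
        data = s.recv(4096)
        t_after_read = time.time()
        print(f"client: read {data!r} at {t_after_read:.6f}")

t = threading.Thread(target=server)
t.start()
time.sleep(0.1)  # give the listener a moment to come up
client()
t.join()
# delay = t_after_read - t_after_write, but subtracting the two
# doubles is only valid when both timestamps come from one clock.
```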

Message 4 of 6

Hi,

I understood your logic, but it still doesn't fulfill my need. I'm running the server VI on one computer and the client VI on the other, both sending and receiving temperature data in real time.

It seems to me that I'll have to time-stamp the temperature data against one computer's clock and send the time as well as the temperature to the other computer.

Once the other computer receives both pieces of information, it should subtract the received time stamp from the current time on its own clock and display the result as the delay.

 

But I have no idea how to adapt your example to do this.

Looking forward to your help,

Message 5 of 6

That does complicate the solution I proposed; it only works if the two VIs are on the same machine.

 

What about something like this? I'm not a TCP expert, so there may be a hole in this...

 

 

1. The server VI sends data, then goes into a state where it listens for the client to ACK receipt of the data. Get the current time on the server computer after this operation. This is Time 1.

2. The client VI receives the server VI's data and sends an ACK to the server VI.

3. The server VI receives the ACK and gets the current time. This is Time 2.

4. The time delay is assumed to be (Time 2 - Time 1)/2.

 

I know that's not exact, but it's the best I can think of.
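
In Python terms, the server side of that round trip might look like the sketch below. The ACK framing, buffer sizes, and the assumption that both directions take equal time are mine, not anything built into TCP.

```python
import socket
import time

def one_way_delay(conn: socket.socket, payload: bytes) -> float:
    # Step 1: send the data, then take Time 1 right after the write.
    conn.sendall(payload)
    t1 = time.time()
    # Steps 2-3: block until the client's ACK arrives, then take Time 2.
    ack = conn.recv(16)
    t2 = time.time()
    if ack != b"ACK":
        raise RuntimeError("unexpected reply from client")
    # Step 4: halve the round trip, assuming the two directions are symmetric.
    return (t2 - t1) / 2.0

# Client side on the other machine (sketch):
#   data = conn.recv(4096)
#   conn.sendall(b"ACK")
```

Halving the round trip assumes the network path is symmetric and the client ACKs immediately, which is why the result is approximate.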

 

Additionally, if you can ensure the two computers' clocks are synced to the millisecond level or better, then it might be possible to use my suggestion. However, this would probably not be real-time data and might need to be recorded in a log somewhere.
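
With synced clocks, the time-stamped packet idea from the previous message might look like this (again a Python sketch; the JSON framing, field names, and helper names are all made up for illustration):

```python
import json
import socket
import time

def send_reading(conn: socket.socket, temperature: float) -> None:
    # Sender: stamp the reading with the local clock before writing.
    packet = {"t_sent": time.time(), "temp": temperature}
    conn.sendall((json.dumps(packet) + "\n").encode())

def receive_reading(rfile) -> tuple[float, float]:
    # Receiver: delay = local time at receipt - embedded send time.
    # Any offset between the two clocks shows up directly in the
    # delay, so this is only as accurate as the synchronization.
    packet = json.loads(rfile.readline())
    delay = time.time() - packet["t_sent"]
    return packet["temp"], delay

# Usage on the receiving side (sketch):
#   rfile = conn.makefile("r")
#   temp, delay = receive_reading(rfile)
```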

 

 

Message 6 of 6