I've written a TCP function that has both client and server functionality. It is supposed to "fail over" to another NIC in the same PXI RT chassis if it loses the connection. It handles this well when the connection is severed by the other end; for instance, changing the IP address that the remote "client connection" tries to reach causes error 63 (connection refused by server), which makes my code close all the references and wait to reconnect.

But I can't seem to detect a network cable disconnect on the NIC I am actively using. My server code and the two "clients", which are just listening in their loops, simply time out (error 56). That is fine; they loop around and wait the timeout period for more data. However, I have another loop that waits for outbound data on a queue, determines which port the data is for, selects the correct connection reference, and sends the data out via the TCP Write function. It does this regardless of whether the cable is connected. I thought I might be able to detect the disconnect by reading the "bytes written" output of TCP Write, but it returns the same number (the number of bytes input) whether or not the link is up.

Any ideas on how I can detect the network cable disconnection?
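For context on why TCP Write keeps reporting success: the OS accepts the outgoing bytes into its local send buffer, so the write call returns normally even when the cable is pulled; the stack only notices the dead link later, if at all. One OS-level mechanism that eventually flags such a link is TCP keepalive. Since a LabVIEW block diagram can't be shown in text, here is a minimal Python sketch of that mechanism; the timing values and the `enable_keepalive` helper name are illustrative assumptions, not anything from LabVIEW's TCP palette:

```python
import socket

def enable_keepalive(sock, idle=5, interval=3, probes=2):
    """Turn on TCP keepalive so the OS probes an idle connection and
    eventually errors out a dead link, even though writes keep
    succeeding into the local send buffer.
    idle/interval/probes are hypothetical tuning values (seconds/count)."""
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
    # Linux-specific knobs; guarded because they don't exist on every OS
    if hasattr(socket, "TCP_KEEPIDLE"):
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, idle)
    if hasattr(socket, "TCP_KEEPINTVL"):
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, interval)
    if hasattr(socket, "TCP_KEEPCNT"):
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPCNT, probes)

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
enable_keepalive(s)
# getsockopt returns non-zero once keepalive is enabled
print(s.getsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE))
s.close()
```

With settings like these, a pulled cable surfaces as an error on the connection after roughly idle + interval × probes seconds instead of never. The other common approach is an application-level heartbeat, where each side periodically expects a small message and treats a missed reply as a disconnect.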
Putnam
Certified LabVIEW Developer
Senior Test Engineer North Shore Technology, Inc.
Currently using LV 2012 - LabVIEW 2018, RT 8.5
LabVIEW Champion