TCP/IP C# with LabVIEW

Solved!

Hello everyone,

 

At the moment I'm trying to send a string from C# to LabVIEW, but LabVIEW receives an empty string. It does notice that something was sent, because the listener simply replies, and I've checked within C# itself: there I get the string back in the right format. I'm not sure what I'm doing wrong, so I'll attach my projects so you can have a look.

 

Code sample, since I can't attach the .cs file:

 

// Create a TCP/IP socket.
// (Requires: using System.Net.Sockets; using System.Text; using System.Windows.Forms;)
Socket client = new Socket(AddressFamily.InterNetwork,
    SocketType.Stream, ProtocolType.Tcp);
try
{
    client.Connect("127.0.0.1", 8222);
}
catch (Exception ex)
{
    MessageBox.Show(ex.ToString());
}

// Encode the message as ASCII bytes; the separate 10025-byte buffer was
// dead code, since GetBytes allocates the array itself.
byte[] outStream = Encoding.ASCII.GetBytes("je krijgt iets terug");

client.Send(outStream);
MessageBox.Show("data sent");

 

end of sample.

 

Thanks in advance,

Rinus1993

Message 1 of 12

Hi,

 

Did you try to send the string using one of the LabVIEW shipped examples?

The TCP/IP shipped examples are located in:

 

Program Files\National Instruments\LabVIEW XXXX\examples\comm\TCP.llb\Data Server.vi

 

where XXXX is your LabVIEW version.

Message 2 of 12

Well, yes, and that works. I think the big problem is that C# sends byte arrays while LabVIEW doesn't, or at least some other encoding is used...
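For reference, a LabVIEW string is simply a sequence of 8-bit bytes, so ASCII bytes produced in C# need no special conversion on the LabVIEW side. A minimal console sketch (illustrative only, not part of the attached project) showing that the bytes on the wire are just the characters of the string:

using System;
using System.Text;

class EncodingDemo
{
    static void Main()
    {
        // One ASCII byte per character, the same flat representation
        // a LabVIEW string uses internally.
        byte[] payload = Encoding.ASCII.GetBytes("je krijgt iets terug");

        // Round-tripping the bytes recovers the identical string,
        // so no extra encoding step is involved on either side.
        string roundTrip = Encoding.ASCII.GetString(payload);
        Console.WriteLine(roundTrip);      // prints: je krijgt iets terug
        Console.WriteLine(payload.Length); // 20 bytes, one per character
    }
}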

Message 3 of 12

Because when I put in a breakpoint it does enter the read function, so it does hear something; it just has absolutely no clue what it hears. I'm using LabVIEW 2011.

Message 4 of 12

I think you have to read the bytes header first and then the message.
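If the LabVIEW side first reads a fixed-size length header and then the message body, the C# client has to send that header itself. A hedged sketch, assuming a 4-byte big-endian length prefix; the address, port and message are taken from the original post, and the class name is made up for illustration:

using System;
using System.Net;
using System.Net.Sockets;
using System.Text;

class LengthPrefixedSender
{
    static void Main()
    {
        using (Socket client = new Socket(AddressFamily.InterNetwork,
                                          SocketType.Stream, ProtocolType.Tcp))
        {
            client.Connect("127.0.0.1", 8222);

            byte[] message = Encoding.ASCII.GetBytes("je krijgt iets terug");

            // 4-byte big-endian length header, so the reader can first read
            // 4 bytes, cast them to an I32, and then read exactly that many
            // bytes of message data.
            byte[] header = BitConverter.GetBytes(
                IPAddress.HostToNetworkOrder(message.Length));

            client.Send(header);
            client.Send(message);
        }
    }
}

On the LabVIEW side this corresponds to a TCP Read of 4 bytes, a Type Cast to I32, and a second TCP Read with that count wired to Bytes to Read.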

Message 5 of 12

What do you mean by that: is this in C# or in LabVIEW? Because I tried type casting, but LabVIEW is pretty convinced its output is a string.

Message 6 of 12

I mean in LabVIEW

 

When using the TCP Read VI, if nothing is connected to the Bytes to Read input, the function will not report a timeout error.

Message 7 of 12

 

 

The default value for Bytes to Read is 0, so the VI is looking for 0 bytes and does not throw an error: if no bytes come in, it has read exactly the amount it expected.

Message 8 of 12

Ah, OK, like that. So you think it should work if I wire a constant of 0 bytes to the read function? I indeed didn't connect that.

Message 9 of 12
Solution
Accepted by topic author Rinus1993

No.

 

Bytes to Read should be set to a large number, because the function will read up to the number of bytes specified at Bytes to Read.

 

Try with 10025 bytes
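One small follow-up on the C# side (my own suggestion, not from this thread): once the message has been sent, shut the socket down and close it, so the connection isn't left half-open while the LabVIEW read is waiting on its large byte count. A minimal sketch reusing the address and port from the original post:

using System.Net.Sockets;
using System.Text;

class OneShotSender
{
    static void Main()
    {
        Socket client = new Socket(AddressFamily.InterNetwork,
                                   SocketType.Stream, ProtocolType.Tcp);
        client.Connect("127.0.0.1", 8222);

        client.Send(Encoding.ASCII.GetBytes("je krijgt iets terug"));

        // Signal that no more data is coming, then release the socket.
        // With a large value (e.g. 10025) wired to Bytes to Read, the
        // LabVIEW TCP Read should then return the bytes it did receive
        // (with a "connection closed" error) instead of waiting for the
        // full count to arrive.
        client.Shutdown(SocketShutdown.Both);
        client.Close();
    }
}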

Message 10 of 12