
LabVIEW communication with C using TCP/IP

Hi, 

 

I have custom C code that performs some calculations. I want to use LabVIEW to perform image analysis and sensor measurements, send the data to a C program (which uses custom libraries), and receive the processed data back from it in real time. I am currently testing this system on Mac OS X and compiling my C program with gcc.

 

I looked through the various options and decided to implement TCP/IP, with the LabVIEW program running as a client and the C program acting as a server.

 

To get it working, I am testing a simple LabVIEW/C program combination (attached). The LabVIEW program sends a number to the C program, and I want the C program to print that number in the terminal.

 

My issue is that when I send a float/double to the C server, it does not display the correct number in the terminal (probably due to a type mismatch). Can someone please help me track down the problem?

 

Thanks!

 

ashenoy

 

PS: compile the C program using: gcc -o server2 server2.c

Run the compiled C program and note down the server's port.

Use localhost as the machine name and enter the server's port number from the step above into the LabVIEW VI.
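
The attached server2.c is not reproduced here, but a minimal sketch of that kind of server would look something like this (POSIX sockets, so it builds with plain gcc on OS X; the hex printout and buffer size are just illustrative):

/* Minimal sketch of a one-client TCP server that prints whatever LabVIEW sends.
   Build: gcc -o server2 server2.c   (POSIX sockets; works on OS X and Linux) */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <sys/socket.h>
#include <netinet/in.h>
#include <arpa/inet.h>

int main(void)
{
    int listener = socket(AF_INET, SOCK_STREAM, 0);
    if (listener < 0) { perror("socket"); return 1; }

    struct sockaddr_in addr;
    memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK);  /* listen on localhost only */
    addr.sin_port = 0;                              /* let the OS pick a free port */

    if (bind(listener, (struct sockaddr *)&addr, sizeof(addr)) < 0) { perror("bind"); return 1; }

    socklen_t len = sizeof(addr);
    getsockname(listener, (struct sockaddr *)&addr, &len);
    printf("Server listening on port %d\n", ntohs(addr.sin_port));  /* wire this into the VI */

    listen(listener, 1);
    int client = accept(listener, NULL, NULL);
    if (client < 0) { perror("accept"); return 1; }

    unsigned char buf[64];
    ssize_t n;
    while ((n = recv(client, buf, sizeof(buf), 0)) > 0) {
        /* Print the raw bytes so any type or byte-order mismatch is visible. */
        for (ssize_t i = 0; i < n; i++)
            printf("%02x ", buf[i]);
        printf("\n");
    }

    close(client);
    close(listener);
    return 0;
}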

 

 

 

 

Message 1 of 8

I do not know anything about C - I can just barely spell it. My suspicion is that the conversion of the typecast string back to a numeric value is not working correctly.

 

Can you scale the data in such a manner that you can transmit integers? With integers it would be much easier to verify that the transmission and reception of the data is correct.
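
For example, if the VI sent round(value * 100) as a U16, the C side would only have to undo the scaling; a minimal sketch (the factor of 100 and the U16 width are only assumptions):

#include <stdio.h>
#include <stdint.h>
#include <string.h>
#include <arpa/inet.h>   /* ntohs() */

/* Decode one scaled reading: the VI would send round(x * 100) as a U16,
   which LabVIEW's Typecast puts on the wire in big-endian order. */
static double decode_scaled_u16(const unsigned char bytes[2])
{
    uint16_t raw;
    memcpy(&raw, bytes, sizeof(raw));
    return ntohs(raw) / 100.0;       /* network (big-endian) to host, undo the x100 */
}

int main(void)
{
    const unsigned char example[2] = { 0x04, 0xD2 };  /* 1234, big-endian */
    printf("%.2f\n", decode_scaled_u16(example));     /* prints 12.34 */
    return 0;
}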

 

Lynn

Message 2 of 8

Hi Lynn,

 

OK, so I converted the number's datatype to U8 in LabVIEW and made the appropriate changes in C as well, and now the program shows the correct number.

 

I guess the error might be happening where I am typecasting a double to a char so I can feed it as data to the TCP Write block. Is there some way I can convert a double to a char so I can input it to the TCP Write block (it only accepts char type as input)?

 

Thanks!

 

ashenoy

Message 3 of 8

ashenoy,

 

Try Number to Fractional String from the String >> Number to String Conversion palette. There are several other string conversion functions so check to see if one of those better matches your needs.

 

Lynn

 

number as string.png

Message 4 of 8

LabVIEW's Typecast always assumes big-endian numbers, even on platforms that internally use little endian. Instead of a typecast you should use Flatten To String and set its endianness input to little endian. The same applies when receiving binary data from your C server and unflattening it.

 

Since your Mac OS X computer is almost certainly an x86 machine, its native format is also little endian, so your C program will simply stream data in little-endian format too; a C typecast doesn't change the endianness of a number. LabVIEW, being a multi-platform application that originated on the 68000-based Macintosh, has always used big-endian format on the stream side when flattening and unflattening data to and from native data. That allows data to be exchanged between LabVIEW applications on different hardware platforms without having to worry about the native endianness of the CPU in use.
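
If you do keep the default big-endian stream (Typecast, or Flatten with the default byte order), the C side can also reassemble the 8 bytes itself; a minimal sketch, not based on the attached server:

#include <stdio.h>
#include <string.h>
#include <stdint.h>

/* Decode an 8-byte IEEE 754 double that LabVIEW's Typecast (or Flatten with the
   default byte order) put on the wire big-endian, regardless of the host CPU. */
static double decode_be_double(const unsigned char b[8])
{
    uint64_t bits = 0;
    for (int i = 0; i < 8; i++)
        bits = (bits << 8) | b[i];        /* most significant byte first */

    double value;
    memcpy(&value, &bits, sizeof(value)); /* reinterpret the bit pattern */
    return value;
}

int main(void)
{
    /* 3.5 flattened big-endian: 40 0C 00 00 00 00 00 00 */
    const unsigned char wire[8] = { 0x40, 0x0C, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 };
    printf("%g\n", decode_be_double(wire));   /* prints 3.5 */
    return 0;
}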

 

While the VxWorks-based RT targets are currently the only LabVIEW platforms that still use big-endian format natively, suddenly changing the default format is not an option. Typecast was never changed to allow specifying the desired endianness; Flatten and Unflatten gained this option in about LabVIEW 8.0.

Rolf Kalbermatter
My Blog
Message 5 of 8

Thank you for the inputs, rolfk and Lynn.

 

The issue was that my C program was not able to handle the byte stream when I used the "Flatten To String" function. So I used the "Number To Fractional String" function instead, and converted the string back in C using atof(). This way, I think the solution is endianness independent.
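
For reference, the receive side in C boils down to terminating the received text and handing it to atof(); a stripped-down sketch (the '\n' terminator and buffer size are assumptions, and the socket setup is omitted):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Parse one reading that was sent as text by Number To Fractional String.
   Assumes the VI appends a '\n' so individual readings can be separated. */
static double parse_reading(const char *buf, size_t len)
{
    char text[64];
    size_t n = (len < sizeof(text) - 1) ? len : sizeof(text) - 1;
    memcpy(text, buf, n);
    text[n] = '\0';                 /* atof() needs a NUL-terminated string */
    return atof(text);
}

int main(void)
{
    const char wire[] = "3.141590\n";                    /* example TCP Write payload */
    printf("%f\n", parse_reading(wire, strlen(wire)));   /* prints 3.141590 */
    return 0;
}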

 

I finally got the program to work. I'll paste a diagram here in case it helps someone else:

 

Screenshot 2014-09-19 22.08.24.png

Message 6 of 8

I did a similar project a while back. In it I was not communicating with a C program as you describe; instead I communicated with a device whose code was written in C.

--------------------------------------------------------------------------------------------------------
Kudos are always welcome if you got a solution to some extent.

I need my difficulties because they are necessary to enjoy my success.
--Ranjeet
Message 7 of 8

@ashenoy wrote:

Thank you for the inputs, rolfk and Lynn.

 

The issue was that my C program was not able to handle the byte stream when I used the "Flatten To String" function. So I used the "Number To Fractional String" function instead, and converted the string back in C using atof(). This way, I think the solution is endianness independent.

 

I finally got the program to work. I'll paste a diagram here in case it helps someone else:

 

Screenshot 2014-09-19 22.08.24.png


Well, you got rid of the endianness problem and replaced it with a decimal-separator problem. That is not likely to show up if both computers use English localization settings, but it is definitely going to be a problem if at least one of them is in a locale that uses a decimal comma instead of a decimal point!
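
If the string approach stays, the server can at least be made tolerant of both separators; a minimal sketch (assuming one value per message and no thousands separators):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* atof() in the default "C" locale expects '.', so a sender that formats
   3.14 as "3,14" would be read back as 3.0.  Normalizing the separator
   before parsing keeps the server independent of the sender's locale. */
static double parse_locale_tolerant(const char *text)
{
    char buf[64];
    strncpy(buf, text, sizeof(buf) - 1);
    buf[sizeof(buf) - 1] = '\0';

    char *comma = strchr(buf, ',');
    if (comma)
        *comma = '.';               /* treat a decimal comma as a decimal point */

    return atof(buf);
}

int main(void)
{
    printf("%f\n", parse_locale_tolerant("3,141590"));   /* prints 3.141590 */
    printf("%f\n", parse_locale_tolerant("3.141590"));   /* prints 3.141590 */
    return 0;
}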

Rolf Kalbermatter
My Blog
Message 8 of 8