Passing Unicode string to .NET

I have an application that calls some third-party .NET DLLs. One of the methods I need to use takes a single-character string input. The string type it expects is Unicode (UTF-16). This input is used as a tag of sorts for the data, which would give me 65535 unique tags. I am aware that LabVIEW does not natively support Unicode. I have installed this toolkit, but I feel it has actually set me back, as other strings now come out garbled. I've since disabled Unicode support and stuck to ASCII characters only to mitigate the issues, but I know I will need more than 256 characters in the future.

 

Does anyone have experience using Unicode in LabVIEW that might help me out here? Would it be possible, or maybe required, to integrate with LabWindows/CVI to accomplish this? Any other tips or tricks for using Unicode in LabVIEW?

Message 1 of 7

Do you need to send/read/identify tags, or is displaying the string as Unicode important?

Would it be possible to send regular 2-character strings? You can convert two characters to a single U16 number (String to Byte Array -> Join Numbers) and back.
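In C# terms, that round trip amounts to something like the sketch below (this is only an illustration of the arithmetic; it assumes the first character is wired to the high-byte input of Join Numbers, so adjust if your wiring differs):

        // Combine two 8-bit characters into one U16 tag value
        // (the equivalent of String to Byte Array -> Join Numbers).
        ushort TagFromTwoChars(string s)
        {
            byte hi = (byte)s[0];
            byte lo = (byte)s[1];
            return (ushort)((hi << 8) | lo);
        }

        // And back again (Split Number -> Byte Array to String).
        string TwoCharsFromTag(ushort tag)
        {
            return new string(new char[] { (char)(tag >> 8), (char)(tag & 0xFF) });
        }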

Message 2 of 7

I never need to display the strings, but sometimes it's good to be able to confirm that they are as expected. I don't think I quite understand your second question.

Message 3 of 7

It would depend a bit on how the method exposes this string. Most likely it exports the method in a way that declares this parameter as a string. Then you have a problem, since the LabVIEW .NET integration will automatically add code to convert the native LabVIEW string into the .NET Unicode string, and I do not see any way to circumvent that. If it exports the parameter as an object reference, you have more options, as you could create the string object explicitly and then, for instance, assign the Unicode contents through one of its array constructors.

Since it only uses a single character from that string, I feel the developer was somewhat lazy in defining this as a string; an explicit ushort would have been a better choice.
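A minimal C# sketch of that explicit construction, using the char array constructor of System.String (in LabVIEW this would be a .NET constructor node; the tag value shown is just an example):

        // Build a .NET string explicitly from a UTF-16 code unit,
        // e.g. a single tag value 0x263A:
        char[] codeUnits = new char[] { (char)0x263A };
        string tag = new string(codeUnits);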

Rolf Kalbermatter
My Blog
Message 4 of 7

Hello nzamora.

 

You can add this line to the LabVIEW.ini file located in the LabVIEW folder in Program Files:

 

 UseUnicode=TRUE

 

Or try following the process in this KB article:

 

Displaying Non-English Characters in LabVIEW

http://digital.ni.com/public.nsf/allkb/91A1863834F4B1A0862575670067D15C?OpenDocument

 

But as you already know, this is not supported in LabVIEW, so it might not work as you expect.

 

Diego H

National Instruments

 

Message 5 of 7

It is exposed as a string, not an object reference, so there is no circumventing it that way. I've spoken with the developers, and they said they do not currently support multi-character tags, but it is in the plans for the future; hence their use of a standard string rather than a ushort.

 

I had an idea to write my own C# wrapper that I could pass a byte array into, which would then convert it to a Unicode string. Does this sound like a plausible workaround?

Message 6 of 7

In fact, the strictly typed interface of .NET assemblies is a drawback in this case. If it were a normal DLL function, you would simply configure the parameter to be a LabVIEW 16-bit integer array in the Call Library Function Node and be done with it. The drawback there is that you can make lots of mistakes and LabVIEW is not able to prevent you from doing so.

 


@nzamora wrote:

I had an idea to write my own C# wrapper that I could pass a byte array into, which would then convert it to a Unicode string. Does this sound like a plausible workaround?

Something along these lines should work (I would make it a ushort array instead of a byte array):

 

 

        public int MyWrappedFunction(ushort[] words)
        {
            // Reinterpret the 16-bit values as UTF-16 code units
            // (sizeof(ushort) == sizeof(char) == 2, so the counts match).
            char[] chars = new char[words.Length * sizeof(ushort) / sizeof(char)];
            System.Buffer.BlockCopy(words, 0, chars, 0, words.Length * sizeof(ushort));
            string str = new string(chars);

            return RealFunction(str);
        }
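From LabVIEW you would then wire a plain U16 array to the wrapper method. As a rough C# illustration of the call site (the wrapper instance name and tag value are made up for the example):

        // Hypothetical call: one UTF-16 code unit as the tag value.
        ushort[] tag = new ushort[] { 0x263A };
        int result = wrapper.MyWrappedFunction(tag);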

 

Rolf Kalbermatter
My Blog
Message 7 of 7