
What is the best way to convert a cluster into byte array or string


I'm writing a program that sends UDP packets, and I've defined the data I want to send as large clusters (with u8/u16/u32 numbers, u8/u16/u32 arrays, and nested clusters). Right before sending the data, I need to convert the clusters into either strings or byte arrays. The Flatten To String function is almost perfect for this purpose. However, it prepends lengths to arrays and strings, which renders this method useless as far as I can tell.

 

As I have many of these clusters, I would rather not hard-code an Unbundle By Name plus the conversion/typecasting to byte arrays or strings for each one.

 

Is there a feature or tool I am overlooking? 

 

 

Thank you! 

0 Kudos
Message 1 of 10
(2,117 Views)

Re: What is the best way to convert a cluster into byte array or string

Hi mkirzon,

 

What about Type Cast?

 

You still have to know the datatype on the receiver side: you should always prepend some message header to signal the content of the UDP message…
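To illustrate the header idea in text form, here is a rough Python sketch (not LabVIEW) of prepending a one-byte message-type tag so the receiver knows how to decode the flattened cluster that follows. The tag values and function names are made up for illustration.

```python
import struct

# Hypothetical message-type tags for two different cluster layouts.
MSG_STATUS, MSG_CONFIG = 0x01, 0x02

def make_packet(msg_type: int, flattened: bytes) -> bytes:
    """Prepend a one-byte type tag to the flattened cluster data."""
    return struct.pack(">B", msg_type) + flattened

def parse_packet(packet: bytes):
    """Split a packet back into its type tag and payload."""
    (msg_type,) = struct.unpack(">B", packet[:1])
    return msg_type, packet[1:]

pkt = make_packet(MSG_STATUS, b"\x00\x2a")
assert parse_packet(pkt) == (MSG_STATUS, b"\x00\x2a")
```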

Best regards,
GerdW
CLAD, using 2009SP1 + LV2011SP1 + LV2017 on Win7+cRIO
Kudos are welcome 😉

0 Kudos
Message 2 of 10
(2,103 Views)

Re: What is the best way to convert a cluster into byte array or string

Why is it a problem that it prepends the string/array size? The Unflatten function should still be able to decode it back into a cluster. There's also an input on the Flatten function that can turn off prepending the string/array size.

 

I believe you can also type cast to a string or an array of bytes.

 

If you're using LV2013, you can flatten to JSON, which is a more human-readable format (but obviously not as efficient in terms of data transferred).

 

When we do TCP/UDP reads, we normally prepend an integer holding the packet length so we know how much data to read from the port for our message: we wait for 4 bytes, convert them to an integer, and then read that many bytes to get the full message.
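That length-prefixed framing can be sketched in Python (as an analogy, not LabVIEW) like this; a 4-byte big-endian length is assumed, matching LabVIEW's default byte order, and the function names are illustrative.

```python
import struct

def frame(payload: bytes) -> bytes:
    """Prepend a 4-byte big-endian length header to the payload."""
    return struct.pack(">I", len(payload)) + payload

def deframe(stream: bytes) -> bytes:
    """Read the 4-byte length first, then take exactly that many bytes."""
    (length,) = struct.unpack(">I", stream[:4])
    return stream[4:4 + length]

msg = frame(b"hello")
assert deframe(msg) == b"hello"
```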



Certified LabVIEW Architect (CLA), Certified LabVIEW Embedded Systems Developer (CLED) and Certified TestStand Developer (CTD)
MediaMongrels Ltd. - NI Alliance Partner | GDevCon :: 4-5th September 2018 :: Cambridge, UK
0 Kudos
Message 3 of 10
(2,042 Views)

Re: What is the best way to convert a cluster into byte array or string

Anything embedded in the cluster will have the lengths in there when you flatten.  But is that really an issue?  How else are you going to be able to tell how many elements are in your array or bytes in your string?  That information is there because it is important.

 

Just use the Flatten To String and Unflatten From String and be done with it.  An extra 4 bytes added to an array/string will be minor if your clusters are as large as you say.


There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
0 Kudos
Message 4 of 10
(2,013 Views)

Re: What is the best way to convert a cluster into byte array or string

Thanks for responses!

The program I'm developing communicates with/polls an external device that has a well-documented protocol. The messages I constructed as clusters contain the exact components the other device expects. Thus, if my program inserts any additional info (such as these array/string lengths), the external device's parser will not recognize the messages and will return an error.
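As a Python analogy (the field names and layout here are hypothetical, not from the actual device protocol), the mismatch looks like this: the device expects only the raw field bytes, while Flatten To String would insert length headers before the arrays.

```python
import struct

# Suppose the documented message is: u16 command, u32 value, 4 x u8 payload.
command = 0x0102
value = 0xDEADBEEF
payload = [1, 2, 3, 4]

# Exact wire format, big-endian, no lengths prepended:
wire = struct.pack(">HI4B", command, value, *payload)
assert len(wire) == 10   # 2 + 4 + 4 bytes, nothing extra

# Flatten To String (with default settings) would instead insert a 4-byte
# array-length header before the payload, producing 14 bytes that the
# device's parser would reject.
```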

 

Regarding Type Cast: can you send a cluster through a Type Cast? I haven't been able to do so; the terminal and sink are incompatible.

0 Kudos
Message 5 of 10
(1,979 Views)

Re: What is the best way to convert a cluster into byte array or string

Flatten to string has a boolean input of "Prepend string or array size" ... The default value is true.

 

 

0 Kudos
Message 6 of 10
(1,965 Views)

Re: What is the best way to convert a cluster into byte array or string

Unfortunately, when the cluster contains arrays, that boolean input only affects the top level: any arrays/strings inside it or in its nested clusters will still get a prepended length.

0 Kudos
Message 7 of 10
(1,960 Views)
Solution
Accepted by topic author mkirzon
08-27-2015 04:09 PM

Re: What is the best way to convert a cluster into byte array or string


mkirzon wrote:

The program I'm developing communicates with/polls an external device that has a well-documented protocol. The messages I constructed as clusters contain the exact components the other device expects. Thus, if my program inserts any additional info (such as these array/string lengths), the external device's parser will not recognize the messages and will return an error.


Unfortunately, you are going to have to unbundle your cluster and build up the individual strings to concatenate.


There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
0 Kudos
Message 8 of 10
(1,934 Views)
Solution
Accepted by topic author mkirzon
08-27-2015 04:09 PM

Re: What is the best way to convert a cluster into byte array or string

An alternate approach is to replace the embedded arrays with clusters containing exactly the right number of elements. Since the cluster size is fixed, even a nested cluster will not have a size prepended. This will also allow you to use Type Cast. You can put a cluster through Type Cast, but only if the cluster contains no variable-sized elements (strings or arrays).
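A rough Python analogy of this fixed-size-cluster trick (the layout and names are made up for illustration): once every field has a fixed size, the whole record flattens to a fixed number of bytes with no size headers, so it can be cast in both directions.

```python
import struct

FMT = ">H4B"   # u16 field + a "cluster" of exactly four u8 elements

def to_bytes(header, elems):
    """Flatten the fixed-size record; no length headers are needed."""
    return struct.pack(FMT, header, *elems)

def from_bytes(data):
    """Cast the bytes back; the fixed layout makes this unambiguous."""
    header, *elems = struct.unpack(FMT, data)
    return header, elems

raw = to_bytes(0xABCD, [10, 20, 30, 40])
assert len(raw) == struct.calcsize(FMT) == 6   # fixed size, no headers
assert from_bytes(raw) == (0xABCD, [10, 20, 30, 40])
```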

Message 9 of 10
(1,921 Views)

Re: What is the best way to convert a cluster into byte array or string


deceased wrote:

Flatten to string has a boolean input of "Prepend string or array size" ... The default value is true.

 

 


That only specifies whether a string or array size should be prepended when the outermost data element is a string or array. For embedded strings or arrays it has no influence. Those sizes are needed for Unflatten to be able to reconstruct the embedded strings and arrays.

 

The choice to represent the "Strings" (and Arrays) in the external protocol as LabVIEW strings (and arrays) is actually a pretty bad one, unless some other element in the cluster defines the length of the string. An external protocol always needs some means of determining how long an embedded string or array is, in order to decode the subsequent elements correctly.

 

Possible choices here are therefore:

 

1) some explicit length in the protocol (usually prepended to the actual string or array)

2) a terminating NULL character for strings (not very friendly for reliable protocol parsing)

3) A fixed size array or string

 

For numbers 1) and 2) you would always need to do some special processing, unless the protocol happens to use explicit 32-bit integer length indicators directly prepended to the variable-sized data.

 

For number 3) the best representation in LabVIEW is actually a cluster with as many elements inside as the fixed size.
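The three schemes above can be sketched in Python (as an analogy; the formats and function names are illustrative, not from any specific device protocol) as three ways of pulling a string field out of a byte stream:

```python
import struct

def decode_length_prefixed(buf):
    """Option 1: an explicit u32 length prefix before the field."""
    (n,) = struct.unpack(">I", buf[:4])
    return buf[4:4 + n], buf[4 + n:]

def decode_nul_terminated(buf):
    """Option 2: scan for the terminating NULL character."""
    end = buf.index(0)
    return buf[:end], buf[end + 1:]

def decode_fixed(buf, size):
    """Option 3: a fixed-size field; no marker needed at all."""
    return buf[:size], buf[size:]

assert decode_length_prefixed(b"\x00\x00\x00\x02hi!")[0] == b"hi"
assert decode_nul_terminated(b"hi\x00!")[0] == b"hi"
assert decode_fixed(b"hi!", 2)[0] == b"hi"
```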

 

 

Rolf Kalbermatter
Averna BV
LabVIEW Architect | LabVIEW Champion | LabVIEW Instructor
Message 10 of 10
(1,884 Views)