02-02-2012 03:24 PM
Hi all,
I am experiencing strange behaviour in LabVIEW (v2011, 64-bit) when I try to pass it a byte array from a C# assembly written in Visual Studio 2010.
The output data type of the C# method I call is correctly defined as byte[] in the assembly, and the array is correctly dimensioned and filled with the characters acquired from a serial port (when I debug the assembly code, I can see that the characters received on the serial port are correctly stored in the byte array before being passed to LabVIEW).
The problem is that, when the array arrives in LabVIEW, it appears as an array of U16 instead of the expected array of U8!
Can anyone offer a hint to solve this problem?
I am new to developing LabVIEW VIs that talk to C# assemblies and I don't have enough experience to solve this on my own; any help will be appreciated.
02-07-2012 08:11 PM
02-07-2012 11:21 PM
Dear Mike,
thank you for your reply.
The data bytes in the erroneous U16 output array were arranged so that pairs of adjacent bytes from the original byte array were packed into the same 16-bit U16 value in a little-endian fashion; as an example, if the original byte array coming from the invoked C# assembly method were [0x01, 0x02, 0x03, 0x04, ...], LabVIEW returned [0x0201, 0x0403, ...].
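To make the packing explicit, here is a small stand-alone C# snippet (my own illustration, not code from the assembly) that reproduces the same byte pairing:

using System;

class PackingDemo
{
    static void Main()
    {
        // The byte stream as it left the C# method...
        byte[] original = { 0x01, 0x02, 0x03, 0x04 };

        // ...re-read two bytes at a time as little-endian UInt16 values,
        // which reproduces exactly what LabVIEW showed: 0x0201, 0x0403.
        for (int i = 0; i < original.Length; i += 2)
        {
            ushort packed = (ushort)(original[i] | (original[i + 1] << 8));
            Console.WriteLine("0x{0:X4}", packed);
        }
    }
}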
However, in the meantime I have already found a workaround: I now simply pre-allocate a byte array of the expected answer size and pass it as an additional input argument to the C# assembly method, which simply fills it with the data coming from the external peripheral.
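The revised method looks roughly like the sketch below. This is only a sketch: the class name, port settings and return type are illustrative, and I am assuming the System.IO.Ports.SerialPort class for the serial communication.

using System.IO.Ports;

public class BusPiratePort
{
    private readonly SerialPort port;

    public BusPiratePort(string portName)
    {
        // 115200 baud is assumed here; the real assembly may configure the port differently.
        port = new SerialPort(portName, 115200);
        port.Open();
    }

    // Caller-allocated-buffer pattern: LabVIEW pre-allocates "answer"
    // (e.g. with Initialize Array) and passes it in; the method only fills it.
    public void writeThenRead(byte[] command, byte[] answer, int timeout)
    {
        port.ReadTimeout = timeout;
        port.Write(command, 0, command.Length);

        int offset = 0;
        while (offset < answer.Length)
        {
            // SerialPort.Read may return fewer bytes than requested, so keep
            // reading until the buffer is full (a TimeoutException is thrown
            // if the device stops answering).
            offset += port.Read(answer, offset, answer.Length - offset);
        }
    }
}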
Nonetheless, if possible, I would like to understand the reason for LabVIEW's previous strange behaviour.
Thank you again for your willingness to help.
Regards
Francesco - Italy
02-08-2012 08:22 AM
How are you calling the C# code? From your last post it sounds like you are using a Code Interface Node. If that is the case, then what you described as your fix isn't a workaround. You should always pre-allocate memory when returning values of indeterminate size (arrays, strings). If you don't, you can start getting odd random crashes, sometimes right away, sometimes when you go to shut down LabVIEW.
A second issue I just thought of is that there might be a terminology problem here. Back in the day, 8-bit values were called integers, 16-bit values were long integers and 32-bit values were double integers. Somewhere along the way though, compilers started calling 16-bit values integers with 32-bit values becoming long integers and 64-bit values being dubbed double integers. Under this neoterism, 8-bit values became short integers.
This all goes to point out that defining datatypes in other languages is sometimes an inexact art.
Mike...
02-08-2012 08:42 AM
If the OP is using the CLFN, then that's definitely wrong, since that's not the right way to call .NET assemblies. The .NET functions should be used.
Without seeing some code, I suspect the guess about the terminology is a likely candidate here.
02-09-2012 08:48 AM
To be clear, I am calling the .NET assembly using the "Constructor Node" and the "Invoke Node" available in the .NET Connectivity palette; also, I am sure that the datatype "byte" in C# corresponds to an 8-bit value.
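Just to double-check the size question, this trivial stand-alone snippet (not part of the assembly) prints the sizes that C# guarantees for its integral types:

using System;

class TypeSizes
{
    static void Main()
    {
        // In C# the integral type sizes are fixed by the language specification,
        // so "byte" is always exactly 8 bits regardless of platform or compiler.
        Console.WriteLine(sizeof(byte));   // 1
        Console.WriteLine(sizeof(short));  // 2
        Console.WriteLine(sizeof(int));    // 4
        Console.WriteLine(sizeof(long));   // 8
    }
}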
However, to dispel any ambiguity, I am attaching to this message an archive with the source code of the C# assembly (which I developed further in MS Visual Studio 2010 Professional, starting from the code attached to this post: http://forums.ni.com/t5/LabVIEW/C-Constructor-being-called-multiple-times-when-I-run-my-VI/td-p/1683...) together with some LabVIEW VIs developed in LabVIEW 2011.
The purpose of the C# assembly is to provide a simple way to interface LabVIEW to the Bus Pirate v3, an open-hardware tool for connecting a PC to a large variety of serial-enabled chips/peripherals (http://dangerousprototypes.com/bus-pirate-manual/).
The LabVIEW project in the archive contains a VI that serves as a basic communication tester (almost the same as the original work I found at the link above) and some VIs I developed to communicate with a few I2C and SPI ICs (a Microchip MCP23017 16-bit I2C I/O expander, a Maxim MAX31855 cold-junction-compensated thermocouple-to-digital converter and an ST LSM303DLM 3-axis accelerometer/magnetometer).
However, please note that pre-allocating the byte array for the expected answer (readThenWrite.vi) has solved my original problem; I am posting the code here only to try to understand the origin of the problem itself and to give others the opportunity to use the Bus Pirate from LabVIEW.
Please also note that the attached code is by no means complete; I will continue to develop it day by day to adapt it to my needs.
Any comments/hints will be appreciated.
Francesco – Italy
02-09-2012 09:06 AM
The signatures of your functions indicate that the byte arrays are not outputs of your functions, as you initially claimed. They are inputs to the functions, i.e., arguments. That's a very important distinction. Since they are arguments, the .NET assembly expects the calling program to allocate the memory for these buffers. Thus, you *have* to pre-allocate in LabVIEW. This is by design.
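To put the distinction in code (the method names below are just illustrative, not taken from your assembly):

public class ByteArrayPatterns
{
    // 1) byte[] as the RETURN value: the method allocates the array itself
    //    and LabVIEW simply receives whatever comes back.
    public byte[] ReadAsReturnValue(int count)
    {
        byte[] data = new byte[count];
        // ...fill "data" from the device...
        return data;
    }

    // 2) byte[] as an ARGUMENT: the caller (LabVIEW) must allocate the buffer
    //    first, e.g. with Initialize Array, and the method only fills it in place.
    public void ReadIntoCallerBuffer(byte[] buffer)
    {
        // ...fill "buffer" from the device...
    }
}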
02-09-2012 11:19 AM
Dear smercurio_fc,
please note that in this version of the writeThenRead method the byte array "answer" is an input argument, but in its first version (the one that originally caused the problem with the returned data type) it was the return value; in fact the function prototype was something like:
byte[] writeThenRead(byte[] command, int expectedAnswerLength, int timeout)
Debugging the assembly code in Visual Studio, I saw the byte array leaving the assembly method correctly, but when it arrived in LabVIEW it appeared as a U16 array with the original bytes packed in a little-endian fashion; moreover, forcing the U8 representation in LabVIEW caused a loss of data.
With this new version of the writeThenRead method I have solved my problem, but I am curious to know the reason for that strange behaviour of LabVIEW.
Thank you for your interest in this problem and for any other suggestions you may have.
Francesco
02-09-2012 12:07 PM - edited 02-09-2012 12:07 PM
That wasn't at all clear from your messages. OK, I changed the function to this:
// Write a command to the serial port, then wait for expectedAnswerLength bytes
// for a maximum of timeout milliseconds (stubbed here to return fixed data).
public byte[] writeThenRead(byte[] command, int expectedAnswerLength, int timeout)
{
    return new byte[5] { 0, 1, 2, 3, 4 };
}
and recompiled the assembly. I loaded it up in LabVIEW 2011, but 32-bit on Windows XP. I placed a constructor followed by the Invoke Node, calling that method. When I right-clicked on the "writeThenRead" parameter (i.e., the output), I got a U8 array created, not a U16. I do not know if this is due to a 64-bit issue. I will have to see if I can get a 64-bit platform running later to test this.
Aside: I'm not sure why you're going to all this trouble in the first place. Looking at the C# code, all you're doing is sending byte arrays over the serial port. You can do this in VISA just as well, and much more easily:

02-10-2012 02:36 AM
Sorry, I thought that my first post was clear enough; evidently I was wrong.
The prototype of writeThenRead you implemented is exactly the same as the one that originally caused my problem! At this point I believe it was due to the 64-bit architecture of my system.
As for VISA, I have avoided using it in this project simply because the goal is to use the assembly from any language that can access .NET assemblies, not only from LabVIEW; so I want to concentrate all of the communication with the hardware at the assembly level.
Thanks again for your kind support; greetings