I'm a newbie to LabVIEW. I have tried the basic serial port and advanced serial port examples, and written my own serial port VIs. My question is about VISA Read: how exactly does it read data? It has one input (byte count) and one output (return count). Does that mean VISA Read may not read the exact number of bytes the user configures (even in the case where there is no timeout or error)? For example, if the input is 20 bytes, the output may not be 20 bytes (I have run my own tests and seen this result sometimes). Also, is VISA actually reading a buffer maintained by the Windows OS rather than reading the hardware directly? I'm also quite confused about setting the timing inside the while loop. Currently the baud rate of the hardware is 9600 bps; each transmission is one frame, one frame contains 20 bytes, and the frame-to-frame interval is 2 ms. I hope someone can give me some guidelines or an explanation of how VISA Read reads data. I'm using LabVIEW 8.6.
You may find a better answer from someone else, but here is my answer from the LabVIEW Help regarding your question about return count.
My suggestions about your program: why have you used an event structure and some other stuff? Why not follow the basic example here, which would be the easy one: http://zone.ni.com/devzone/cda/epd/p/id/2669
The help file pretty much explains the byte count. If the instrument you are talking to uses a termination character to indicate the end of a message then you'd configure the VISA session to use a termination character to end reads. When VISA Read sees the termination character it will stop reading. Thus, in this case you usually wire a large value to the VISA Read. If the instrument does not use a termination character then you have to disable the use of termination characters and wire an explicit value for the bytes to read. This is the case when the instrument you are talking to does not use text-based messages, but rather a sequence of hex values. You do not want the VISA Read to stop reading when it sees a linefeed character, since that could be part of the message. Based on the code you have shown it appears that you have the latter type of communication. Thus, you should be disabling termination - there is an input to the VISA Configure Serial Port VI.
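Since LabVIEW code is graphical, here is a rough Python sketch of the two read modes described above. The `visa_style_read` helper is hypothetical, not the actual VISA driver, and an in-memory stream stands in for the serial port; the point is only to show why a terminated read can return fewer bytes than requested while a binary read should not.

```python
import io

def visa_style_read(stream, byte_count, term_char=None):
    """Read up to byte_count bytes, stopping early at term_char if one is set.

    Returns (data, return_count), mirroring VISA Read's "read buffer"
    and "return count" outputs.
    """
    data = bytearray()
    while len(data) < byte_count:
        b = stream.read(1)
        if not b:               # no more data (a real driver would wait or time out)
            break
        data += b
        if term_char is not None and b == term_char:
            break               # termination character ends the read early
    return bytes(data), len(data)

# With a terminator: the read stops at the linefeed, not at byte 100.
text_stream = io.BytesIO(b"*IDN?,ACME,1234\n more data")
data, count = visa_style_read(text_stream, 100, term_char=b"\n")

# Without a terminator: exactly byte_count bytes are read (when available),
# even if a 0x0A byte happens to be part of the binary message.
bin_stream = io.BytesIO(bytes([0xBA, 0xAB, 0x0A, 0x55] * 10))
data2, count2 = visa_style_read(bin_stream, 20)
```

This is why, with termination enabled, "return count" can legitimately be smaller than "byte count" even without a timeout.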
As to the code: what are you doing? You are abusing local variables, and trying to write LabVIEW code as if it were text-based code. LabVIEW does not execute left to right or top to bottom or any other direction. The Test Port button seems to have a bout of schizophrenia. It seems to be for starting testing as well as changing baud rate. Your event case is handling a value change for that button, which means it will execute whether you change it from true to false or false to true. If all you're doing is simply setting up a monitor then you only need one loop. The code in the second loop can go into the Timeout case of the event structure (sans sequence frame).
Also, what are you doing with the changing of the string that you're getting from VISA Read? Your comment over the for-loop says "ASCII to hex", which makes no sense. Besides, you do not need to convert to string in order to see if a sequence of bytes exists in an array.
Thanks for reviewing my code. I know it's quite confusing, both because you don't know much about the objective of this software and because of my inexperienced coding style. To make it clearer, I'll explain here what the code is doing. Basically, this code is only part of my sub-VI. My hardware is a serial port server, and my main objective is to collect data from multiple serial ports over Ethernet. This sub-VI is for the user to configure the settings, especially the port allocation (for example, Port 1 is allocated to Temperature, Port 2 is allocated to Pressure, etc.). My running code will tell the user which port should be used. If the user allocates Port 1 to the Pressure device, but that port actually belongs to Temperature, a message tells the user so.
You are correct. The data sent from the hardware is in hex format, not ASCII format. That's why I used the "ASCII to hex" for loop: the data I read from VISA Read is ASCII, and I need to convert it to a hex string format. For example, my hardware sends 55 (hex), but VISA Read collects it as the ASCII character "U". Another reason I used "ASCII to hex" is that afterwards I need to do analysis, and I considered a text string easier and more convenient to analyze. The frame data (20 bytes) actually contains a header and an ID. I need to find the header first (BAAB) and then look for the ID. As you can see inside my while loop, I check the ID and tell the user which port it belongs to ("058B" - Port Matched!!, "058C" - This Port is Load Bank, etc.). I'm not sure my concept is correct here. My thinking is that the event structure triggers once the user clicks the "TEST PORT" or "TEST ALL PORTS" button; then all the settings update local variables inside the while loop, and the data analysis starts. But at the moment I'm quite confused about controlling the timing. I do understand this code does not run left to right or in any particular direction, and I know the while loop keeps running even when the user has not clicked any button. Please give me some comments on the best architecture for this. I'm attaching my latest sub-VI code for review.
FYI, in my latest sub-VI code the user has two options to test the ports: individually, or test all automatically. Once again, thanks for your time reviewing my code.
One suggestion from my side: you are not stopping the while loop when you have an unknown error in communication. You have an error wire, and it will indicate the error if something goes wrong in the communication or elsewhere. So, OR the Boolean in the error cluster with your customized stop button.
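In text form, the suggested wiring is simply an OR of the two Booleans. This Python stand-in is only an illustration; the dict mimics LabVIEW's error cluster with its status/code/source fields, and the values shown are made up:

```python
def should_stop(stop_button, error_cluster):
    # Stop when the user presses Stop OR the error cluster's status is True
    return stop_button or error_cluster["status"]

# A communication error should end the loop even if Stop was never pressed
err = {"status": True, "code": -1, "source": "VISA Read (example)"}
ok = {"status": False, "code": 0, "source": ""}
```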
No, no, no, no, no!!!! You just made the code 6x more confusing and complicated by adding more loops instead of fewer! Please look over standard programming architectures, such as the state machine and the producer-consumer. In your case the producer-consumer should work. Your producer handles the UI and sends messages to the consumer to tell it what to do, like "test port 1", "test port 2". Do not create a "test all ports" message - just queue up the individual "test port x" messages. As for reading, you can handle that in the consumer loop as well with a "default" case. You'd need just a for-loop to iterate through all the ports. Inside the for-loop you'd have a case structure to read/not read based on whether the port is "on".
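A text-language analogue of the producer-consumer pattern described above, sketched in Python with a thread and a `queue.Queue` standing in for the two LabVIEW loops and the queue between them. The message names and the port list are placeholders, not anything from the actual VI:

```python
import queue
import threading

msgs = queue.Queue()   # the queue wired between the two loops

def producer():
    # UI loop: "Test All Ports" just queues the individual messages
    for port in (1, 2, 3):
        msgs.put(("test port", port))
    msgs.put(("quit", None))

tested = []

def consumer():
    while True:
        try:
            cmd, port = msgs.get(timeout=0.1)
        except queue.Empty:
            # "default" case: background reading of the active ports goes here
            continue
        if cmd == "quit":
            break
        if cmd == "test port":
            tested.append(port)   # stand-in for the real port test

t = threading.Thread(target=consumer)
t.start()
producer()
t.join()
```

Note how "test all" never needs its own case: the producer just enqueues each "test port" message, and the consumer processes them one at a time.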
You are correct. The data sent from the hardware is in hex format, not ASCII format. That's why I used the "ASCII to hex" for loop: the data I read from VISA Read is ASCII, and I need to convert it to a hex string format. For example, my hardware sends 55 (hex), but VISA Read collects it as the ASCII character "U".
VISA Read does no such thing. You have a misconception as to how VISA Read works. VISA Read reads bytes. It doesn't read characters or letters. That's why the input is called "byte count" and not "character count" or "letter count" or "number of ASCII characters". If VISA Read reads a single byte and it happens to be 0x55, and you put a string indicator on the output, the string indicator will display the value based on the display format setting of the string indicator. If it's set to "normal", then it assumes the bytes refer to ASCII codes. If it's set to "hex", then it displays the value as hex values. You are under the misconception that it's simply reading ASCII characters.
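To put the same idea in Python terms (an analogy, not what LabVIEW does internally): the byte itself never changes; only the display of it does. The same 0x55 byte can be *shown* as the character "U" or as the hex digits "55":

```python
raw = bytes([0x55])            # one byte read from the port

# "Normal" display: interpret the byte as an ASCII code -> the letter U
print(raw.decode("ascii"))     # U

# "Hex" display: show the very same byte as hex digits -> 55
print(raw.hex().upper())       # 55
```

Changing the indicator's display format in LabVIEW is the graphical equivalent of choosing between these two views; no "conversion" of the data happens in the read itself.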
Using text strings is not "more easier and convenient" - it's just a waste of CPU cycles in this case. You shouldn't need to do any kind of search if you have a pre-defined header. Once you convert the string VISA Read returns to a byte array, you know exactly which bytes to look at for the port. However, if you do have to search for the hex sequence BAAB, then you don't need to do any kind of conversion, and you can simply work directly on the string:
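As a Python analogy of searching the raw data directly (the byte layout, with the ID immediately after the header, is assumed from the description in this thread; the sample bytes are made up):

```python
# Raw bytes as VISA Read would return them -- no hex-string conversion needed
frame_stream = bytes([0x01, 0x02, 0xBA, 0xAB, 0x05, 0x8B, 0x99, 0x98])
header = bytes([0xBA, 0xAB])

pos = frame_stream.find(header)      # search the raw data directly
if pos >= 0:
    # Assumption: the 2-byte ID immediately follows the header
    port_id = frame_stream[pos + 2 : pos + 4]
    print(port_id.hex().upper())     # 058B
```

The search runs on the bytes as-is; converting every byte to a two-character hex string first only doubles the data and the work.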
Hi smercurio_fc, thanks for your guidance. I'm studying the producer/consumer example and working on it. Yes, VISA Read reads bytes; it's just that by default the indicator displays ASCII, and the user can right-click and change the display to hex mode. What I was referring to is the output of VISA Read (the read buffer pin), which is a string. Regarding the "ASCII to hex" for loop: if I use Type Cast following your example code, I find my data is limited by the size of an unsigned byte, and it cannot scan all the frame data in one shot. Is there anything I'm missing? Please see image2.jpg.
Actually, the data sent from the device is asynchronous; I'm not sure when the device starts sending, so the first byte of data I receive may not be the header hex BAAB. Therefore, my intention is to read one big block (about 3 frames, each frame 42 bytes, total about 135 bytes). From there I start to scan for the header hex BAAB and then the ID. Please see image1.jpg.
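That scan can be sketched in Python like this (the frame length and header come from the description above; `extract_frames` is a hypothetical helper and the simulated capture is made up to show an unaligned start):

```python
FRAME_LEN = 42
HEADER = bytes([0xBA, 0xAB])

def extract_frames(buffer):
    """Scan an unaligned capture for complete frames starting at the header."""
    frames = []
    pos = buffer.find(HEADER)
    while pos >= 0 and pos + FRAME_LEN <= len(buffer):
        frames.append(buffer[pos:pos + FRAME_LEN])
        # Assumption: frames are back-to-back after the first header
        pos = buffer.find(HEADER, pos + FRAME_LEN)
    return frames

# Simulated capture: a few stray bytes, then three 42-byte frames
frame = HEADER + bytes([0x05, 0x8B]) + bytes(38)   # header + ID + padding
capture = bytes([0x11, 0x22, 0x33]) + frame * 3
frames = extract_frames(capture)
# each recovered frame's ID sits at bytes 2..3, right after the header
```

Reading a block larger than one frame and resynchronizing on the header, as described above, is exactly what this does; partial frames at the end of the buffer are simply dropped.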
VISA Read uses a string datatype as its output because that's the simplest way to have a buffer of unknown length. As for the buffer itself, is the string indicator you show in image1 directly connected to the output of VISA Read? What is the display format of that string indicator? Is it "normal"? It almost seems as if it is, because if it were set to hex then there should be a space after every 4 letters. If it's set to "normal", then your device is not sending hex values, but actual ASCII characters (i.e., the letter "F" as opposed to the hex value F), contrary to what you initially stated in message #4.