LabVIEW

visa serial port and while loop

Solved!
The VISA Bytes at Serial Port is not okay. It's unnecessary, as is the wait. As you point out from the help, you can specify a high byte count and the VISA Read will simply start reading and terminate automatically at the termination character. The timeout setting is sufficient; there is no need for a fixed delay in the code. In fact, the wait-plus-bytes-available pattern can actually produce an incomplete read when the instrument sends long responses.
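
For readers following along in text form, here is a minimal sketch of that pattern using Python and PyVISA (a LabVIEW diagram can't be pasted inline); the resource name, baud rate, and command string are placeholders, not anything from the original VI:

    import pyvisa

    # Configure the port the way described above: termination character
    # enabled, a sane timeout, and no Bytes at Port or fixed waits.
    rm = pyvisa.ResourceManager()
    inst = rm.open_resource("ASRL1::INSTR")    # placeholder port
    inst.baud_rate = 115200
    inst.read_termination = "\n"               # LF termination character
    inst.write_termination = "\r"
    inst.timeout = 1000                        # 1 s, in milliseconds

    inst.write("?")        # send the request
    reply = inst.read()    # blocks until LF arrives or the timeout expires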
Message 11 of 55

I've been using the Bytes at Port, based on NI's basic examples, in my application for years, and it has worked flawlessly. It may not be "needed" in most cases, but it doesn't do any harm either, unless I'm missing something?

Message 12 of 55
As I said, it can result in incomplete reads, and you'll get a warning that there may be additional bytes to read. Do you handle that, or simply ignore it? If you ignore it, your code is buggy. If you are using a wait, then you are probably waiting too long in some cases, and the delay is something that needs adjustment from time to time. In my experience, it does not work flawlessly.
Message 13 of 55

@Edjsch wrote:

Dennis - Can you point to some examples/tutorials.

 

You notice my vi does use the LF termination character. Wiring up the Bytes at Port to Byte Count is okay, because, from the LabVIEW 2009 Help (http://zone.ni.com/reference/en-XX/help/371361F-01/lvinstio/visa_read/): "The VISA Read function returns less than the requested byte count if the function reaches a termination character (mine is set to LF), reaches the end of the buffer, or a timeout occurs." (My timeout is set to 1 sec.) So this is the best of both worlds: if the serial string being read does not have a termination character, it will keep reading as long as data is being received, within the timeout period.

 

Ed


Actually, your VI does NOT use the termination character. It sets it, but it doesn't use it. Just eliminate the Bytes at Port and the wait. Then you can simply leave the port open until you're done with it.

 

Set up as Dennis suggested, you do automatically what you're trying to do manually - without polling the serial port and looping unnecessarily (and using up CPU cycles) - because you have now changed the structure from polling to waiting on an event. When the event is triggered - i.e., the VISA Read encounters the termination character - it outputs the data and the loop continues. The read will also complete if there is a timeout, which raises a timeout error, or if the buffer becomes full, in which case it returns a warning that the number of bytes read equals the byte count and there may be more data to read. (This is actually quite useful, because if you are using the termination character, a buffer-full warning almost always means there was a problem with communication - the exception being if you set the byte count to a number that was actually smaller than the largest chunk of data to be read.)
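
In text form, those outcomes look roughly like the following PyVISA sketch (not the actual VI under discussion; note that PyVISA's read() grows its buffer internally, so only the terminator and timeout cases map directly):

    import pyvisa

    rm = pyvisa.ResourceManager()
    inst = rm.open_resource("ASRL1::INSTR")    # placeholder port
    inst.read_termination = "\n"
    inst.timeout = 1000

    try:
        # Normal case: returns as soon as the LF terminator arrives ("the event").
        reply = inst.read()
    except pyvisa.errors.VisaIOError as e:
        # Timeout case: no terminator within 1 s - almost always a
        # communication problem when a terminator is in use.
        print("read failed:", e)
    # (The LabVIEW buffer-full warning has no direct PyVISA equivalent,
    # since read() resizes its buffer in chunks automatically.)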

 

You no longer need the wait because the loop will no longer be spinning out of control.

Bill
CLD
(Mid-Level minion.)
My support system ensures that I don't look totally incompetent.
Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.
Humble author of the CLAD Nugget.
Message 14 of 55

My mantra for serial communications is - with few exceptions - that if you need a hard-coded wait, you don't understand the protocol.

Bill
CLD
(Mid-Level minion.)
My support system ensures that I don't look totally incompetent.
Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.
Humble author of the CLAD Nugget.
Message 15 of 55

My system is for a "request / reply" protocol, not like HyperTerminal, which basically listens for anything being received, requested or not. (Although I have modes in my application where it just listens, but that's a special case.)

 

Actually, my VI does use the termination character. Because of the delay after writing the request string ("String to Write", usually "?\r") and the fast, known response time of the instruments I interface with, the buffer already has data in it by the time I start reading. For short replies such as reading a value (about 12 bytes) at high baud rates (usually 115200), the LF has already been received by the time the read occurs, and the read function returns immediately.

 

I do get your point that I don't strictly need Bytes at Port, but it was in the original examples from NI, and I saw no reason not to use it. Furthermore, some of my commands retrieve a large, unknown amount of data, so what constant should I wire to Byte Count? What I do is keep looping as long as I'm receiving data, as sketched below.
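
In rough Python/PyVISA terms, that pattern looks something like this; the command string and the 50 ms pacing are assumptions, and (per the earlier objection) the loop can stop early and split a response if the instrument pauses longer than the pacing:

    import time
    import pyvisa

    rm = pyvisa.ResourceManager()
    inst = rm.open_resource("ASRL1::INSTR")    # placeholder port

    inst.write("DATA?")                  # hypothetical large-response query
    time.sleep(0.05)                     # let the first bytes arrive
    chunks = []
    while inst.bytes_in_buffer > 0:      # the Bytes at Port analog
        chunks.append(inst.read_bytes(inst.bytes_in_buffer))
        time.sleep(0.05)                 # keep looping while data keeps coming
    data = b"".join(chunks)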

Message 16 of 55

The wait also serves as the required yield to the OS in the While loop, otherwise CPU usage skyrockets.

Message 17 of 55
A real-life story about hard-coded waits:

A couple of jobs ago, the UUT had a serial interface and had to repeatedly perform some tasks. The time it took to perform a task was quite variable, from several milliseconds to a second or so. A fixed delay had been put in place, which meant that a delay of 2 seconds was used for each and every test, and the delay had to be changed a few times with new versions of the product. So when the UUT was actually ready in 100 ms, the program sat idle far longer than it needed to. While some people might say it's no big deal to waste a few seconds with each test, we had 1000 test stations and needed to test about 5 million units a year. Those extra seconds made a big difference in equipment and manpower. You may not have such an extreme situation, but the code should have been written correctly in the first place, and it would have been if the original programmer had taken a smarter approach.
Message 18 of 55

You are correct. If the response time can be long and the amount of data that will be received is known, your way is better. Just remember to set the timeout in the VISA Configure Serial Port VI to a large enough value.

 

I did check in my VI: if I set the delay to 0 and wire a large number (1000) to the byte count, the VISA Read function does yield to the OS while it is waiting for the LF, so CPU usage is okay. The data is then retrieved from my serial device as fast as it can respond at the given baud rate. My application needs a user-defined sampling rate, which the delay provides, as in the sketch below.
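
A sketch of that arrangement (again PyVISA rather than LabVIEW; the 2 Hz sample period, loop count, and command are assumptions):

    import time
    import pyvisa

    rm = pyvisa.ResourceManager()
    inst = rm.open_resource("ASRL1::INSTR")    # placeholder port
    inst.read_termination = "\n"
    inst.timeout = 1000                  # large enough for the slowest reply

    sample_period = 0.5                  # user-defined sampling rate
    for _ in range(100):
        value = inst.query("?")          # write, then block until LF
        print(value)
        time.sleep(sample_period)        # paces the loop; the blocking
                                         # read already yields to the OS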

Message 19 of 55

@Edjsch wrote:

The wait also serves as the required yield to the OS in the While loop, otherwise CPU usage skyrockets.


As I mentioned before, using Dennis' method changes the structure from polling to event-driven. Your loops will no longer be spinning out of control. No need for the wait.

 

[edit] I guess I'm a little late, huh?  [/edit]

Bill
CLD
(Mid-Level minion.)
My support system ensures that I don't look totally incompetent.
Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.
Humble author of the CLAD Nugget.
Message 20 of 55