


Polling on VISA Read: timeout error


@irPaul wrote:

Thanks for the quick reply all!

 

@RavensFan

1. Yes, I just found this out too. I saw an earlier topic where someone solved this by not wiring the error at all. Without any error wires, the VI is interrupted upon timeout, but using error tunnels the VI continues to work fine.

In general, should I avoid putting an error in a shift register?

The answer is, as always: it depends. If you do, you need proper error handling in the loop to do something with the error and then clear it; otherwise you could just as well abort the loop on error! Another thing you need to understand is that an error from a function is NOT always a real error in the context of your desired functionality. A timeout error often means: yes, there was no data to return, but no, this is not a fatal error, please retry! In such cases the standard procedure is to clear the error, optionally log the fact that no data was received for debugging purposes, wait a little (it makes little sense to hammer the serial port with another read attempt right after nothing was received), and then loop to the next read attempt.
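That retry loop can be sketched in pseudocode terms like this. This is an illustrative Python sketch only, not LabVIEW: `fake_visa_read` and `poll_until_line` are made-up names standing in for VISA Read and the surrounding while loop, with a timeout modeled as an exception that gets cleared and retried.

```python
import time

# Stand-in for VISA Read (illustrative only): raises TimeoutError
# when no data arrived within the timeout window.
def fake_visa_read(responses):
    """Pop the next canned response; None simulates a timeout."""
    item = responses.pop(0)
    if item is None:
        raise TimeoutError("VISA timeout: no data")
    return item

def poll_until_line(responses, max_attempts=10, retry_delay=0.01):
    """Retry on timeout instead of aborting the loop."""
    for _ in range(max_attempts):
        try:
            return fake_visa_read(responses)
        except TimeoutError:
            # The timeout just means "no data yet": clear the error
            # (here: swallow the exception) and wait before retrying,
            # instead of hammering the port with an immediate re-read.
            time.sleep(retry_delay)
    raise RuntimeError("no data after {} attempts".format(max_attempts))

# Two timeouts followed by a real line: the loop rides them out.
line = poll_until_line([None, None, "MEAS:VOLT 1.23\n"])
```

The key point the sketch demonstrates: the timeout is consumed inside the loop as a retry signal, never propagated out as a fatal error.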

 

@crossrulz

Also a very good remark. I was thinking about that, but in many similar topics it was strongly advised against using Bytes at Port in combination with termination characters. The way you use it, though, it makes sense to prevent the VI from continuously polling and consuming unnecessary CPU resources.


I'm one of the loudest proponents of never, ever using Bytes at Serial Port. 😀

For the way crossrulz suggested I make an exception though. 😋

There are almost always other ways to achieve the same thing without Bytes at Serial Port, but as long as you do not use the number of bytes for anything other than testing for > 0, there isn't really much harm done.
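The "exception" pattern can be sketched as follows. This is an illustrative Python sketch, not LabVIEW: `FakePort`, `bytes_at_port`, and `read_until_term` are invented stand-ins for the serial port, the Bytes at Port property, and a termination-character VISA Read.

```python
class FakePort:
    """Minimal stand-in for a serial port with an input buffer."""
    def __init__(self, data=b""):
        self.buffer = data

    def bytes_at_port(self):
        # Analogous to the VISA "Bytes at Port" property.
        return len(self.buffer)

    def read_until_term(self, term=b"\n"):
        # Analogous to VISA Read with a termination character enabled:
        # returns one whole line, terminator included.
        idx = self.buffer.index(term)
        line, self.buffer = self.buffer[:idx + 1], self.buffer[idx + 1:]
        return line

port = FakePort(b"OK\nNEXT\n")
lines = []
while port.bytes_at_port() > 0:           # the ONLY use of the byte count
    lines.append(port.read_until_term())  # termination char frames the data
```

The byte count is used purely as a "> 0?" gate to decide whether to call the read at all; the termination character, not the byte count, decides where each message ends.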

 

@heel wrote:

 

Issue is:

 

When a VISA serial port is set up to use a termination character (for instance \n), you would expect to always get a full line (/packet) of data or a timeout. This is not the case, however. The problem is that the timeout timer in VISA Read.vi is not restarted when a character is received, but started when VISA Read.vi is called.

This is not that serious an issue, as the number of bytes to read and the timeout you select should of course be adjusted to the device in question when you write an instrument driver. That's why they can be changed in the functions; otherwise that option could just as well have been left out.

Rolf Kalbermatter
My Blog
Message 11 of 13

@rolfk wrote:

@heel wrote:

 

Issue is:

 

When a VISA serial port is set up to use a termination character (for instance \n), you would expect to always get a full line (/packet) of data or a timeout. This is not the case, however. The problem is that the timeout timer in VISA Read.vi is not restarted when a character is received, but started when VISA Read.vi is called.

This is not that serious an issue, as the number of bytes to read and the timeout you select should of course be adjusted to the device in question when you write an instrument driver. That's why they can be changed in the functions; otherwise that option could just as well have been left out.


This conclusion is incorrect for cases where a device sends lines at random intervals longer than the timeout you can afford while still keeping a responsive system (say 30 s). You would not want a timeout that long, so you need a shorter one. No matter what timeout you pick that is shorter than the longest gap between lines, there is a finite probability that the timeout will fire in the middle of receiving a line. If that happens, the remaining part of the line will not raise a VISA timeout, but the line as a whole will be corrupted. So unless you carefully validate accepted lines (don't rely on the absence of a VISA error!), the data will occasionally be incorrect.

Lost or corrupt lines have to be dealt with at higher levels of the protocol. The suggested method solves this problem at the source.

 

You can also solve the problem by polling for the number of bytes being > 0, as others show, _and_ ensuring that the timeout is longer than your longest line duration. This works because the timeout counter in VISA Read.vi is now reset at a moment when you know a line is coming in and will end before the timeout.
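The timing argument in the two posts above can be made concrete with a small model. This is an illustrative sketch with an invented helper, `mid_line_timeout`; "gated" means VISA Read is only called once Bytes at Port > 0.

```python
def mid_line_timeout(gap, line_dur, timeout, gated):
    """Does the read time out in the middle of the line?

    gap      - idle time before the line's first byte arrives
    line_dur - time from the line's first byte to its terminator
    timeout  - the VISA Read timeout
    gated    - True if VISA Read is only called once Bytes at Port > 0
    """
    if gated:
        # The timeout clock starts when the first byte is already in,
        # so only the line's own duration counts against it.
        return line_dur > timeout
    # Ungated: the clock starts at the call, so the line is cut off
    # exactly when the timeout fires after the line has started but
    # before it has ended.
    return gap < timeout < gap + line_dur

# A 0.9 s idle gap, a 0.2 s line, a 1 s timeout:
assert mid_line_timeout(0.9, 0.2, 1.0, gated=False)      # partial line!
assert not mid_line_timeout(0.9, 0.2, 1.0, gated=True)   # whole line read
```

In the ungated case any idle gap eats into the timeout budget, so some gap lengths always cut a line in half; in the gated case only a line longer than the timeout itself can be truncated, which is exactly the "timeout longer than your longest line duration" condition.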

Message 12 of 13

@heel wrote:

@rolfk wrote:

@heel wrote:

 

Issue is:

 

When a VISA serial port is set up to use a termination character (for instance \n), you would expect to always get a full line (/packet) of data or a timeout. This is not the case, however. The problem is that the timeout timer in VISA Read.vi is not restarted when a character is received, but started when VISA Read.vi is called.

This is not that serious an issue, as the number of bytes to read and the timeout you select should of course be adjusted to the device in question when you write an instrument driver. That's why they can be changed in the functions; otherwise that option could just as well have been left out.


This conclusion is incorrect for cases where a device sends lines at random intervals longer than the timeout you can afford while still keeping a responsive system (say 30 s). You would not want a timeout that long, so you need a shorter one. No matter what timeout you pick that is shorter than the longest gap between lines, there is a finite probability that the timeout will fire in the middle of receiving a line. If that happens, the remaining part of the line will not raise a VISA timeout, but the line as a whole will be corrupted. So unless you carefully validate accepted lines (don't rely on the absence of a VISA error!), the data will occasionally be incorrect.

Lost or corrupt lines have to be dealt with at higher levels of the protocol. The suggested method solves this problem at the source.

 

 


NO NO NO.  The conclusion is quite correct.

 

The timeout starts counting when you execute a VISA Read. If you monitor Bytes at Port and see that you've received at least 1 byte, then you execute the case with a VISA Read. I don't care if lines come milliseconds apart or 3 hours apart; you won't get a timeout.

 

The only way a timeout happens in the middle of a line is if the device, once it has started sending the message, takes longer to finish it than what you allowed for with the timeout.

 


@heel wrote:

 

You can also solve the problem by polling for the number of bytes being > 0, as others show, _and_ ensuring that the timeout is longer than your longest line duration. This works because the timeout counter in VISA Read.vi is now reset at a moment when you know a line is coming in and will end before the timeout.


EXACTLY.  Which is what message #4 shows and what I am telling you.

So now I have no idea what you are debating, because you are now agreeing with message #4, which is a far better way of dealing with erratic unsolicited messages from the device than your recent example, which does have the potential for incomplete messages and requires you to recognize that and wind up discarding data.

 

Do message #4 and you won't have incomplete messages to worry about. Of course you should have error handling in your VI, because you could still get an incomplete message if somebody yanks a cable out, but incomplete messages would then be very rare exceptions rather than the rule, the way you are coding it.
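That residual error handling, for the rare yanked-cable case, comes down to validating each received line instead of trusting the absence of a VISA error. A minimal sketch, with an invented helper name:

```python
def is_complete_line(data, term=b"\n"):
    """Trust a received line only if it actually ends in the
    termination character; a truncated read won't."""
    return data.endswith(term)

assert is_complete_line(b"MEAS:VOLT 1.23\n")   # complete line
assert not is_complete_line(b"MEAS:VO")        # truncated mid-line
```

A check like this catches the partial line that a cable pull (or an unexpectedly long message) leaves behind, so it can be discarded or logged rather than parsed as data.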

Message 13 of 13