NI TestStand


serial only runs once

Hi Gang,
 
I've got a nasty one.  I'm using TS 3.5 and LV 8.2.  I'm opening a serial port in setup, using it in one step, and closing it in cleanup.  The VIs run fine outside of TS, and they run fine the first time through the sequence file.  After that, the serial read VI hangs hard.
 
In examining the problem, I see that the serial close VI goes to the "SubVI waiting to run" state when the sequence starts, even before the serial open VI is called.  It stays in that state until I close the sequence file in the editor.
 
Has anybody else seen this?  Know of a workaround?  Am I doin' something stupid?  I would think using serial ports in TS would be a pretty common thing.  I'm at a loss about why this is happening.
 
Thanks in advance for any help.
 
Roger
Message 1 of 4
Hi,
 
Can you post a small example showing your problem?
 
Regards
Ray Farmer
Message 2 of 4

Hi,

 

Well, the problem is solved!  After only about five hours of troubleshooting... (uggh)

The issue with the VI staying in the "SubVI waiting to run" state was just a distraction.  In hindsight, I don't know why that happens, but I noticed it with the other VIs that manage VISA resources too.  It doesn't seem to affect the operation.

The problem is a bug with the LabVIEW Elapsed Time VI.  It works as expected under LV alone, but when run under TS, it only delays the first time it is called.  On subsequent runs through the sequence, it does not reset.  In my application, I was using it to wait from the time characters start arriving in the serial input buffer until the whole message has arrived.  My VI was erroring out due to a timeout that never actually happened.
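The failure mode described here (an elapsed-time check whose start time never resets between runs, so every run after the first reports an immediate timeout) can be illustrated outside of LabVIEW. This is a minimal Python sketch of the symptom, not LabVIEW code; the `ElapsedTimer` class and its method names are illustrative assumptions:

```python
import time

class ElapsedTimer:
    """Mimics an elapsed-time check. If reset() is not called at the
    start of each run, the stale start time makes expired() report
    True immediately on every run after the first -- the false-timeout
    symptom described above."""
    def __init__(self, timeout_s):
        self.timeout_s = timeout_s
        self.start = time.monotonic()

    def reset(self):
        self.start = time.monotonic()

    def expired(self):
        return (time.monotonic() - self.start) >= self.timeout_s

timer = ElapsedTimer(0.05)

# First run: behaves correctly -- not expired yet, expired after the wait.
assert not timer.expired()
time.sleep(0.06)
assert timer.expired()

# Second run without a reset: reports a "timeout" that never happened.
assert timer.expired()

# With an explicit reset, the second run behaves like the first.
timer.reset()
assert not timer.expired()
```

The workaround implied by this sketch is to force the timer back to a known state at the start of each sequence execution rather than relying on it resetting itself.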

Attention National!!

This is a bug that should be recorded.

 

Thanks,

Roger

 

Message 3 of 4

The Elapsed Time function doesn't delay at all. You would want to use the Delay function to actually wait for some amount of time. I'm also not sure why you would need a Delay function here in the first place. Your entire message has arrived when the byte count you specify is reached or when the termination character is detected. The other way is to check whether there are bytes in the buffer and keep reading until the count is 0.
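The read strategy described above (read until a termination character or byte count is reached, so no separate elapsed-time wait is needed) can be sketched in Python. The `FakePort` class below is a stand-in for a real serial port object and is an assumption for illustration only; a real port's `read` would block or time out instead of returning empty:

```python
class FakePort:
    """Stand-in for a serial port whose read(n) returns up to n bytes
    from an input buffer, the way bytes trickle in on real hardware."""
    def __init__(self, data):
        self.data = data

    def read(self, n):
        out, self.data = self.data[:n], self.data[n:]
        return out

def read_message(port, terminator=b"\n", max_bytes=256):
    """Accumulate bytes until the termination character arrives or
    max_bytes is reached -- no elapsed-time guesswork required."""
    buf = bytearray()
    while len(buf) < max_bytes:
        chunk = port.read(1)
        if not chunk:
            break  # buffer empty; a real port would block or time out here
        buf += chunk
        if buf.endswith(terminator):
            break
    return bytes(buf)

port = FakePort(b"*IDN?\nleftover")
msg = read_message(port)
# msg == b"*IDN?\n" -- the read stops at the terminator,
# leaving b"leftover" in the buffer for the next call.
```

In LabVIEW terms, this corresponds to enabling the VISA termination character (or wiring the expected byte count) so VISA Read itself returns when the message is complete.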

If, however, you believe there is a bug in the Elapsed Time function, you really need to post an example that demonstrates the behavior. NI can't fix what they can't see and reproduce.

Message 4 of 4