03-16-2009 10:28 AM
03-16-2009 10:37 AM
CarlosSp wrote:
Yes, the idea is to do something to stop the transmission when the VI application is stopped, but I don't know how. How can I make the port close when I press the stop button? And how do I discard the data that is already in the port? The best thing would be if the port didn't store data at all; is that possible? Where is the data stored?
I already told you how to get rid of the data: check how much garbage is there (number of bytes at port) and toss it (read it and throw it away).
The OS maintains a buffer for the data on that port.
I do not know if there is a way to tell the OS not to use that port after you close the VISA session.
Ben
03-16-2009 11:27 AM
Ben's suggestion will also work (although it leaves residue in the buffer), and it is easier to implement while not requiring an extra dependency at either end (PCs).
Kudos to you, Ben. (I can't give any from this location).
R
03-16-2009 03:54 PM
03-16-2009 06:38 PM - edited 03-16-2009 06:38 PM
Just check the number of bytes at the port upon initializing the port. If there is any data (not equal to zero), then read it.
Don't even bother wiring the read buffer. See image below:
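Since the image can't be shown inline here, the same logic written out as a rough Python/PyVISA sketch (the resource name, baud rate and variable names are placeholder assumptions; the real thing is a LabVIEW diagram):

    import pyvisa

    rm = pyvisa.ResourceManager()
    # Open/configure the serial port (placeholder resource name and baud rate).
    port = rm.open_resource("ASRL1::INSTR", baud_rate=9600)

    # Right after initialization: check how many bytes are already sitting
    # at the port and, if the count is not zero, read them and throw them
    # away -- the read buffer itself is never used.
    stale = port.bytes_in_buffer
    if stale != 0:
        port.read_bytes(stale)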
03-17-2009 03:28 AM
Thanks a lot Ray, I'll try to add that to the port configuration. By the way, in the other read (the good one) I have "10000000" wired to the byte count input; that number doesn't matter, does it?
03-17-2009 03:43 AM
03-17-2009 08:11 AM
You should not wire 1000000 as the number of bytes to read, because that will force the VISA Read to wait until there are that many bytes or until it times out, which is usually up to 10 sec depending on how you set your timeout value.
The portion of code that appears after the port configuration: you can reuse the same code and put it in the while loop that runs during the communication transactions. At the end (outside the while loop), read the port one more time just before closing it. By implementing this, the LabVIEW portion of your code should be more stable. You may then need to look at the VB portion of the code and do something similar. The idea is to have a relatively clean buffer at both ends, to avoid getting bizarre characters when restarting the application.
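As a rough text equivalent (a Python/PyVISA sketch standing in for the LabVIEW diagram; the resource name and the 2000 ms timeout are assumed values):

    import pyvisa

    rm = pyvisa.ResourceManager()
    port = rm.open_resource("ASRL1::INSTR", baud_rate=9600)

    # Set a sensible read timeout instead of forcing a huge byte count.
    port.timeout = 2000   # milliseconds (assumed value, tune as needed)

    # ... the while loop with the communication transactions runs here ...

    # Outside the loop: read the port one last time just before closing it,
    # so the receive buffer is left clean for the next run.
    leftover = port.bytes_in_buffer
    if leftover != 0:
        port.read_bytes(leftover)
    port.close()
    rm.close()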
R
03-18-2009 10:48 AM
Thanks Ray, then what do I have to wire to the "byte count" input of the Read command? The number of bytes changes depending on the data transmitted.
I found a function that is perfect for reading and tossing data, the "flush buffer"; I'll place it as you say, to purge the port when I want to stop or restart the communication.
By the way, how do I change the waiting time of the VISA Read?
03-18-2009 11:53 AM - edited 03-18-2009 11:55 AM
As I mentioned in my previous post, you need to put the following inside the while loop that takes care of the communication:
a property node for the number of bytes available at the port.
a case structure that contains:
a) value 1 (default): the VISA serial read, wired to the serial data output (leaving the case structure)
b) value 0: wire through the VISA reference and error cluster, and wire an empty string constant to the tunnel for the serial data output
What I described above is shown in the image I posted. The only difference between the image way above and what should be in your while loop is to remove the Configure Serial Port which appears at the left of the image. The property node "Bytes at Port" will give you the number of bytes to read; that is what replaces your byte count.
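Written out as a text sketch (Python/PyVISA again standing in for the LabVIEW diagram; the resource name and the stop condition are placeholder assumptions), one iteration of the loop amounts to:

    import pyvisa

    rm = pyvisa.ResourceManager()
    port = rm.open_resource("ASRL1::INSTR")   # opened/configured before the loop, as above

    stop_pressed = False   # stand-in for the front-panel stop button
    while not stop_pressed:
        # Property node: "Bytes at Port"
        waiting = port.bytes_in_buffer
        if waiting > 0:
            # Case 1 (default): read, with Bytes at Port replacing the fixed
            # byte count, and pass the data out of the case structure.
            serial_data = port.read_bytes(waiting)
        else:
            # Case 0: wire through the reference/error and output an empty string.
            serial_data = b""
        # ... use serial_data ...
        stop_pressed = True   # placeholder so this sketch terminates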
R