Instrument Control (GPIB, Serial, VISA, IVI)


How do you flush a device's buffer using LabVIEW GPIB functions?

I know the VISA Flush function is an alternative, but I started my program with the low-level GPIB functions, and switching now would be a huge task. If anyone knows how to do the buffer clear with the GPIB functions, I would greatly appreciate it if you could show me how it's done. Thanks.

Otman
0 Kudos
Message 1 of 7
(4,895 Views)
Hi Otman,
Are you referring to the device transmit or receive buffer? If you mean the transmit buffer, you could read from the device (and disregard the response), which would in essence clear the buffer. There is no way to flush the receive buffer. If this does not answer your question, please give me more information about what you want to do.
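Brooks's read-and-discard approach can be sketched in Python. This is purely illustrative: `MockGpibDevice` is a hypothetical stand-in for an instrument's transmit buffer, not the actual LabVIEW GPIB API or any NI driver call.

```python
# Illustrative sketch only: a mock device stands in for "GPIB Read".
# The idea is simply to read and discard until the transmit buffer is empty.

class MockGpibDevice:
    """Hypothetical stand-in for an instrument with a transmit buffer."""
    def __init__(self, pending):
        self._pending = list(pending)   # bytes waiting in the transmit buffer

    def read(self, count):
        """Return up to `count` queued bytes, like a GPIB read with a byte count."""
        chunk, self._pending = self._pending[:count], self._pending[count:]
        return bytes(chunk)

def flush_transmit_buffer(dev, chunk_size=512):
    """Read and discard until nothing is left; return how many bytes were discarded."""
    discarded = 0
    while True:
        data = dev.read(chunk_size)
        if not data:                    # empty read => buffer is drained
            return discarded
        discarded += len(data)

dev = MockGpibDevice(b"stale response\r\n" * 10)
flush_transmit_buffer(dev)              # discards the 160 stale bytes
```

With a real instrument you would loop on the GPIB Read function the same way, stopping when a read returns nothing (or times out).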

Have a great day!
Brooks W.
National Instruments
Message 2 of 7
Brooks,
Thanks for your reply.
Your answer does make sense, and I was indeed referring to a device transmit/receive buffer.
Now, does LabVIEW have a limit on how much data it can hold while a particular process is running? And if so, how do you make it clear its memory so that the indicator side doesn't get clogged up? My problem is that device buffer data goes into a LabVIEW indicator which, after some time, gets so clogged with data that it no longer shows meaningful data. Thanks.
Otman
Message 3 of 7
Hi Otman,
You need a property node to clear out the indicator:
1. Right-click the indicator and choose Create»Property Node.
2. Left-click the property node and set it to the "Value" property.
3. Right-click the property node and choose "Change All To Write."
4. Right-click the property node input and choose Create»Constant. An empty string constant will clear the indicator.

Hope this helps!
Brooks W.
National Instruments
Message 4 of 7
What if the indicator is numeric?
Message 5 of 7
Your comment about an indicator getting "clogged up with data" doesn't make any sense. The contents of a string or array indicator can get large and slow things down, but a numeric indicator cannot get clogged up. If the data stops making sense, then you are incorrectly reading the instrument and converting that data to a numeric. Given your comments about the device transmit buffer, I suspect you have occasionally set the byte count too low, leaving unread bytes that you then read the next time. As long as the instrument is fairly new, it will send a termination character (typically EOI) that terminates the read, so you can set the read count to some arbitrarily high number to ensure you've got the entire transmit buffer contents. It's also possible that you periodically hit an error condition where the instrument sends unexpected information. For example, if it normally sends a floating-point number as a result and then instead sends an error message string, you might not be interpreting it correctly.
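That last pitfall can be sketched in Python. The helper below is hypothetical (not part of any NI API); it just shows the defensive pattern of trying the numeric conversion and falling back to treating the reply as an error string instead of silently producing garbage.

```python
# Hedged sketch: an instrument that normally returns a float may
# occasionally return an error string. A blind numeric conversion would
# misinterpret it; checking the conversion keeps the two cases separate.

def parse_reading(raw):
    """Return (value, error_message); exactly one of them is None."""
    text = raw.strip()
    try:
        return float(text), None
    except ValueError:
        # Not a number: treat the reply as an instrument error message.
        return None, text

parse_reading("+1.2345E+00\r\n")    # -> (1.2345, None)
parse_reading("ERR 203: OVERLOAD")  # -> (None, 'ERR 203: OVERLOAD')
```

In LabVIEW the equivalent would be checking the string before (or while) wiring it into a string-to-number conversion, rather than assuming every read is numeric.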
Message 6 of 7
Good! That's why I posted it: to get feedback from more experienced LabVIEW programmers like yourself. That way I can get more comfortable with the language, which, needless to say, I am new to by force, thanks to some folks in my department pushing for standardization on LabVIEW. So bear with me while I get on board with this new language and try to digest 20 years of LabVIEW development in a few weeks (courtesy of the advent of technology, of course).
I wish I were working with a fairly new instrument, but I am not. My instrument is over 25 years old, and support from Agilent (formerly HP) for this device is practically non-existent.
Anyway, thanks for your comments. I'll probe this device more deeply with further testing of its behavior.
Message 7 of 7