Bytes at Port Property Node. When to use it and when not?

Solved!

I seem to have woken a few people up 😁

 

So I just tried to recreate the simple write and read example shown in

VIWeek 2020/Proper way to communicate over serial

Thank you RTSLVU.

 

I am finding that attempting to read the port when there are no bytes available produces error -1073807339, which then prevents further writes or reads.

It is necessary to stall the read until the bytes are available.

 

To get it to work I had to create a delay between write and read. The minimum delay that appears to remain stable is 230 ms, and running the VI to collect 1000 readings takes 232 seconds. I actually set up a timer to measure that time and it makes perfect sense: 230 ms × 1000 = 230 seconds, and it collected 1000 readings. This is great for getting readings at consistent intervals, but for my application there is an unacceptable lag between operating the DTI and the on-screen response.

My method that uses the dreaded 'bytes at port' does not rely on delays or timers. As soon as the data is all at the port it is read; as soon as it is read, the command is written again, and so on. It does need a 30 ms wait time for each iteration, but I get 1000 readings in 138 seconds.
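In text form, that polling pattern looks roughly like the following sketch, written in Python with PyVISA as a stand-in for the LabVIEW diagram (the VISA calls map one-to-one). The resource name 'ASRL3::INSTR', the '1' request command, and the 13-byte reply length are placeholder assumptions, not the actual Mitutoyo protocol:

import time
import pyvisa

rm = pyvisa.ResourceManager()
# 'ASRL3::INSTR' is a placeholder resource name; substitute your own port.
dti = rm.open_resource('ASRL3::INSTR', baud_rate=9600, write_termination='\r')

REPLY_LEN = 13                            # assumed fixed reply length

dti.write('1')                            # placeholder request command
while dti.bytes_in_buffer < REPLY_LEN:    # the 'Bytes at Port' check
    time.sleep(0.030)                     # the 30 ms per-iteration wait
reading = dti.read_bytes(REPLY_LEN)       # read once everything is there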

 

I don't know if I am still missing the point somewhere, but it appears to me that reading the data before it is available produces an error, and reading the data immediately after Bytes at Port says there is data available can produce truncated results, so it still requires a delay to ensure the whole data package is ready.
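Continuing the same hypothetical sketch, the truncation risk comes from the fact that Bytes at Port only reports what has arrived so far, so a read sized from it mid-message returns a partial reply. A defensive version has to accumulate until the (assumed) '\r' terminator shows up:

buf = b''
deadline = time.time() + 2.0              # give up after two seconds
while not buf.endswith(b'\r') and time.time() < deadline:
    n = dti.bytes_in_buffer               # may be only part of the reply
    if n:
        buf += dti.read_bytes(n)          # collect whatever has arrived
    else:
        time.sleep(0.005)                 # yield the CPU while waiting

This is exactly the kind of hand-rolled buffer handling that VISA Read would otherwise do internally.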

 

The equipment I am using: USB-serial adapter, Mitutoyo Mux box and Mitutoyo DTI.

 

I have struggled quite a bit to get a stable solution that gives a stream of results and never produces erroneous readings. My objective is to avoid lag so that the data appearing in LabVIEW feels live; anything over 200 ms is unacceptable. I have used this for reading 10 or more devices simultaneously with very little lag.

For collecting single readings the delay is not a problem, even 500 ms is OK, but I still tend to use the same method.

Message 21 of 46

It's easier to filter the timeout error after VISA Read. If that function times out, it simply means that there was no data yet (or not enough, and the data that was there simply stays in the VISA session buffer). No need to work with delays that are always either too short to work 100% of the time or way too long for 99.9% of the cases.

I did use Bytes at Serial Port too in a grey past, but soon found that I was always ending up with very elaborate VIs that did their own buffer handling and termination-character handling. And it generally required delays too. But if you start putting delays in communication functions for anything other than giving up CPU while waiting, you are doing it wrong. And that giving up of CPU is already done by VISA Read while it waits for the requested data to arrive, so if you are using VISA and you end up using delays, you are doing it wrong. Yes, I know slapping a delay between two functions seems like a quick and easy fix, but it will always bite you sooner or later, as the instrument may get slower with a new firmware, or the communication infrastructure changes, such as using a different interface type, PC or OS.
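In the same hypothetical PyVISA terms, Rolf's pattern is roughly this (the 500 ms timeout and '\r' terminator are arbitrary assumptions): catch the timeout status after the read and simply loop again, since any partial data stays queued in the session buffer:

from pyvisa import constants, errors

dti.timeout = 500                    # ms; a timeout here is not fatal
dti.read_termination = '\r'          # assumed termination character
try:
    reading = dti.read()             # blocks until terminator or timeout
except errors.VisaIOError as e:
    if e.error_code == constants.StatusCode.error_timeout:
        pass                         # no complete message yet; loop and retry
    else:
        raise                        # anything else is a real error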

Rolf Kalbermatter
My Blog
Message 22 of 46

Thank you Rolf, I am sure you're right because so many knowledgeable people agree, but I haven't made it work yet.

 

I have spent a couple of hours this morning trying. I know I can clear the errors, but the fact that it has timed out has introduced a lag.

I need to get back to my current project, but I will try this again when I get time to do so. I do want to get it right, but I also don't see why the way I have done it is so wrong. It works, it's quick, it's error-free, and it doesn't produce a race condition as far as I can see.

Message 23 of 46

@STTAndy wrote:

I don't know if I am still missing the point somewhere, but it appears to me that reading the data before it is available produces an error, and reading the data immediately after Bytes at Port says there is data available can produce truncated results, so it still requires a delay to ensure the whole data package is ready.


If you use Bytes at Port to state how many bytes to read, then you run into this issue. You need to know the protocol of the device in order to write a proper VI: knowing how many bytes you need to read for the message, whether it uses a termination character, set message lengths, or a length as part of the message (or implied by the command ID).
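In the same hypothetical PyVISA terms, each of those three protocol shapes has a direct read strategy (terminator and lengths are placeholders):

# 1. Termination character: let VISA end the read at the terminator.
dti.read_termination = '\r'                 # assumed terminator
reply = dti.query('1')                      # write request, read until '\r'

# 2. Fixed message length: read exactly that many bytes.
reply = dti.read_bytes(13)                  # placeholder length

# 3. Length carried in the message: read a header, then the remainder.
header = dti.read_bytes(2)                  # placeholder 2-byte length prefix
reply = dti.read_bytes(int.from_bytes(header, 'big'))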


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 24 of 46

@STTAndy wrote:

I am finding that attempting to read the port when there are no bytes available produces error -1073807339, which then prevents further writes or reads.

It is necessary to stall the read until the bytes are available.

 

That is a timeout error. It just means that the entire message was not at the port before the timeout expired. Just set your timeout to be longer and you will have your data as soon as it is available, without the timeout error. No need for Bytes at Port.

 

I, too, am a former user of Bytes at Port, but I have found that most of the places where I was using it were actually making my program less efficient.

Message 25 of 46

In my defense, all the devices I use give out fixed-length data for the readings, with a couple of exceptions. I often request other data from these devices, for example serial numbers that I use for calibration status, but the response time for those is not critical.

 

For the application I am working on I need a constant stream of readings from devices that only provide one reading per request, so the cycle of send command and read data needs to be slick.

I won't stop trying 😀

Message 26 of 46

I think we need to look at the communications protocol. What you're describing doesn't seem unusual, and doesn't seem to need Bytes at Port, at any rate.

 

Reading a serial port doesn't even have to be especially "slick" as your code will mostly be waiting around to read stuff.

Bill
CLD
(Mid-Level minion.)
My support system ensures that I don't look totally incompetent.
Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.
Humble author of the CLAD Nugget.
Message 27 of 46

Thank you to everyone that has responded. I will endeavour to find a way to ditch Bytes at Port, since it is clearly wrong to use it, but for now I need to move forward. It is working for me at the moment, I just won't show my shoddy code to anyone 🤐

 

See you all back here in another 4 years 🤣

Message 28 of 46

@STTAndy wrote:

For the application I am working on I need a constant stream of readings from devices that only provide one reading per request, so the cycle of send command and read data needs to be slick.


This is exactly the reason not to use Bytes at Port. If you have known-length responses, just read that number of bytes, using a timeout long enough that you know you will get the data in that time. It takes longer to read Bytes at Port and then read the data.
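As a sketch of that advice, again with a placeholder command and length: the long timeout is only a ceiling, because a fixed-count read returns the moment the last byte arrives:

dti.timeout = 2000                  # generous ceiling, not a delay
dti.write('1')                      # placeholder request command
reading = dti.read_bytes(13)        # returns the moment all 13 bytes arrive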

Message 29 of 46

@johntrich1971 wrote:

@STTAndy wrote:

For the application I am working on I need a constant stream of readings from devices that only provide one reading per request, so the cycle of send command and read data needs to be slick.


This is exactly the reason not to use Bytes at Port. If you have known-length responses, just read that number of bytes, using a timeout long enough that you know you will get the data in that time. It takes longer to read Bytes at Port and then read the data.
OK, I said I was moving on, but I couldn't leave it alone. I have got it working consistently without 'bytes at port'. Yippee!

I now have a simple loop that writes the command and then reads the data, with no truncation and no timeouts.

The while loop has a 190 ms wait before each write and read, and if I try to reduce that time I start to get occasional timeouts. It actually seems a little slower than my original method, which was not consistent but took between 120 ms and 180 ms per reading. It feels responsive enough, so I am happy.
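A rough text rendering of that final loop, in the same hypothetical PyVISA terms (the 190 ms pacing and the 1000-reading count come from the posts above; the command and terminator remain assumptions):

import time

dti.timeout = 1000                  # well above the worst-case response time
dti.read_termination = '\r'         # assumed terminator

readings = []
for _ in range(1000):
    time.sleep(0.190)               # the 190 ms pacing between cycles
    readings.append(dti.query('1')) # write the command, read one reading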

 

I have been converted 👍 Thanks guys for persevering, I have learnt stuff today while listening to ZZ Top. RIP Dusty Hill

Message 30 of 46