LabVIEW


Serial comm issues


@MogaRaghu wrote:

@Yamaeda

Thanks.

While we agree that Bytes at Port is bad, can I know why it is bad? Does it end up locking up CPU resources (which I don't think is the case, since the code stays responsive), or what harm does it actually do?

Bytes at Port does what it is supposed to do, but it is usually the wrong tool and creates extra work. For instance, I could have a 100-byte message but read Bytes at Port before the entire message has arrived and see only 90 bytes at the port. Now I have to go back and read the other 10 bytes. People often add delays to work around this (wait long enough for all of the bytes to arrive), but that is a poor use of resources. As GerdW pointed out, most instruments use a termination character, so when you read you set the number of bytes to read to a value higher than you expect. The Read will terminate when it reaches the termination character, no matter how many bytes have been read.
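
A rough text-only illustration of that failure mode (LabVIEW block diagrams don't paste into a post, so this is sketched in Python with PyVISA, which wraps the same VISA calls; the resource name is an assumption):

import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("ASRL1::INSTR")  # hypothetical serial port

# Anti-pattern: read however many bytes happen to be in the buffer.
# If the instrument is mid-transmission, this returns a partial
# message (say, 90 of 100 bytes) and the caller has to delay,
# re-poll, and stitch the pieces back together.
n = inst.bytes_in_buffer          # PyVISA's equivalent of Bytes at Port
partial = inst.read_bytes(n)      # may stop in the middle of a message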

Message 11 of 13

@MogaRaghu wrote:

While we agree that Bytes at Port is bad, can I know why it is bad? Does it end up locking up CPU resources (which I don't think is the case, since the code stays responsive), or what harm does it actually do?


The main lesson from my presentation (VIWeek 2020/Proper way to communicate over serial) is to read based on the protocol.  Remember, the ultimate goal is to properly read a complete message.  Bytes At Port does not do this for you.

 

Assuming an ASCII message format, you will (or at least should) have a termination character.  Set the termination character at startup and tell VISA Read to read more bytes than you ever expect in a message.  VISA then does everything for you to ensure you get a complete message.  The length of the message doesn't even matter, as long as it is shorter than the number of bytes you told it to read.
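
In text form, a minimal Python/PyVISA sketch of the same idea (port name, terminator, and timeout are assumptions, not from the original post):

import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource(
    "ASRL1::INSTR",          # hypothetical serial port
    read_termination="\n",   # set the termination character once, at startup
    timeout=5000,            # ms: fail loudly rather than hang forever
)

# PyVISA handles the byte counting itself; the read returns as soon as
# the termination character arrives, whatever the actual message length.
# (In LabVIEW you would wire a byte count larger than any expected
# message into VISA Read.)
reply = inst.read()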

 

Assuming a binary/hex/raw message format, well, it gets a little more complicated.  But Bytes At Port is still counterproductive.  I'll just recommend you go watch the whole presentation.
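
For example, with a hypothetical framing (just to show the shape of a protocol-driven binary read in Python/PyVISA; real instruments define their own headers):

import struct
import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("ASRL1::INSTR")  # hypothetical serial port

# Hypothetical framing: a 4-byte big-endian length field, then a payload.
# The protocol, not Bytes at Port, dictates how much to read each time.
header = inst.read_bytes(4)
(length,) = struct.unpack(">I", header)
payload = inst.read_bytes(length)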

 

So the ONE use for Bytes At Port is to verify that a message has at least started to come in.  But Bytes At Port should NEVER be used to tell the VISA Read how many bytes to read.
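
That one legitimate use, sketched in Python/PyVISA (resource name and terminator are assumptions):

import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("ASRL1::INSTR", read_termination="\n")  # hypothetical

# Peek at the buffer only to decide WHETHER to read, never HOW MUCH.
if inst.bytes_in_buffer > 0:   # a message has at least started to arrive
    reply = inst.read()        # still ended by the termination character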


Message 12 of 13

@crossrulz wrote:

@MogaRaghu wrote:

While we agree that Bytes at Port is bad, can I know why it is bad? Does it end up locking up CPU resources (which I don't think is the case, since the code stays responsive), or what harm does it actually do?


.......

.......

So the ONE use for Bytes At Port is to verify that a message has at least started to come in.  But Bytes At Port should NEVER be used to tell the VISA Read how many bytes to read.


I guess my understanding was the same. Use Bytes At Port just to know that bytes have started coming in, then immediately switch the case to expose the Read function, tell it to read a larger number of bytes than expected, and stop at the termination character. Once the Read is complete, the serial buffer is empty; Bytes At Port senses that and switches the case back to hide the Read function.

 

Either way, the serial Write is easy to handle but the Read is the problem, more so when I don't know when data will come in.

 

Sure, I am studying the examples that accompanied your presentation.

Raghunathan
LabVIEW to Automate Hydraulic Test rigs.
Message 13 of 13