
How do I stop serial "VISA Read" from giving me packets instead of available bytes?

Solved!

Dear Labvillians,

 

Highlight:

How do I stop serial "VISA read" from giving me packets instead of bytes?

 

 

Background:

I have a system that serially publishes 14-byte packets at a semi-regular interval.

At busy times, the producer of these packets queues the data, effectively producing super-packets in multiples of 14 bytes, sometimes as large as 8 packets (112 bytes).

 

My protocol handler has been designed to process bytes, packets, or super-packets.
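
As a rough illustration of that framing, here is a small Python sketch (not the actual LabVIEW handler; only the 14-byte frame size comes from the description above, the rest is made up for the example):

```python
PACKET_SIZE = 14  # fixed frame length described above

def split_packets(buffer, new_bytes):
    """Append newly received bytes, then yield complete 14-byte packets in arrival order."""
    buffer.extend(new_bytes)
    while len(buffer) >= PACKET_SIZE:
        yield bytes(buffer[:PACKET_SIZE])
        del buffer[:PACKET_SIZE]

# The same loop copes with single bytes, one packet, or a queued super-packet.
rx = bytearray()
for chunk in (b"\x00" * 5, b"\x00" * 9, b"\x01" * 28):  # 5 + 9 = one packet; 28 = two packets
    for packet in split_packets(rx, chunk):
        print(len(packet))  # prints 14, three times in total
```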

 

My application now has multiple devices and the order of message processing is critical to correct functionality.

 

My observation is that VISA Read waits until the end of a packet/super-packet before passing the data to the application code (see the plot below).

 

My expectation is that VISA Read should give me the bytes that are available, not get too smart for its own good and wait for a complete packet.
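
For comparison, here is roughly what that expectation looks like in PyVISA (a text stand-in for the LabVIEW VISA nodes, not the code in the attached snippets; "ASRL1::INSTR" is a placeholder resource name):

```python
import pyvisa

rm = pyvisa.ResourceManager()
port = rm.open_resource("ASRL1::INSTR")   # placeholder serial resource name

available = port.bytes_in_buffer          # equivalent of the "Bytes at Port" property
if available:
    data = port.read_bytes(available)     # read only what has already arrived
```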

 

Comms Code Init.png

 

Comms Code.png

 

Message Length.png

 

 

I have observed this on PXI, embedded PCs, cFP, and most recently cRIO.

 

I have experimented with the cRIO's Scan Interface rate, which helps reduce the packet backlog but doesn't get down to sub-packet byte reads.

 

I understand that one solution is to write FPGA code to handle the port and pass the bytes through an RT FIFO, and there are some great examples on this site.

Unfortunately, this doesn't help with non-FPGA devices.

 

I have also dabbled in event-based serial reads, but they are diabolical on VxWorks devices.

 

Any help is appreciated.

 

iTm - Senior Systems Engineer
uses: LABVIEW 2012 SP1 x86 on Windows 7 x64. cFP, cRIO, PXI-RT
Message 1 of 4
Solution
Accepted by topic author Timmar

Sometimes talking to yourself is helpful.

Comms Code Init2.png

 

Message Length 2.png

 

I hope this is a useful nugget for someone in the future.

 

 

 

iTm - Senior Systems Engineer
uses: LABVIEW 2012 SP1 x86 on Windows 7 x64. cFP, cRIO, PXI-RT
Message 2 of 4

So you did rubber duck debugging. I have done that a few times too and was able to solve the problem myself.

-----

The best solution is the one you find by yourself
Message 3 of 4

Yup,

 

I have also heard it called a Teddy Bear Code Review.

Rubber Duck Debugging sounds better in this environment.

 

 

To me, the solution was a little counter-intuitive and required some experimentation to find the correct mode.

 

Option 0 - "None": Waits until the end of the packet before publishing. This is the default mode if you disable the termination character.

Option 1 - "Last Bit": Waits only until the last bit of the byte before publishing.

Option 2 - "TermChar": The overall default; assumes 0x0A if you don't wire a termination character.
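
For anyone working through a text-based VISA binding instead of LabVIEW, the same property is exposed in PyVISA as end_input. A minimal sketch, with the resource name and the chosen mode as placeholders rather than the exact settings from the screenshots above:

```python
import pyvisa
from pyvisa import constants

rm = pyvisa.ResourceManager()
port = rm.open_resource("ASRL1::INSTR")  # placeholder serial resource name

# End mode for reads, mirroring the options listed above:
#   SerialTermination.none             -> Option 0, "None"
#   SerialTermination.last_bit         -> Option 1, "Last Bit"
#   SerialTermination.termination_char -> Option 2, "TermChar" (assumes 0x0A)
port.end_input = constants.SerialTermination.last_bit

# With the end mode changed, read only the bytes that have already arrived.
n = port.bytes_in_buffer
if n:
    data = port.read_bytes(n)
```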

 

 Kind regards,

 

Tim L.

iTm - Senior Systems Engineer
uses: LABVIEW 2012 SP1 x86 on Windows 7 x64. cFP, cRIO, PXI-RT
Message 4 of 4