LabVIEW


VISA and READ function - Problem linked to the bytes at port

Hello there,

 

I've tried to find a solution to my problem, and I've read many threads that seem to describe the same kind of issue, but so far no solution to mine.

 

I am new to LabVIEW, so I'll try to make it short and clear: I am trying to communicate with a pump, and I'd like to initialize my numeric controls with the values that are already set on the pump (for example, when I switch on the pump, the set flow rate may be 8 mL/min, and I'd like to see 8 in my LabVIEW control the first time I run the VI).

 

Pretty easy, eh? Well, it works half of the time. I don't fully understand why, but sometimes Bytes at Port is 28, and my read functions correctly read the values; other times Bytes at Port is... any number BUT 28, and then I get errors.

 

I have attached my VI; the relevant piece of work is under the Timeout event. I read somewhere on an NI page that doing this work in a Timeout event is a way to initialize a control.

 

Also, as I am new to LabVIEW, if you feel my code needs to be corrected, feel free to criticize it. I am happy to learn from you 🙂

 

Cheers, 

Flo

Message 1 of 114

Hi,

 

You should never wire Bytes at Port into your VISA Read, because there is a chance that only half of the bytes have arrived at the port when the Bytes at Port function executes. One thing that could fix this issue would be to wire 28 to the bytes to read input of the VISA Read, since that is the length that works, and remove the Bytes at Port function entirely.
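The race Matt describes is easy to see in a simulation. The sketch below is hypothetical Python pseudocode (not NI's API): a `FakePort` class stands in for the serial receive buffer, and a 28-byte pump reply arrives in two pieces, so querying "bytes at port" mid-transfer returns a partial count.

```python
# Hypothetical simulation of the "Bytes at Port" race: the device sends a
# 28-byte reply, but only part of it has arrived when we query the port.

class FakePort:
    """Minimal stand-in for a serial port receive buffer (illustration only)."""
    def __init__(self):
        self.buffer = b""

    def feed(self, data):          # bytes arriving from the device
        self.buffer += data

    def bytes_at_port(self):       # what the property node would report
        return len(self.buffer)

    def read(self, n):             # a read with an explicit byte count
        out, self.buffer = self.buffer[:n], self.buffer[n:]
        return out

port = FakePort()
reply = b"OK,00.00,10000,0000,psi,0,0/"   # a 28-byte reply from the pump

port.feed(reply[:13])              # only half the reply has arrived so far
n = port.bytes_at_port()           # 13, not 28 -- the race Matt describes
partial = port.read(n)             # reads a truncated frame

port.feed(reply[13:])              # the rest arrives a moment later,
                                   # and will corrupt the NEXT read
```

Requesting a fixed 28 bytes instead (and letting the read block until they all arrive) sidesteps the race, as long as the reply really is always 28 bytes.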

 

Questions:

 

1. How does your device output data? Do you know the protocol?

2. If you can control the output from the device, can you give it a termination character?



-Matt
Message 2 of 114

If you expect to read 28 bytes, then that is what you need to wire to the VISA Read.  Very rarely is the Bytes At Port the correct way to tell the VISA Read how many bytes to read.

 

But do tell us more about the protocol (the format of the strings being sent to you).  There are often simpler and more robust ways to get your data.


Message 3 of 114

Hi,

 

Thank you for your replies.

 

So, I just discovered the breakpoint feature. It appears that the reading changes every time I run the software. The first time is OK: the read gets the correct, expected response from the pump, but then the response shifts by a few characters:

 

1st time: "OK, 0523, 0012"

2nd time: "K,0523,0012,O"

3rd time: "0523, 0012, OK"

 

and so on..
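Those rotating responses are exactly what stale bytes left in the receive buffer look like. A hypothetical sketch (plain Python, invented data matching the examples above): if the pump's replies sit back to back in the buffer and a fixed-size read starts one byte late, every frame comes out rotated.

```python
# Replies queued back to back in the receive buffer, with no framing:
stream = b"OK,0523,0012," * 4
frame = 13                          # fixed-size reads of one message length

first = stream[0:frame]             # starts in sync: b"OK,0523,0012,"
# ...but if one stray byte is consumed (or left over) between runs,
# the read window drifts and every subsequent frame is rotated:
off_by_one = stream[1:1 + frame]    # b"K,0523,0012,O"
off_by_three = stream[3:3 + frame]  # b"0523,0012,OK,"
```

This matches the "K,0523,0012,O" and "0523, 0012, OK" readings reported above: the data is intact, but the read boundary no longer lines up with the message boundary.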

 

Is there a way to clear that? Let's say, once the command has been sent to the pump, can we clear what appears to be a memory buffer?

 

edit: the expected format of the response to the command "CS" (first read function) is

OK,00.00,10000,0000,psi,0,0,0/  ...so 30 bytes, I assume. But somehow the response is still shifted.

 

Update: some of these values may vary in length: 10000 may be 5000 or 300 (that's my pressure limit; I expected the format to be 5 digits... 10000 or 05000 or 00300, but apparently not). So the number of bytes can vary depending on the values the read functions return.
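Variable-length fields are fine once the reply is framed by its trailing "/", because the comma separators carry the structure. A hypothetical parser for the "CS" reply quoted above (field meanings guessed from the thread, so treat the names as assumptions):

```python
# Hypothetical parser for the "CS" reply, assuming it is framed by a
# trailing "/" and comma-separated inside; field names are guesses.

def parse_cs(reply: str):
    body = reply.rstrip("/")                     # drop the frame terminator
    status, flow, pressure, *rest = body.split(",")
    return status, float(flow), int(pressure), rest

# Works whether the pressure field is 5 digits or 4:
s1, flow1, p1, rest1 = parse_cs("OK,00.00,10000,0000,psi,0,0,0/")
s2, flow2, p2, rest2 = parse_cs("OK,00.00,9999,0000,psi,0,0,0/")
```

In LabVIEW the same idea is Spreadsheet String To Array (or repeated Match Pattern calls) with "," as the delimiter, after reading up to the "/".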

 

Cheers,

Flo

Message 4 of 114

Many (most?) devices sending VISA data use a termination character (usually <CR>, hex 0D, or <LF>, hex 0A) at the end of the string (you might not see them on your screen, as these are "non-printing" characters). When you set up VISA, you can configure it to read until it finds a termination character. You then do the following:

  1. Tell VISA to read, say, 1000 characters (knowing the "expected" string length is 20-40).
  2. VISA reads, say, 32 characters, stopping at the "Termination character".
  3. You ask VISA how many bytes it read.
  4. You process exactly that many bytes.
  5. Repeat until you are done.

The key to using "Bytes at Port" is to not use it until you know you have a complete line. This should keep your strings "in sync", with "OK" always appearing in the correct spot.
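The steps above can be sketched in Python pseudocode (hypothetical, not NI's API; `read_until` stands in for a terminator-enabled VISA Read, and `len()` on the result plays the role of "ask VISA how many bytes it read"):

```python
class ByteSource:
    """Byte source standing in for a serial port (illustration only)."""
    def __init__(self, data):
        self.data, self.pos = data, 0

    def read(self, n):
        chunk = self.data[self.pos:self.pos + n]
        self.pos += len(chunk)
        return chunk

def read_until(read, term, max_bytes=1000):
    """Read one byte at a time until `term` arrives or max_bytes is hit
    (a stand-in for a terminator-enabled VISA Read)."""
    out = bytearray()
    while len(out) < max_bytes:
        b = read(1)
        if not b:
            break                  # nothing left: a real port would time out
        out += b
        if out.endswith(term):
            break                  # complete message, stop here
    return bytes(out)

port = ByteSource(b"OK,0523,0012/OK,0523,0011/")
msg1 = read_until(port.read, b"/")   # first complete message
msg2 = read_until(port.read, b"/")   # the next one, still in sync
count = len(msg1)                    # "how many bytes did we read?"
```

Because each read stops exactly at the terminator, the next read always starts at a message boundary, which is what keeps the strings from rotating.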

 

Bob Schor

Message 5 of 114

I see your solution. It seems complex for my case, where I actually expect a precise number of bytes. My current command always receives 11 bytes, and so far it works all the time, so I've put 11 as the input of my read function.

 

I have narrowed the problem down to this: if I don't specify the EXACT number of bytes I am going to receive, I get an error. The annoying part is that if, say, my pressure is 10000 psi, I'll get 10000 as a response (5 bytes); if it's 9999, I'll get 9999 (4 bytes).

 

I am looking for a way to tell LabVIEW to be more flexible with this... so far nothing...

 

Your solution may help me, but I really have no idea how to ask VISA how many bytes it read before passing that information to the READ function...

 

 

Flo

Message 6 of 114

What I do, and it has worked for me every time so far, is to put the Bytes at Port property node inside the loop so it rechecks the value on every iteration. Then wire that value into your VISA Read VI and it will read only the number of bytes at the port.

Message 7 of 114

@NiAchilles wrote:

What I do, and it has worked for me every time so far, is to put the Bytes at Port property node inside the loop so it rechecks the value on every iteration. Then wire that value into your VISA Read VI and it will read only the number of bytes at the port.


Thanks for the idea. The problem is that I don't know the length of my response when I run my VI. Maybe I could check your VI to see if that could fit my setup?

 

Flo

Message 8 of 114

Bytes at Serial Port is, in 99.999999% of cases, totally the wrong function for determining how many bytes you should read. You have no way to know when the device has sent all its data to the computer, so when you query Bytes at Serial Port you may be in the middle of a message, or there might even be several messages in the buffer already.

 

If your device has variable-length messages, which it seems to have (your example of 10000 having 5 characters and 9999 having 4 indicates that, although in the same post you claim it always sends 11 bytes, and in an earlier post that it sends 28!), it SHOULD provide a termination character, as explained by Bob. If it doesn't, and that is VERY, VERY unlikely, you have every right to throw it in the trash and be VERY angry at its manufacturer!

 

Bob's explanation was maybe a little more complicated than strictly necessary, because all you have to do is configure the right termination character when initializing the port and then simply call VISA Read with a larger number of bytes to read than the biggest message you expect. VISA Read will ALWAYS terminate if any of these cases is true:

 

1) an error occurred in the underlying I/O passport driver

2) the number of requested bytes has been received

3) the configured termination character was read (if one was configured and "end on term char" is enabled)

4) the timeout (an inherent attribute of the VISA resource) has occurred
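These termination cases can be sketched as a simulation. The Python below is a hypothetical illustration (the function name and `reason` strings are invented, not NI's API); an I/O error, case 1, would surface as an exception instead of a return value.

```python
def visa_read_sim(read1, requested, term=None):
    """Return (data, reason); reason is 'count', 'term', or 'timeout'.
    A hypothetical model of how a VISA-style read can return."""
    out = bytearray()
    while len(out) < requested:
        b = read1()                       # one byte, or b"" on timeout
        if not b:
            return bytes(out), "timeout"  # case 4: port timeout
        out += b
        if term and out.endswith(term):
            return bytes(out), "term"     # case 3: termination character
    return bytes(out), "count"            # case 2: requested count reached

# Fake byte source: one complete message, then nothing (simulated timeout).
src = list(b"OK,0523,0012/")
read1 = lambda: bytes([src.pop(0)]) if src else b""

data, reason = visa_read_sim(read1, requested=100, term=b"/")    # stops at "/"
data2, reason2 = visa_read_sim(read1, requested=100, term=b"/")  # buffer empty
```

Note that asking for 100 bytes is harmless here: the terminator ends the read early with a complete message, which is exactly why over-requesting with a termination character configured is safe.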

 

Ditch that Bytes at Serial Port ASAP!!!

 

It is ALWAYS the wrong function to use, except for detecting whether any characters are available at all, or in special low-level serial protocols, which tend to involve an extra byte cache and complicated state machine logic to handle all the eventualities of protocol parsing.

Rolf Kalbermatter
Averna BV
Message 9 of 114

Hi Rolfk

Very neat reply!!

 

I am discovering the READ function as I post, so since the beginning of this thread my vision of the code may have changed a little, and so has the code. I unlinked the Bytes at Port wire, don't worry!!

 

Soooo, my termination character is a "/"; I just don't get how to split my string at that character. There may be a function; I'll check that and edit this post later 🙂
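In LabVIEW the usual tools for this are the termination-character settings on VISA Configure Serial Port (so VISA Read stops at the "/" for you) or the Match Pattern function to split the string afterwards. In Python-flavoured pseudocode the splitting idea is simply:

```python
# Invented buffer contents for illustration: two complete messages
# framed by "/" plus the start of a third.
raw = "OK,0523,0012/OK,0523,0011/partial"

# Everything up to the first "/" is one complete message; the rest stays
# buffered for the next pass (str.partition plays the role of Match Pattern).
msg, _, remainder = raw.partition("/")
```

With the terminator configured on the port itself, the split happens inside VISA Read and this post-processing step disappears.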

 

Problem: if I specify 12 bytes as the input of my READ function, but the response from the device contains only 11 bytes or fewer, the READ function will throw an error. Is that bad?

 

Flo

 

 

Message 10 of 114