
Serial Communication fails after a minute or so of running

Solved!

Hi all,

 

I'm having an issue with a simple VI that I'm starting to build for serial communication with an instrument. When I press "run continuously," the VI works for maybe 45 seconds to a minute and a half, then it stops getting responses from the instrument. When I cycle power on the instrument and restart the VI, everything is fine for another minute(ish). I attached some screenshots to better illustrate the issue; I can attach code if needed when I'm back at a computer. Did I mess something up in the VI? I based it on the simple serial example.

 

Thanks for any help!

Message 1 of 25

Do not use "Bytes at port"!  Set up your communications to use a termination character.  There are dozens of forum posts that go over this.

 

Also, do not use "Run Continuously"!  Just put a while loop in the VI and run it once, terminating the loop when appropriate.

 

I suggest you watch this video for more information:

 

https://labviewwiki.org/wiki/VIWeek_2020/Proper_way_to_communicate_over_serial
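For reference, here is roughly what that advice amounts to in text form. This is only a sketch in Python with PyVISA rather than LabVIEW; the resource name "ASRL3::INSTR", the "READ?" command, and the serial settings are placeholders for whatever your instrument actually needs.

import time
import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("ASRL3::INSTR")   # a serial port addressed as a VISA resource
inst.baud_rate = 9600
inst.write_termination = "\r"             # appended to every command automatically
inst.read_termination = "\n"              # reads return as soon as this character arrives
inst.timeout = 2000                       # ms; a reply that never terminates raises an error

try:
    # An ordinary loop in place of "Run Continuously": each pass sends one
    # request and reads one complete, terminated reply (no Bytes at Port).
    for _ in range(10):
        reply = inst.query("READ?")       # write + terminated read in one call
        print(reply.strip())
        time.sleep(0.1)                   # delay between polls
finally:
    inst.close()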

 

 

Message 2 of 25
Solution
Accepted by Nadweb

Here's an example image demonstrating some of those points:

[Image: cbutcher_0-1619055658871.png]

 

You can set the "Delay between measurements" appropriately to repeatedly poll the setpoint. By using the Termination Character property (configured via VISA Configure Serial Port), you don't need a manual wait between the Write and the Read, and you will always receive a "full" message.

 

Take care if your messages end in \r\n: you'll need to set the term char to the last character (\n, the default) and then trim the trailing \r character. LabVIEW has a VI that will do that for you: Trim Whitespace.vi.
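A quick sketch of the \r\n case in the same Python/PyVISA style (the "MEAS?" command and the reply value are made up): set the read termination to the final '\n' and trim the leftover '\r', which plays the role of Trim Whitespace.vi here.

import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("ASRL3::INSTR")   # placeholder resource name
inst.write_termination = "\r"
inst.read_termination = "\n"              # stop at the last character of the \r\n pair

raw = inst.query("MEAS?")                 # e.g. "12.34\r" (the \n was consumed as the term char)
value = raw.rstrip("\r")                  # drop the trailing carriage return
print(float(value))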

 


Message 3 of 25

Thank you both for the replies! That video was very informative; it seems like I was doing everything he said not to do 😫 by using the simple serial example. I think I'm on the right track now, and I'll know more when I'm back at my computer. I attached some screenshots from the instrument manuals. It seems like I can use \r or \n as terminators, if I'm understanding correctly. Two quick follow-up questions... because I always seem to have more.

 

1. How do you make sure you are reading more than enough bytes?

 

2. What was causing the instrument to stop responding?

 

Appreciate all the help.

Message 4 of 25

You were probably violating the requirement not to request data more than 20 times per second.

 

Although you had a 50 ms delay in there, that means you were attempting to request data pretty much right at 20 times per second.  With the other flaws of the program, like the Run Continuously and the Bytes at Port, I can see how sitting right at that borderline would eventually cause the device to choke on the serial communication.
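One way to stay safely under that limit is to pace each request from a timestamp taken after the previous exchange, instead of relying on a fixed wait. A rough Python sketch, assuming a PyVISA-style instrument object with a query method (the wrapper class and names are just for illustration):

import time

MIN_GAP = 0.050          # at least 50 ms between transmissions, i.e. no more than 20 per second

class PacedInstrument:
    """Wraps a PyVISA-style instrument so consecutive requests stay >= MIN_GAP apart."""
    def __init__(self, inst):
        self.inst = inst
        self.last_sent = 0.0

    def query(self, command):
        remaining = MIN_GAP - (time.monotonic() - self.last_sent)
        if remaining > 0:
            time.sleep(remaining)          # top up only what the loop hasn't already spent
        reply = self.inst.query(command)
        self.last_sent = time.monotonic()  # stamp after the exchange completes
        return reply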

Message 5 of 25

@Nadweb wrote:

1. How do you make sure you are reading more than enough bytes?


Basically, pick a bigger number if you're not sure 🙂 The cost in RAM is unlikely to be prohibitive except in specific circumstances, and if you're writing code for that sort of device, you probably know already!

I'd take 100, 1000, whatever... (although here you're expecting 5 or so bytes, I think, so 20-50 is probably more than enough).
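To make the byte-count point concrete: when a termination character is in use, an oversized count just sets an upper bound and costs you nothing. Here's a small pyserial analog (the port name, settings, and command are placeholders), since pyserial's read_until maps closely to wiring a byte count plus a term char into VISA Read:

import serial

ser = serial.Serial("COM3", 9600, timeout=2)   # placeholder port and settings
ser.write(b"READ?\r")
reply = ser.read_until(b"\n", 200)             # returns at '\n', after 200 bytes, or at the timeout
print(reply)                                   # a ~5 byte reply comes back immediately
ser.close()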

 

Regarding the manual and crashing out, it says you should not send anything for 50ms after a transmission. I guess that the message has some sending time, so the transmission of the last in iteration 'i' and the first in iteration 'i+1' could be closer than 50ms in some circumstances (depending on e.g. Windows scheduling).


Message 6 of 25

@cbutcher wrote:

@Nadweb wrote:

1. How do you make sure you are reading more than enough bytes?


Basically, pick a bigger number if you're not sure 🙂 The cost in RAM is unlikely to be prohibitive except in specific circumstances, and if you're writing code for that sort of device, you probably know already!

I'd take 100, 1000, whatever... (Although here you're expecting 5 or so I think, so 20-50 is probably more than enough).

 

Regarding the manual and crashing out, it says you should not send anything for 50ms after a transmission. I guess that the message has some sending time, so the transmission of the last in iteration 'i' and the first in iteration 'i+1' could be closer than 50ms in some circumstances (depending on e.g. Windows scheduling).


I did hear something somewhere a long time ago that said that this isn't actually a buffer allocation; it just tells the read how many bytes to be expecting.

Bill
CLD
(Mid-Level minion.)
My support system ensures that I don't look totally incompetent.
Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.
Humble author of the CLAD Nugget.
Message 7 of 25

@billko wrote:

@cbutcher wrote:

@Nadweb wrote:

1. How do you make sure you are reading more than enough bytes?


Basically, pick a bigger number if you're not sure 🙂 The cost in RAM is unlikely to be prohibitive except in specific circumstances, and if you're writing code for that sort of device, you probably know already!

I'd take 100, 1000, whatever...


I did hear something somewhere a long time ago that said that this isn't actually a buffer allocation; it just tells the read how many bytes to be expecting.


Does the allocation happen bit by bit then? Or more accurately (since I guess "bit" is ambiguous in the context of memory allocations - oops) in small chunks repeatedly being allocated and concatenated/moved? 


Message 8 of 25

@cbutcher wrote:

@billko wrote:

@cbutcher wrote:

@Nadweb wrote:

1. How do you make sure you are reading more than enough bytes?


Basically, pick a bigger number if you're not sure 🙂 The cost in RAM is unlikely to be prohibitive except in specific circumstances, and if you're writing code for that sort of device, you probably know already!

I'd take 100, 1000, whatever...


I did hear something somewhere a long time ago that said that this isn't actually a buffer allocation; it just tells the read how many bytes to be expecting.


Does the allocation happen bit by bit then? Or more accurately (since I guess "bit" is ambiguous in the context of memory allocations - oops) in small chunks repeatedly being allocated and concatenated/moved? 


I'm guessing it just keeps track of how many bytes to expect so it knows when to time out.
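For what it's worth, that is how a plain byte-count read behaves in pyserial; this is only an illustration of the idea, not a statement about LabVIEW's or VISA's internals (port and command are placeholders again):

import serial

ser = serial.Serial("COM3", 9600, timeout=2)   # 2-second read timeout
ser.write(b"READ?\r")
data = ser.read(50)        # waits for up to 50 bytes, or returns whatever arrived once the timeout elapses
print(len(data), data)     # a short reply simply yields fewer than 50 bytes, after the full timeout
ser.close()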

Bill

Message 9 of 25

There's a low-level function called "VISA Set I/O Buffer Size" where the default (if unwired) size seems to be 4096.  Maybe that's the standard one-time initial allocation when you don't explicitly request otherwise?  (Edit: nope.  The help says that if you don't set it explicitly, the size will depend on other unspecified aspects of the VISA and OS configuration.  But it seems to be a one-time thing during init or open.)

 

As far as I can tell, there doesn't appear to be a corresponding property node to query or set it.

 

 

-Kevin P

Message 10 of 25