04-21-2021 04:28 PM
Hi all,
I'm having an issue with a simple VI that I'm starting to build for serial communication with an instrument. When I press "run continuously" the VI works for maybe 45 seconds to a minute and a half. Then it stops getting responses from the instrument. When I cycle power on the instrument and restart the VI, everything is fine for another minute(ish). I attached some screenshots to better illustrate the issue. I can attach code if needed when I'm back at a computer. Did I mess something up in the VI? I based it around the simple serial example.
Thanks for any help!
04-21-2021 04:51 PM - edited 04-21-2021 04:52 PM
Do not use "Bytes at port"! Set up your communications to use a termination character. There are dozens of forum posts that go over this.
Also, do not use "run continuously"! Instead, put your code in a while loop, run the VI once, and terminate the loop when appropriate.
I suggest you watch this video for more information:
https://labviewwiki.org/wiki/VIWeek_2020/Proper_way_to_communicate_over_serial
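To illustrate the "use a termination character" advice in text form, here's a plain-Python sketch (not LabVIEW code, just a stand-in for the behavior of a VISA Read with the term char enabled). The simulated replies and the `read_until_term` helper are illustrative assumptions:

```python
# Sketch of why termination-character framing beats "Bytes at Port":
# we simulate an instrument whose replies end in '\n'. A read that
# stops at the terminator always returns one complete message, even
# when several replies are sitting in the port buffer at once.
import io

def read_until_term(stream, term=b"\n", max_bytes=4096):
    """Read one terminator-delimited message, roughly like a VISA Read
    with the termination character enabled."""
    buf = bytearray()
    while len(buf) < max_bytes:
        b = stream.read(1)
        if not b:              # no more data (timeout in real VISA)
            break
        buf += b
        if buf.endswith(term):
            break
    return bytes(buf)

# Two replies arrive back to back, as they might sit in the port buffer:
port = io.BytesIO(b"23.5\n24.1\n")
first = read_until_term(port)    # b"23.5\n" - one full message
second = read_until_term(port)   # b"24.1\n" - the next full message
```

Contrast this with "Bytes at Port", which would happily hand you half a message (or one and a half) depending on when you happened to check.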
04-21-2021 08:43 PM
Here's an example image demonstrating some of those points:
You can set the "Delay between measurements" appropriately to repeatedly poll the setpoint. By using the Termination Character property (configured via VISA Configure Serial Port), you don't need to manually wait between the Write and the Read; the Read will always return a "full" message.
Take care: if your messages end in \r\n, you'll need to set the term char to the last character (\n, the default) and then trim the trailing \r character. LabVIEW has a VI that will do that for you: Trim Whitespace.vi.
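The \r\n case above can be shown with a tiny Python sketch; the reply string is a made-up example, and `rstrip` plays the role of the right-side trim that Trim Whitespace.vi performs:

```python
# If the instrument terminates with "\r\n" but the VISA term char is
# set to "\n", each read returns a message carrying a stray "\r".
# Stripping trailing CR/LF characters cleans it up, the same way a
# right-side whitespace trim would in LabVIEW.
def trim_message(msg: str) -> str:
    """Remove any trailing carriage-return / line-feed characters."""
    return msg.rstrip("\r\n")

reply = "SP 25.0\r\n"          # hypothetical setpoint reply
clean = trim_message(reply)    # "SP 25.0"
```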
04-21-2021 09:46 PM
Thank you both for the replies! That video was very informative, seems like I was doing everything he said not to do 😫 by using the simple serial example. I think I'm on the right track now, I'll know more when I'm back at my computer. I attached some screenshots from the instrument manuals. Seems like I can use \r or \n as terminators if I'm understanding correctly. Two quick follow up questions...because I always seem to have more.
1. How do you make sure you are reading more than enough bytes?
2. What was causing the instrument to stop responding?
Appreciate all the help.
04-22-2021 07:34 AM
You were probably violating the requirement not to request data more than 20 times per second.
Although you had a 50 msec delay in there, that means you were attempting to request data pretty much right at 20 times per second. With the other flaws of the program like the Run Continuously and the Bytes at Port, I can see how being right at that borderline would eventually cause the device to choke on the serial communication.
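The arithmetic behind that borderline is simple enough to spell out; the numbers below just restate the 50 ms delay and the 20 requests/second limit from the thread, with a doubled delay as one illustrative way to buy margin:

```python
# A 50 ms wait alone targets exactly 20 requests/second, which is the
# instrument's stated limit, so there is zero margin: any loop overhead
# or OS scheduling jitter can push individual gaps under 50 ms.
delay_ms = 50
limit_per_s = 20

nominal_rate = 1000 / delay_ms       # 20.0 req/s: sitting right on the limit

# Doubling the delay is one simple way to stay comfortably inside it:
safe_delay_ms = 100
safe_rate = 1000 / safe_delay_ms     # 10.0 req/s, well under the limit
```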
04-22-2021 09:09 AM
@Nadweb wrote:
1. How do you make sure you are reading more than enough bytes?
Basically, pick a bigger number if you're not sure 🙂 the cost in RAM is unlikely to be prohibitive except in specific circumstances, and if you're writing code for that sort of device you probably know already!
I'd take 100, 1000, whatever... (Although here you're expecting 5 or so I think, so 20-50 is probably more than enough).
Regarding the manual and the instrument locking up: it says you should not send anything for 50 ms after a transmission. Since the message itself takes some time to send, the last transmission in iteration 'i' and the first in iteration 'i+1' could end up closer together than 50 ms in some circumstances (depending on e.g. Windows scheduling).
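One way to make "pick a bigger number" concrete is to size the read count from the protocol's worst-case reply; the field widths below are assumptions for illustration, not taken from the instrument manual:

```python
# Pick a safe byte count by computing the worst-case reply length and
# adding generous slack. With a termination character enabled, the read
# stops at the terminator anyway, so the count is only an upper bound.
value_digits = 8      # e.g. "-123.456" (assumed worst-case field width)
term_len = 2          # "\r\n" terminator

worst_case = value_digits + term_len   # 10 bytes
read_count = 4 * worst_case            # 40 bytes: generous, costs almost nothing
```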
04-22-2021 10:20 AM
@cbutcher wrote:
@Nadweb wrote:
1. How do you make sure you are reading more than enough bytes?
Basically, pick a bigger number if you're not sure 🙂 the cost in RAM is unlikely to be prohibitive except in specific circumstances, and if you're writing code for that sort of device you probably know already!
I'd take 100, 1000, whatever... (Although here you're expecting 5 or so I think, so 20-50 is probably more than enough).
Regarding the manual and the instrument locking up: it says you should not send anything for 50 ms after a transmission. Since the message itself takes some time to send, the last transmission in iteration 'i' and the first in iteration 'i+1' could end up closer together than 50 ms in some circumstances (depending on e.g. Windows scheduling).
I did hear something somewhere a long time ago that said that this isn't actually a buffer allocation; it just tells the read how many bytes to be expecting.
04-22-2021 10:23 AM
@billko wrote:
@cbutcher wrote:
@Nadweb wrote:
1. How do you make sure you are reading more than enough bytes?
Basically, pick a bigger number if you're not sure 🙂 the cost in RAM is unlikely to be prohibitive except in specific circumstances, and if you're writing code for that sort of device you probably know already!
I'd take 100, 1000, whatever...
I did hear something somewhere a long time ago that said that this isn't actually a buffer allocation; it just tells the read how many bytes to be expecting.
Does the allocation happen bit by bit then? Or more accurately (since I guess "bit" is ambiguous in the context of memory allocations - oops) in small chunks repeatedly being allocated and concatenated/moved?
04-22-2021 11:35 AM
@cbutcher wrote:
@billko wrote:
@cbutcher wrote:
@Nadweb wrote:
1. How do you make sure you are reading more than enough bytes?
Basically, pick a bigger number if you're not sure 🙂 the cost in RAM is unlikely to be prohibitive except in specific circumstances, and if you're writing code for that sort of device you probably know already!
I'd take 100, 1000, whatever...
I did hear something somewhere a long time ago that said that this isn't actually a buffer allocation; it just tells the read how many bytes to be expecting.
Does the allocation happen bit by bit then? Or more accurately (since I guess "bit" is ambiguous in the context of memory allocations - oops) in small chunks repeatedly being allocated and concatenated/moved?
I'm guessing it just keeps track of how many bytes to expect so it knows when to timeout.
04-22-2021 11:48 AM
There's a low level function called "VISA Set I/O Buffer Size" where the default (if unwired) size seems to be 4096. Maybe that's the standard one-time initial allocation when you don't explicitly request otherwise? (Edit: nope. The help says that if you don't set it explicitly, the size will depend on other unspecified aspects of VISA and OS config. But it seems to be a one-time thing during init or open).
As far as I could tell, there doesn't appear to be a corresponding property node to query or set it.
-Kevin P