Serial Communication fails after a minute or so of running

Solved!

I appreciate all the help, everyone! Things seem to be working better now after making those changes and ending on the proper termination character. Now I'll have to implement it as a subVI in my state machine and see how everything works.

Message 11 of 25

@cbutcher wrote:

@billko wrote:

@cbutcher wrote:

@Nadweb wrote:

1. How do you make sure you are reading more than enough bytes?


Basically, pick a bigger number if you're not sure 🙂 The cost in RAM is unlikely to be prohibitive except in specific circumstances, and if you're writing code for that sort of device, you probably know already!

I'd take 100, 1000, whatever...


I did hear something somewhere a long time ago that said that this isn't actually a buffer allocation; it just tells the read how many bytes to be expecting.


Does the allocation happen bit by bit then? Or more accurately (since I guess "bit" is ambiguous in the context of memory allocations - oops) in small chunks repeatedly being allocated and concatenated/moved? 


The allocation is a lot smarter than that. The node starts internally with a default buffer of a certain size, such as 1000 or a few thousand bytes, or the specified number of bytes if that is lower.

The read function then loops, checking a number of things in each iteration. First it checks if the underlying driver reported an error; if so, it aborts and returns with that error.

Then it checks how much data is available from the driver and fills the buffer up to the amount requested. If the buffer is too small for the new data, its size is typically doubled. If the requested amount has been retrieved, it returns successfully, trimming the buffer to the data actually retrieved.

If not, the timeout is checked, and if it has elapsed, the function returns with a timeout error. Otherwise it schedules a wait for the remainder of the timeout period; this wait can be interrupted by driver events when an error occurs or data arrives. Then it loops back.
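
For anyone who wants that logic in text form, here is a rough Python sketch of the loop described above (an illustration only, not NI's actual implementation; the driver_* callables are hypothetical stand-ins for the underlying serial driver):

import time

def visa_read_sketch(requested, timeout_s, driver_error, driver_available, driver_take):
    # Default starting buffer: a few KB, or the requested count if that is lower.
    buf = bytearray(min(requested, 4096))
    filled = 0
    deadline = time.monotonic() + timeout_s
    while True:
        if driver_error():                      # 1) abort on a driver-reported error
            return 'driver error', bytes(buf[:filled])
        chunk = driver_take(min(driver_available(), requested - filled))
        while filled + len(chunk) > len(buf):   # 2) buffer too small? double its size
            buf.extend(bytes(len(buf)))
        buf[filled:filled + len(chunk)] = chunk
        filled += len(chunk)
        if filled >= requested:                 # 3) got everything: trim and return
            return None, bytes(buf[:filled])
        if time.monotonic() >= deadline:        # 4) timeout elapsed
            return 'timeout', bytes(buf[:filled])
        time.sleep(0.001)                       # 5) brief wait, then loop back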

Rolf Kalbermatter
My Blog
Message 12 of 25

I may have spoken a bit too soon. Although the changes seem to have made the connection more stable (the instrument doesn't drop communication anymore), I'm getting no data every other read. My code looks exactly like the example.

Message 13 of 25

Right click on the Response string indicator on the front panel and enable '\' Codes Display.

 

Then tell us specifically what you see in this indicator when you read the 300 value and what you see in the 0 value case. Your instrument most likely sends more than just your "300\r" measurement result. Most likely something like OK\r300\r or similar! So you would have to use two VISA Reads, one after the other, to read both terminated string elements.
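
If you want to run the same check outside LabVIEW, here is a small sketch using Python and PyVISA (the resource name and the LF termination are assumptions; adjust them to your instrument):

import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource('ASRL1::INSTR')  # placeholder serial resource name
inst.read_termination = '\n'             # stop each read at the terminator
inst.write_termination = '\n'
inst.write('KRDG? A')
print(repr(inst.read()))  # repr() exposes hidden characters, e.g. '300.3012\r'
print(repr(inst.read()))  # a second read reveals any extra terminated element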

 

Since you have forgotten to tell us what instrument it is, we can't go and check ourselves and try to find the instrument manual. 😀

Rolf Kalbermatter
My Blog
Message 14 of 25

So when I get the correct reading, I see 300.3012\n; when I get just a zero, I see only \n. Attached are a few photos of the VI and settings on the Lakeshore 340.

Message 15 of 25

It seems like every time I get a "0" or \n response, that command might still be stored in the buffer, because when I switch commands using the same VI, I'll get responses from the old command back for a while until it eventually switches to responding to the new command. For example, I changed to *IDN?\n and was still getting temperature responses for a while after making the switch. I'm baffled why this isn't working properly.

Message 16 of 25

That’s because your device seems to send back 300.123\n\n for some reason. It’s not clear why, and the manual says nothing about this. Check the instrument settings for what termination character you have configured under the RS-232 and/or Communication settings. This should be set to a single \n.

 

If that is what is set, you will have to call the VISA Read twice in every loop, as I mentioned in an earlier post! One Read returns the response string and one \n; the second Read removes the superfluous \n from the read queue.
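
In text form, the pattern looks roughly like this with Python and PyVISA (a sketch with a placeholder resource name; the thread itself uses LabVIEW VISA nodes):

import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource('ASRL1::INSTR')  # placeholder serial resource name
inst.read_termination = '\n'
inst.write_termination = '\n'

inst.write('KRDG? A')
reading = inst.read()  # first terminated element, e.g. '300.123'
inst.read()            # drains the superfluous bare \n (returns an empty string)
print(reading)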

 

You might want to configure the termination explicitly as the first step after opening the instrument, to make sure your device uses the method your driver was programmed for.
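
For example, a PyVISA-style sketch of that first step (placeholder resource name; in LabVIEW you would set the equivalent termchar and serial properties on the VISA session, and any instrument-side terminator command would come from the 340 manual):

import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource('ASRL1::INSTR',    # placeholder serial resource name
                        baud_rate=9600,    # match the instrument's serial settings
                        read_termination='\n',
                        write_termination='\n')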

Rolf Kalbermatter
My Blog
Message 17 of 25

Thank you for your help, Rolfk!

 

I will have to try adding an extra read function and see what I can figure out. The problem I can see with this is that if the instrument responds properly (no \n\n) for a few iterations, which does happen occasionally, it could throw off which read function is reading the data and which is reading the \n. I'll have to figure out a way to compare both read outputs to decide which one is correct and display only that output.
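
One possible way around that, sketched as a hypothetical Python/PyVISA helper (not code from this thread): keep reading terminated chunks until one is non-empty, so a stray bare \n gets swallowed no matter which read it lands on.

def read_value(inst):
    # inst: an open PyVISA session with read_termination = '\n'.
    # Skips empty chunks (stray bare terminators); if nothing non-empty
    # ever arrives, inst.read() raises a VISA timeout error, ending the loop.
    while True:
        chunk = inst.read()
        if chunk.strip():
            return chunk.strip()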

 

I double checked the 340 settings and made sure that its terminator in the serial settings is set to LF. If the instrument wants to send two terminators, I could try CR LF or LF CR as a setting option.

Message 18 of 25

@Nadweb wrote:

I double checked the 340 settings and made sure that its terminator in the serial settings is set to LF. If the instrument wants to send two terminators, I could try CR LF or LF CR as a setting option.


You can only have one termination character. If two options exist, and the device sometimes sends one, and sometimes sends the other, then you have to handle this yourself (and ideally, complain loudly and repeatedly to the manufacturer...).

 

If, on the other hand, as seems to be the case for you, it might send two characters at the end of a message (e.g. \r\n), terminate on the \n (the last character) and manually handle only the trailing \r (which is much simpler than avoiding termination characters entirely, as in the first case).
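
In Python/PyVISA terms that approach would look something like this (a sketch with a placeholder resource name):

import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource('ASRL1::INSTR')  # placeholder serial resource name
inst.read_termination = '\n'             # terminate on the last character...
inst.write_termination = '\n'
inst.write('KRDG? A')
reply = inst.read().rstrip('\r')         # ...then drop the leftover \r by hand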

 

If you're getting extra, arbitrary-seeming \n characters, try to determine whether the messaging protocol includes any details about why this might happen (maybe the instrument sends some message on a schedule that's getting in your way, and for some reason it's empty, etc.?). I'm taking a look at the manual now.


GCentral
Message 19 of 25
Solution
Accepted by topic author Nadweb

In "Capture01.png", the message you're sending shows "KRDG? A\n" (presumably in \ view, although we can't see that) but there's a little diagonal arrow at the bottom right.

This arrow indicates there are parts of the string not visible.

 

I suspect you have some extra bits below that might have been accidentally left there from previous calls? (Or that you're not in \ view, and you have end of line characters after the "\n" string, which in \ view becomes "\\n" (i.e. backslash, then literal n)).
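
The difference is easy to demonstrate outside LabVIEW too; a quick Python illustration of why "\n" and "\\n" are not the same string:

print(len('\n'), repr('\n'))    # -> 1 '\n'   (a single newline character)
print(len('\\n'), repr('\\n'))  # -> 2 '\\n'  (a backslash, then the letter n)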


GCentral
Message 20 of 25