From Friday, April 19th (11:00 PM CDT) through Saturday, April 20th (2:00 PM CDT), 2024, ni.com will undergo system upgrades that may result in temporary service interruption.
We appreciate your patience as we improve our online experience.
04-22-2021 11:13 PM
I appreciate all the help, everyone! Things seem to be working better now after I made those changes and ended on the proper termination character. Now I'll have to implement it as a subVI in my state machine and see how everything works.
04-23-2021 06:02 AM - edited 04-23-2021 06:07 AM
@cbutcher wrote:
@billko wrote:
@cbutcher wrote:
@Nadweb wrote:
1. How do make sure you are reading more than enough bytes?
Basically, pick a bigger number if you're not sure 🙂 The cost in RAM is unlikely to be prohibitive except in specific circumstances, and if you're writing code for that sort of device, you probably know already!
I'd take 100, 1000, whatever...
I heard somewhere, a long time ago, that this isn't actually a buffer allocation; it just tells the read how many bytes to expect.
Does the allocation happen bit by bit then? Or more accurately (since I guess "bit" is ambiguous in the context of memory allocations - oops) in small chunks repeatedly being allocated and concatenated/moved?
The allocation is a lot smarter than that. The node internally starts with a default buffer of a certain size, such as 1000 or a few thousand bytes, or the specified number of bytes if that is lower.
The read function then loops and checks a number of things in each iteration. First it checks whether the underlying driver reported an error; if so, it aborts and returns with that error.
Then it checks how much data is available from the driver and fills the buffer up to the amount requested. If the buffer is too small for the new data, its size is typically doubled. If the requested amount has been retrieved, it returns successfully, trimming the buffer to the data actually retrieved.
If not, the timeout is checked; if it has elapsed, the function returns with a timeout error. Otherwise it schedules a wait for the remainder of the timeout period, and this wait can be interrupted by driver events when an error occurs or data arrives. Then it loops back.
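The growing-buffer part of that loop can be sketched in Python. This is only a simulation of the behavior described above, not NI's actual implementation; the function name and the chunk list standing in for data arriving from the driver are hypothetical:

```python
def read_with_growing_buffer(chunks, requested, initial=1000):
    """Simulate the growing read buffer described above (not NI's code).

    `chunks` stands in for data arriving from the driver. The buffer
    starts at min(initial, requested) bytes, doubles its capacity
    whenever incoming data would overflow it, and is trimmed to the
    bytes actually received at the end.
    """
    capacity = min(initial, requested)
    buf = bytearray()
    for chunk in chunks:
        take = chunk[:requested - len(buf)]   # never read past the request
        while len(buf) + len(take) > capacity:
            capacity *= 2                     # double on overflow
        buf.extend(take)
        if len(buf) >= requested:
            break                             # requested amount retrieved
    return bytes(buf)                         # trimmed to actual data
```

So asking for "too many" bytes mostly costs nothing up front; the buffer only grows as data actually arrives.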
04-23-2021 10:54 AM
I may have spoken a bit too soon. The changes seem to have made the connection more stable (the instrument doesn't drop communication anymore), but I'm getting no data on every other read. My code looks exactly like the example cbutcher posted, but I have a 500 ms wait. My data reads 300, 0, 300, 0, 300, 0, etc. I tried raising the baud rate and changing between LF and CR terminators; even with a 1000 ms wait I get the same result. If I add a VISA Clear after the write and read, it seems to be better, but I shouldn't need the VISA Clear for this to work properly.
04-23-2021 11:19 AM - edited 04-23-2021 11:24 AM
Right click on the Response string indicator on the front panel and enable '\' Codes Display.
Then tell us specifically what you see in this indicator when you read the 300 value and what you see in the 0 value case. Your instrument most likely sends more than just your "300\r" measurement result, most likely something like OK\r300\r or similar! So you would have to use two VISA Reads, one after the other, to read both terminated string elements.
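In text form, the two-read idea looks roughly like this Python sketch. The `read_terminated` helper is hypothetical; it only mimics what one termination-character-enabled VISA Read does to a buffered byte stream:

```python
def read_terminated(buf, term=b"\r"):
    """Pop one terminated message off the front of `buf` (a bytearray),
    mimicking a single terminated VISA Read (hypothetical helper)."""
    i = buf.find(term)
    if i < 0:                   # no terminator yet: a real read would time out
        raise TimeoutError("no termination character in buffer")
    msg = bytes(buf[:i])
    del buf[:i + len(term)]     # consume the message plus its terminator
    return msg

reply = bytearray(b"OK\r300\r")   # instrument sent two terminated elements
status = read_terminated(reply)   # b"OK"
value = read_terminated(reply)    # b"300"
```

A single read in this situation only ever returns the first element; the second one sits in the queue until the next read, which is exactly the alternating-values symptom.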
Since you have forgotten to tell us what instrument it is, we can't go and check ourselves and try to find the instrument manual. 😀
04-23-2021 11:34 AM
When I get the correct reading, I see 300.3012\n; when I get just a zero, I see only \n. Attached are a few photos of the VI and of the settings on the Lakeshore 340.
04-23-2021 06:48 PM
It seems like every time I get a "0" or \n response, that command might still be stored in the buffer, because when I switch commands using the same VI, I keep getting responses from the old command back for a while until it eventually switches to responding to the new command. For example, I changed to *IDN?\n and was still getting temperature responses for a while after making the switch. I'm baffled why this isn't working properly.
04-24-2021 03:39 AM - edited 04-24-2021 03:45 AM
That’s because your device seems to send back 300.123\n\n for some reason. It’s not clear why, and the manual says nothing about this. Check the instrument settings for what termination character you have configured under the RS-232 and/or Communication settings. This should be set to a single \n.
If that is what is set, you will have to call VISA Read twice in every loop, as I mentioned in an earlier post! One Read returns the response string and one \n; the second Read removes the superfluous \n from the read queue.
You might want to configure the termination explicitly as the first step after opening the instrument, to make sure your device uses the method your driver was programmed for.
04-24-2021 12:08 PM
Thank you for your help, rolfk!
I will have to try adding an extra read function to see what I can figure out. The problem I can see with this is that if the instrument responds properly for a few iterations (no \n\n), which does happen occasionally, it could throw off which read function is reading the data and which is reading the \n. I'll have to figure out a way to compare both read outputs to decide which one is correct and only display that output.
I double checked the 340 settings and made sure that its terminator in the serial settings is set to LF. If the instrument wants to send two terminators, I could try CR LF or LF CR as a setting option.
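One way to make the extra read robust against an only-occasional double \n, sketched in Python: instead of assigning fixed roles to a first and second read, discard empty messages and keep the first non-empty one. The `next_nonempty` wrapper is hypothetical and takes whatever terminated-read function you use:

```python
def next_nonempty(read, max_tries=3):
    """Call the terminated-read function `read` up to `max_tries`
    times and return the first non-empty message, so an occasional
    bare \n (an empty terminated message) is silently discarded."""
    msg = ""
    for _ in range(max_tries):
        msg = read()
        if msg:
            break
    return msg

# Fake instrument stream: a good reading, a stray empty message, more data.
replies = iter(["300.3012", "", "299.8551"])
read = lambda: next(replies)
first = next_nonempty(read)    # "300.3012"
second = next_nonempty(read)   # "299.8551" (the empty reply is skipped)
```

This way it no longer matters whether any particular response carried one terminator or two.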
04-25-2021 12:40 AM
@Nadweb wrote:
I double checked the 340 settings and made sure that its terminator in the serial settings is set to LF. If the instrument wants to send two terminators, I could try CR LF or LF CR as a setting option.
You can only have one termination character. If two options exist, and the device sometimes sends one, and sometimes sends the other, then you have to handle this yourself (and ideally, complain loudly and repeatedly to the manufacturer...).
If, on the other hand, it might send two characters at the end of a message (e.g. \r\n), as seems to be the case for you, terminate on the \n (the last character) and manually handle only the trailing \r (which is much simpler than avoiding termination characters entirely, as in the first case).
If you're getting extra, arbitrary-seeming \n characters, try to determine whether the messaging protocol includes any details about why this might happen (maybe it sends some message on a schedule that's getting in your way, and for some reason it's empty, etc.?). I'm taking a look at the manual now.
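Cleaning up the leftover \r after terminating on \n amounts to a one-line check, sketched here in Python with a hypothetical helper name:

```python
def strip_trailing_cr(msg: str) -> str:
    """After terminating the read on \n, a \r\n-terminated message
    leaves a lone \r at the end of the returned string; drop it and
    leave anything else untouched."""
    return msg[:-1] if msg.endswith("\r") else msg

strip_trailing_cr("300.3012\r")  # "300.3012"
strip_trailing_cr("300.3012")    # unchanged
```

Because the check is conditional, the same code works whether the instrument sent \r\n or just \n on any given response.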
04-25-2021 01:06 AM
In "Capture01.png", the message you're sending shows "KRDG? A\n" (presumably in \ view, although we can't see that) but there's a little diagonal arrow at the bottom right.
This arrow indicates there are parts of the string not visible.
I suspect you have some extra bits below that might have been accidentally left there from previous calls? (Or you're not in \ view, and you have end-of-line characters after the "\n" string, which in \ view becomes "\\n", i.e. backslash, then a literal n.)