From Friday, April 19th (11:00 PM CDT) through Saturday, April 20th (2:00 PM CDT), 2024, ni.com will undergo system upgrades that may result in temporary service interruption.
We appreciate your patience as we improve our online experience.
08-02-2019 05:31 AM
Hi,
I cannot communicate with a Chroma LCR Meter (Model 11021) using VISA, yet I can communicate with the instrument using MAX or RealTerm via an RS-232 serial interface.
When I send the "*IDN?\n" SCPI command to the instrument I get a response in both MAX and RealTerm, but I get no bytes back when I query it from a VI using VISA. I have attached a screenshot of the VI below. I even inserted a delay between the VISA Write function and the VISA Read function, but this made no difference.
Any help would be much appreciated.
Thanks,
Paul
08-02-2019 08:27 AM
Do you have the "'\' Code Display" option enabled for the string constant?
I would also recommend using NI I/O Trace when sending your commands in MAX and comparing what is being sent in LabVIEW to see if any other functionality is different. It could be that one of your initialization steps is different between the two.
08-02-2019 09:43 AM
Hi,
thank you for the reply.
I opened NI I/O Trace and have attached a screenshot of the trace below. I have highlighted in yellow the two sets of writes. You will notice that the instrument replies with the "Chroma..." string on line 2 when using MAX. However, nothing is returned to the VI on line 30.
What do you think is the reason for this?
Thanks,
Paul
08-02-2019 10:16 AM
No, I didn't have that enabled. I enabled it, and the writes are now identical, but it still didn't work.
I then removed the VISA Open VI and it suddenly started working with the '\' codes display enabled. I went back and modified the VI to see which of the two changes made the difference, and now I cannot get it to work again.
08-02-2019 10:29 AM
I managed to get it working only after:
1) Enabling the '\' codes display.
2) Putting a delay between the VISA Write and VISA Read functions (see attached).
Why do I need to enable the '\' codes display to get it to work? Is LabVIEW altering the actual string that is sent depending on whether this is enabled or not?
P.S. It actually works whether or not the VISA Open is present. I thought you needed to open a VISA session every time you set up a comms channel.
08-02-2019 11:48 AM
Thanks again for the reply.
One last query: why do I have to put a delay between the Write and the Read, when none of the other examples I have come across require this?
Also, I don't know whether MAX waits a certain amount of time after writing before it reads.
This also works in RealTerm (which is a serial comm package) without any issues.
08-02-2019 12:33 PM
@PaulJM wrote:
This also works in RealTerm (which is a serial comm package) without any issues.
Because RealTerm is programmed differently from what you are trying to do here. It is a terminal program: it writes out bytes as fast as you type them and reads bytes as fast as they come in, in continuously running loops. It might be grabbing only 1 or 2 bytes at a time as they arrive.
You don't have a terminal program in your VI. You have a communication protocol where you Write a command in its entirety in a single Write. Then you have a single Read that will Read all that has been received in the buffer.
You've made two major mistakes in your VI with respect to serial communication, and it took so many messages to fix them that I'm not sure you completely understand why your original code didn't work.
1. You didn't understand the difference between normal display and '\' codes display. When you type "\n" in normal display, you are sending two different bytes: a backslash followed by the letter "n". The device doesn't know what that means, and it doesn't mark the end of the command, so the device waits for the rest of a command that never comes and things time out. In '\' codes display, "\n" represents a single byte, the linefeed character. The device's protocol says that is the termination character, so when it sees that byte it knows the command is complete and it can proceed to process it.
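Since LabVIEW is graphical, here is a plain-Python sketch (not the poster's VI) of the byte-level difference point 1 is describing. The "*IDN?" command string comes from the thread; everything else is just illustration:

```python
# Normal display: the user literally typed a backslash and the letter 'n',
# so two extra bytes (0x5C, 0x6E) go out the wire after "*IDN?".
normal_display = b"*IDN?\\n"

# '\' codes display: "\n" is interpreted as the single linefeed byte (0x0A).
codes_display = b"*IDN?\n"

print(len(normal_display))          # 7 bytes
print(len(codes_display))           # 6 bytes
print(normal_display[-1] == 0x0A)   # False - no termination character sent
print(codes_display[-1] == 0x0A)    # True  - instrument sees end of command
```

Only the second form ends with the linefeed the instrument is waiting for, which is why the command times out in normal display.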
2. Bytes at Port. Too many NI examples use that node, and it is the wrong thing to use 99% of the time. It checks how many bytes have been received so far. Since you check it immediately after writing the command, you don't give the device any time to read the command, process it, and send out its complete response. The value will usually be 0, and you proceed to read 0 bytes. When you put in the wait (arbitrarily set to 1 second in this case), you provide enough time for the device to do its job. The correct approach, if the device ends its response with a termination character (probably a linefeed, since that is what you append to the command you send), is to configure the serial port to enable the termination character and set it to a linefeed. Then don't put in an arbitrary wait; just request X bytes from the read, where X is larger than the longest message you ever expect to receive. The VISA Read will stop and return the full message, up to and including the termination character, as soon as that linefeed byte arrives. Now, instead of always waiting a full second for a message, you get it as soon as it arrives.
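The termination-character behaviour described above can be sketched in plain Python (again, not LabVIEW; the function name, buffer contents, and `io.BytesIO` stand-in for the serial receive buffer are all illustrative):

```python
import io

def read_until_term(stream, term=b"\n", max_bytes=4096):
    """Read up to max_bytes, but return as soon as the termination
    character arrives -- the behaviour VISA Read gives you when the
    termination character is enabled on the serial session."""
    out = bytearray()
    while len(out) < max_bytes:
        b = stream.read(1)
        if not b:           # nothing more available
            break
        out += b
        if b == term:       # full message received; no fixed delay needed
            break
    return bytes(out)

# Simulated instrument response sitting in the receive buffer.
buffer = io.BytesIO(b"Chroma,11021,0,1.00\nEXTRA")
reply = read_until_term(buffer)
print(reply)  # b'Chroma,11021,0,1.00\n'
```

Note the read stops at the linefeed rather than after an arbitrary delay, and any bytes after the terminator are left in the buffer for the next read, which is exactly why the 1-second wait becomes unnecessary.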