

Problem communicating with a Chroma LCR Meter using VISA

Solved!

Hi,

 

I cannot communicate with a Chroma LCR Meter (Model 11021) using VISA, yet I can communicate with the instrument using MAX or RealTerm via an RS-232 serial interface.

 

When I send the "*IDN?\n" SCPI command to the instrument, I get a response in both MAX and RealTerm, but I get no bytes back when I create a VI using VISA. I have attached a screenshot of the VI below. I have even inserted a delay between the VISA Write function and the VISA Read function, but this made no difference.
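For reference, here is roughly the transaction I'm attempting, sketched in PyVISA (Python) rather than LabVIEW, in case it helps to see it in text form. The resource name and baud rate below are placeholders, not the actual values from my setup:

```python
import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("ASRL1::INSTR")  # placeholder serial resource name
inst.baud_rate = 9600                    # placeholder; match the instrument
inst.write_termination = "\n"            # the SCPI command ends with a linefeed
inst.read_termination = "\n"

inst.write("*IDN?")                      # equivalent of the VISA Write
print(inst.read())                       # equivalent of the VISA Read
```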

 

Any help would be much appreciated.

 

Thanks,

Paul

 

Message 1 of 13

Do you have the "'\' Codes Display" option enabled for the string constant?

 

I would also recommend running NI I/O Trace while sending your commands in MAX and comparing that with what is sent from LabVIEW, to see if any other functionality is different. It could be that one of your initialization steps differs between the two.

Message 2 of 13

Hi,

 

Thank you for the reply.

 

I have opened NI I/O Trace and copied a screenshot of the trace below. I have highlighted in yellow the two sets of writes. You will notice that the instrument replies with the "Chroma...." string on line 2 when using MAX. However, nothing is returned in the VI on line 30.

 

What do you think is the reason for this? 

 

Thanks,

Paul

Message 3 of 13
If you look at the two writes (lines 1 and 28) you will see that they are different. Did you verify that the '\' Codes Display option is enabled for your string constant?
Message 4 of 13

Yes, I checked, and I didn't have that enabled. I enabled it, and the writes are now identical, but it still didn't work.

 

I then removed the VISA Open VI and, all of a sudden, it started to work with '\' Codes Display enabled. I went back and modified the VI to see which of the two changes made the difference, and now I cannot get it to work again.

Message 5 of 13

I managed to get it working only after:

 

1) Enabling the '\' Codes Display.

2) Putting a delay between the VISA Write and the VISA Read (see attached).

 

Why do I need to enable the '\' Codes Display to get it to work? Is LabVIEW altering the actual string that is sent, depending on whether this is enabled?

 

P.S. It actually works whether or not the VISA Open is present. I thought you needed to open a VISA session every time you set up a comms channel.

Message 6 of 13
Here is a knowledge article about '\' codes in LabVIEW: https://knowledge.ni.com/KnowledgeArticleDetails?id=kA00Z000000PAW3SAO&l=en-US

VISA Open is not necessary; the VISA Property Node at the beginning of your VI will open a session to the resource for you. In my opinion, it is still a good idea to use VISA Open (though I would put it before your property node) as an explicit reminder that you are opening a session for that resource, but that's personal preference.

However, make sure you keep the VISA Close, or the session will stay open indefinitely (or until you close LabVIEW, if you have the option to automatically close VISA sessions enabled).
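If it helps to see the '\' codes point in text form, here is a small Python sketch (an illustration, not LabVIEW itself) of what each display mode actually sends:

```python
# Normal display: typing \n produces two bytes, a literal
# backslash followed by the letter 'n'.
normal = "\\n"
print(list(normal.encode("ascii")))  # [92, 110]

# '\' Codes Display: \n is a single linefeed byte, which is the
# termination character the instrument is waiting for.
codes = "\n"
print(list(codes.encode("ascii")))   # [10]
```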
Message 7 of 13

Thanks again for the reply.

 

One last query: why do I have to put a delay between the Write and the Read, when none of the other examples I have come across require this?

 

Also, does MAX wait a certain amount of time after writing before it reads?

 

This also works in RealTerm (which is a serial comm package) without any issues.

Message 8 of 13
It is because you are checking the bytes available at the port and using that as the number of bytes to read. Without a delay, Bytes at Port will return 0, so the VISA Read will attempt to read 0 bytes and is essentially skipped.

Rather than putting in a 1-second delay, I would recommend enabling the termination character in the VISA Property Node at the start of the VI, setting the termination character to 10 (\n, the linefeed), and then setting the bytes-to-read input to a number larger than you would expect from your response message. The VISA Read will then return when the termination character is reached, rather than reading a specified number of bytes. This is how the operation is performed in MAX, and I believe the default bytes to read there is 1024.
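In text form, the termination-character approach looks roughly like this PyVISA (Python) sketch; the resource name is a placeholder:

```python
import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("ASRL1::INSTR")  # placeholder serial resource name
inst.write_termination = "\n"
inst.read_termination = "\n"             # enable LF (10) as the termination character

# No arbitrary delay needed: the read returns as soon as the LF
# arrives (or the VISA timeout expires), which mirrors MAX's behavior.
print(inst.query("*IDN?"))
```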
Message 9 of 13
Solution
Accepted by PaulJM

@PaulJM wrote:

 

This also works in RealTerm (which is a serial comm package) without any issues.


Because RealTerm is programmed differently from what you are trying to do here. It is a terminal program. It writes out bytes as fast as you type them in, and it reads bytes as fast as they come in, doing this in continuously running loops. It might be grabbing only 1 or 2 bytes at a time as they come in.

 

You don't have a terminal program in your VI. You have a communication protocol where you write a command in its entirety in a single VISA Write. Then you have a single VISA Read that reads all that has been received in the buffer.

 

You've made two major mistakes in your VI with respect to serial communication, and it took so many messages to fix them that I'm not sure you completely understand why your original code didn't work.

 

1. You didn't understand the difference between normal display and '\' codes display. When you type "\n" in normal display, you are sending 2 different bytes: a backslash followed by the letter "n". The device doesn't know what that means, and it doesn't mark the end of the command. So the device waits for the rest of the command, which never comes, and things time out. In '\' codes display, "\n" represents a single byte, the linefeed character. The device's protocol says that is the termination character, so when it sees it, the device knows the command is done and can proceed to process the command you sent it.

 

2. Bytes at Port. Too many NI examples use that node, and it is the wrong thing to use 99% of the time. It checks how many bytes have been received so far. Since you checked it immediately after writing the command, you didn't give the device any time to read the command, process it, and send out its complete response. Usually the value will be 0, and you proceed to read 0 bytes. When you put in the wait (arbitrarily chosen as 1 second in this case), you provided enough time for the device to do its job. The correct thing to do, if the device sends a termination character in its response (probably a linefeed, since that is what you appended to the command you wrote), is to configure the serial port to enable the termination character and make it a linefeed. Then, instead of putting in an arbitrary wait, just request X bytes from the read, where X is larger than the longest message you ever expect to receive. The VISA Read will stop and return the full message, up to and including the termination character, as soon as that linefeed byte has arrived. Now, instead of always waiting a full second, if the message arrives sooner, you get it sooner.
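If it helps to see both mistakes in text form, here is a PyVISA (Python) sketch contrasting what your VI was doing with the correct approach; the resource name is a placeholder:

```python
import time
import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("ASRL1::INSTR")  # placeholder serial resource name
inst.write_termination = "\n"

# What your VI did: check the bytes at the port immediately after
# writing. The reply has not arrived yet, so this is almost always 0,
# and the subsequent read returns nothing.
inst.write("*IDN?")
print(inst.bytes_in_buffer)              # typically 0

# Your workaround: wait an arbitrary second before reading.
time.sleep(1.0)
print(inst.read_bytes(inst.bytes_in_buffer))

# The correct approach: enable the linefeed termination character and
# let the read return as soon as the complete response has arrived.
inst.read_termination = "\n"
inst.write("*IDN?")
print(inst.read())
```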

 

 

Message 10 of 13