LabVIEW


Help with Calibration After Connecting via VISA

 

Hey everyone,

I’m currently using VISA to connect to a machine, and now I’ve reached the point where I need to handle the calibration. However, I’m not exactly sure how to approach this part in LabVIEW.

Has anyone done calibration after setting up a VISA connection? Are there any recommended methods or examples that could help guide me through the process? I’d love to hear your thoughts or any useful tips on how I should get started.

Thanks a lot!

0 Kudos
Message 1 of 6
(316 Views)

VISA is just a communication method.

The calibration of the machine is probably done by sending/receiving commands to/from the instrument and following the procedure defined by the vendor.

Once you know the commands, use VISA to send and receive the messages.
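As a rough illustration of the send/query/parse pattern (shown in Python since the exact LabVIEW VIs depend on your setup): the resource name and the `MEAS:FORC?` command below are made-up placeholders; substitute the commands from your instrument's manual.

```python
import re

# With real hardware you would do something like this (PyVISA):
#
#   import pyvisa
#   rm = pyvisa.ResourceManager()
#   inst = rm.open_resource("ASRL1::INSTR")   # hypothetical serial resource
#   reply = inst.query("MEAS:FORC?")          # hypothetical SCPI-style command
#
# The reply then has to be parsed into a number before you can compare it
# with your auxiliary device's reading:

def parse_force_reply(reply: str) -> float:
    """Extract the first numeric value from an instrument reply string."""
    m = re.search(r"[-+]?\d+(?:\.\d+)?(?:[eE][-+]?\d+)?", reply)
    if m is None:
        raise ValueError(f"no numeric value in reply: {reply!r}")
    return float(m.group())
```

In LabVIEW the same flow is VISA Write, VISA Read, then a string-to-number conversion on the reply.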

 

0 Kudos
Message 2 of 6
(303 Views)

Nowhere close to enough information to give any real advice.

 

What device are you trying to communicate with? What communication bus are you using (RS-232, USB, Ethernet, etc.)? What do you mean by "calibration"? Is there a procedure out there for doing this calibration?


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
0 Kudos
Message 3 of 6
(302 Views)

I’ve got a bit of a challenge here. I’ve already made a VI that connects to a display, so I have a good idea of how to use and interact with it. But now I’m working with a machine that doesn’t have a display at all. Instead, I’ll be using an auxiliary device to compare the values from the machine against the auxiliary device's readings.

The problem is, I need to figure out how to calibrate the force being shown by the machine, and I’m not sure where to start. Ideally, I want to make sure the values shown on the auxiliary device match what should be displayed on the machine (if it had a display). It seems like the machine is currently uncalibrated, and I’m a bit lost on how to proceed with this calibration.

 

And yes, I am using the RS-300, but I don't think I'll use it for calibration.

Just to clarify, I just want to understand what I should do when calibrating this new machine.

0 Kudos
Message 4 of 6
(282 Views)

For calibration, you need a known value (from your auxiliary device) to compare with the readings of the device you want to calibrate.

Take a couple of measurements (a minimum of 2), then calculate the coefficients.

 

Assuming that the signal is linear, you can use a linear regression algorithm to get your calibration factors. 

 

Then you apply your calibration factors to the readings of your device. 

 

calibrated reading: y = a·x + b

 

where

y is the calibrated value

a is the slope (angular coefficient)

b is the offset (linear coefficient)

x is the uncalibrated value.
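The steps above can be sketched in a few lines (Python shown for compactness; the measurement pairs are made up for illustration):

```python
def linear_calibration(raw, ref):
    """Least-squares fit of ref ≈ a*raw + b from paired measurements."""
    n = len(raw)
    sx, sy = sum(raw), sum(ref)
    sxx = sum(x * x for x in raw)
    sxy = sum(x * y for x, y in zip(raw, ref))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# made-up example: machine reads 0, 1, 2 while the auxiliary device reads 1, 3, 5
a, b = linear_calibration([0.0, 1.0, 2.0], [1.0, 3.0, 5.0])

# apply y = a*x + b to a new raw reading from the machine
calibrated = a * 1.5 + b
```

In LabVIEW, the Linear Fit VI (Mathematics → Fitting) computes the same slope and intercept.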

0 Kudos
Message 5 of 6
(268 Views)

Yes, I already understand the first part, but the problem is the second part.

 

I don’t know how to adjust the machine's readings so they get closer to the correct value. From what I've read, the issue is that as the load increases (ton by ton), the absolute error in the graph changes.
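When the absolute error changes with load, a single straight line may not fit well; one common approach is to fit a higher-order polynomial instead. A minimal sketch (the readings below are made-up numbers chosen to follow a quadratic, just for illustration):

```python
import numpy as np

# Hypothetical data: raw machine readings vs. reference (auxiliary) values.
# Here the error grows with load, so a degree-2 polynomial is fitted.
raw = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ref = np.array([0.1, 1.2, 2.5, 4.0, 5.7])  # made-up reference readings

coeffs = np.polyfit(raw, ref, 2)   # [c2, c1, c0] for c2*x^2 + c1*x + c0
calibrate = np.poly1d(coeffs)      # callable that applies the correction

corrected = calibrate(raw)         # corrected readings for the raw values
```

The same idea works in LabVIEW with the General Polynomial Fit VI; pick the lowest order that makes the residual error acceptably small across the full load range.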

0 Kudos
Message 6 of 6
(257 Views)