10-10-2024 08:24 AM
Hey everyone,
I’m currently using VISA to connect to a machine, and now I’ve reached the point where I need to handle the calibration. However, I’m not exactly sure how to approach this part in LabVIEW.
Has anyone done calibration after setting up a VISA connection? Are there any recommended methods or examples that could help guide me through the process? I’d love to hear your thoughts or any useful tips on how I should get started.
Thanks a lot!
10-10-2024 08:45 AM
VISA is just a communication method.
The calibration of the machine is most likely done by sending/receiving commands to/from the instrument and following the procedure defined by the vendor.
Once you know the commands, use VISA to send and receive the messages.
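In LabVIEW you would do this with the VISA Write and VISA Read functions. Purely as an illustration of the message flow, here is the text equivalent using PyVISA. The resource string and the CAL commands are placeholders, not real commands; the actual calibration commands have to come from your instrument's manual.

```python
import pyvisa

rm = pyvisa.ResourceManager()
# Placeholder resource string: replace with the address of your own instrument.
inst = rm.open_resource("TCPIP0::192.168.0.10::INSTR")
inst.timeout = 5000  # ms

# *IDN? is the standard SCPI identification query; the calibration
# commands below are hypothetical and depend entirely on the vendor.
print(inst.query("*IDN?"))
inst.write("CAL:START")         # hypothetical vendor command
print(inst.query("CAL:STAT?"))  # hypothetical vendor query

inst.close()
```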
10-10-2024 08:48 AM
Nowhere close to enough information to give any real advice.
What device are you trying to communicate with? What communication bus are you using (RS-232, USB, Ethernet, etc.)? What do you mean by "calibration"? Is there a procedure out there for doing this calibration?
10-10-2024 09:12 AM
I’ve got a bit of a challenge here. I’ve already made a VI that connects to a display, so I have a good idea of how to use and interact with it. But now I’m working with a machine that doesn’t have a display at all. Instead, I’ll be using an auxiliary device to compare the machine’s values against the auxiliary device’s readings.
The problem is that I need to figure out how to calibrate the force reported by the machine, and I’m not sure where to start. Ideally, I want the values shown on the auxiliary device to match what the machine would display (if it had a display). It seems the machine is currently uncalibrated, and I’m a bit lost on how to proceed with this calibration.
And yes, I am using the RS-300, but I don't think I'll use it for calibration.
To clarify: I just want to understand what I should do when calibrating this new machine.
10-10-2024 10:31 AM
For calibration, you need a known value (your auxiliary device) and you compare it with the readings of the device you want to calibrate.
Take a set of measurements (minimum of 2), then calculate the coefficients.
Assuming that the signal is linear, you can use a linear regression algorithm to get your calibration factors.
Then you apply your calibration factors to the readings of your device:

calibrated reading: y = a*x + b

where
y is the calibrated value
a is the slope
b is the intercept
x is the uncalibrated value.
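A minimal sketch of this fit in Python/NumPy (LabVIEW code is graphical, but the Linear Fit VI does the same job). The numbers are made up for illustration; in practice the reference values come from your auxiliary device and the raw values from the machine.

```python
import numpy as np

# Reference readings from the auxiliary device (known-good values)
# and the matching raw readings from the machine under calibration.
reference = np.array([0.0, 1.0, 2.0, 3.0, 4.0])  # e.g. tons, auxiliary device
raw       = np.array([0.1, 1.2, 2.3, 3.4, 4.5])  # uncalibrated machine readings

# Least-squares linear fit: reference = a*raw + b
a, b = np.polyfit(raw, reference, 1)

def calibrate(x):
    """Apply the calibration factors to an uncalibrated reading."""
    return a * x + b

print(f"slope a = {a:.4f}, intercept b = {b:.4f}")
print(f"raw 2.3 -> calibrated {calibrate(2.3):.3f}")
```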
10-10-2024 01:00 PM
Yes, I already understand the first part; the problem is the second part.
I don’t know how to adjust the machine’s readings so they come closer to the correct value. From what I’ve read, the issue is that as the load increases (ton by ton), the absolute error in the graph changes.
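If the absolute error really does change with load, a single straight-line fit may not be enough. One common approach (my assumption here, not something stated in this thread) is multi-point calibration: record raw/reference pairs at several loads and interpolate piecewise-linearly between them. A minimal sketch, assuming you can collect such pairs:

```python
import numpy as np

# Calibration table: raw machine readings and the matching reference
# values from the auxiliary device, taken at several loads (illustrative).
raw_points = np.array([0.0, 1.1, 2.3, 3.6, 5.0])  # uncalibrated, must be increasing
ref_points = np.array([0.0, 1.0, 2.0, 3.0, 4.0])  # auxiliary-device readings (tons)

def calibrate(x):
    """Piecewise-linear correction: interpolate between calibration points."""
    return np.interp(x, raw_points, ref_points)

print(calibrate(2.95))  # a raw reading between two calibration points
```

The more calibration points you take, the better this tracks an error that grows ton by ton; with only two points it reduces to the linear fit described above.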