

LabVIEW DAQmx Calibration

Solved!

Hi everyone, hope you all don't mind if I ask a quick question here.

 

I am conducting a final-year project at university in which I have to write LabVIEW code that displays pressure on a chart from the pressure-sensing equipment I have set up. I have written the code and created a task using the DAQ Assistant: a voltage is read from my pressure sensor and automatically converted to Pascals through the custom scaling I have configured.
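For context, the scaling I set up is just a linear conversion from volts to Pascals. In Python terms it would look something like this (the slope and offset here are placeholders, not my actual sensor values):

```python
# Placeholder linear scale: volts from the sensor -> Pascals.
# The slope and offset are illustrative only; the real values come
# from the sensor's datasheet (e.g. 0-10 V corresponding to 0-100 kPa).
SLOPE_PA_PER_V = 10_000.0  # Pa per volt (assumed)
OFFSET_PA = 0.0            # Pa at 0 V (assumed)

def volts_to_pascals(volts):
    """Same kind of linear mapping a DAQ Assistant custom scale applies."""
    return SLOPE_PA_PER_V * volts + OFFSET_PA

print(volts_to_pascals(5.0))  # 5 V -> 50000.0 Pa with these numbers
```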

 

I am, however, having trouble with the calibration of my task. I have been using the tutorial linked below to work my way through the process, but I am finding the explanations too general to follow because this is my first experience with the software.

 

https://knowledge.ni.com/KnowledgeArticleDetails?id=kA00Z000000P80ZSAS

 

Specifically, my questions about the process are as follows:

 

The first question relates to part 5 of the tutorial linked above. Are the samples to be averaged and the rate simply the same settings as those I have chosen in my timing settings? If not, what do I base these figures on?

 

My next question relates to part 7 of the linked tutorial. When selecting a reference value to pair with the uncalibrated value, should I choose reference values that coincide with the maximum and minimum readable values of the pressure sensor, or base them on some other baseline?

 

 

I can offer more information on my setup if it is needed. I apologise for troubling you all with my questions; any help or pointers will be greatly appreciated.

 

Thanks,

 

Corey McPhail

Message 1 of 3
Solution
Accepted by topic author coreym67

Hi Corey,

 

I am not the most familiar with the calibration wizard, but hopefully I can help with the answers to your questions. 

 

For part 5, the number of samples is related to your sample rate by the Nyquist theorem, so you want at least 2× as many samples as the rate; e.g. at 100 kS/s you would take 200 kS. However, it is suggested that you take 10× the samples, so for 100 kS/s you would set it to 1 MS. The wizard essentially uses these settings to set up the DAQ task and take measurements, so it is probably best to make the sample rate and number of samples match those of your application.
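As a quick sanity check on those numbers (purely arithmetic, shown here in Python):

```python
# Rule of thumb from above: sample count as a multiple of the sample rate.
rate = 100_000                 # 100 kS/s, the example rate
min_samples = 2 * rate         # Nyquist-style 2x minimum -> 200 kS
suggested_samples = 10 * rate  # suggested 10x -> 1 MS (1,000,000 samples)
print(min_samples, suggested_samples)
```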

 

For part 7, you are pretty much correct. The value you enter is the expected value of the measurement, and the wizard compares it to the value it is actually reading. So if you had a simple DC power supply set to 5 V, that would be your reference. The Uncalibrated column might then show, say, 4.7 V, and the wizard will attempt to correct that 0.3 V difference.
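Under the hood, the correction is essentially a linear (gain and offset) fit between your reference values and the uncalibrated readings. Here is a rough sketch with made-up numbers around that 5 V example (just the idea, not the wizard's actual algorithm):

```python
# Two illustrative calibration points: (uncalibrated reading, reference value).
points = [(0.1, 0.0), (4.7, 5.0)]

(u1, r1), (u2, r2) = points
gain = (r2 - r1) / (u2 - u1)  # scale correction
offset = r1 - gain * u1       # zero correction

def calibrate(uncal):
    """Map an uncalibrated reading onto the reference scale."""
    return gain * uncal + offset

print(calibrate(4.7))  # the 4.7 V reading is corrected back to 5.0 V
```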

 

I would suggest trying it out a couple of times and getting familiar with the outcomes. 

 

Respectfully,

Ben H.
Systems Engineer
National Instruments
Message 2 of 3

Hi Ben,

 

Thank you for the reply. I will give it a try when I next get access to the computer with LabVIEW and work through the calibration. Your explanation has already helped me come to a much more solid understanding of the process.

 

Thanks again for your help.

 

Kind Regards

 

Corey McPhail

Message 3 of 3