LabVIEW


AI Absolute Accuracy of the PCIe-6353

Hello, I'm using a divide-by-100 voltage divider to reduce the voltage being fed to the AI channels of a PCIe-6353 DAQ to millivolts. As shown below, I'm using a differential custom scale of ±40 volts.

 

 

[Image: Tonedope_1-1728484874455.png: DAQmx custom scale configuration, ±40 V]

My question is: how do I utilize a "nominal range" of ±1 VDC, which would yield an absolute accuracy of 180 µV per the table below? Is it as simple as changing my custom scale to ±1 volt?

If not, how is the AI range of the PCIe-6353 DAQ selected?

 

[Image: Tonedope_2-1728484920109.png: AI absolute accuracy table from the PCIe-6353 specifications]

 

Lastly, how much drift can I expect as it pertains to the accuracy of the PCIe DAQ? While we routinely "calibrate" the DAQ's response using the flow below, I've been told that this DAQ has been in use for over 5 years and has never been officially calibrated since it was purchased.

 

[Image: Tonedope_3-1728486538409.png: in-house calibration flow]

 

Thanks in advance,

 

Tone

 

Message 1 of 7

I will not restate information that is contained in the device specification. I WILL say that you need to RTFM! Then we will be happy to assist you with interpreting specific information about your system's capabilities.

What links have you followed to the appropriate documents? What problems have you had translating that information into answers?

 

I will say this clearly: if your DAQ device has not been compared to a NIST-traceable standard (i.e., been calibrated), it is NOT calibrated! The output may as well be in ADC counts per flying doughnut hole!

 

Get the device to a calibration lab!


"Should be" isn't "Is" -Jay
Message 2 of 7

What would be more constructive than your observation is an answer to the first part of the question: how to find the AI absolute accuracy of the PCIe-6353. If you don't know, simply do not reply. You provide no help to those of us looking for solutions when you needlessly attack an honest question.

 

Have a great day.

Message 3 of 7

Another item to consider is just how accurate your divide-by-100 voltage divider actually is; it could be a huge contributor to your error budget. A rough worst-case estimate is sketched below.
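For a sense of scale, here is a minimal back-of-the-envelope sketch in Python; the 99k/1k resistor values, 0.1% tolerance, and 30 V input are all assumptions for illustration, not values from this thread:

```python
# Worst-case ratio error of a divide-by-100 resistive divider.
# Resistor values, 0.1% tolerance, and the 30 V input are assumed.
R_TOP, R_BOT = 99e3, 1e3      # nominal values giving Vout = Vin / 100
TOL = 0.001                   # 0.1% resistor tolerance (assumed)

nominal_ratio = R_BOT / (R_TOP + R_BOT)   # exactly 1/100
# Worst case: top resistor at -tol, bottom resistor at +tol
worst_ratio = (R_BOT * (1 + TOL)) / (R_TOP * (1 - TOL) + R_BOT * (1 + TOL))

v_in = 30.0
error_at_input = v_in * abs(worst_ratio - nominal_ratio) / nominal_ratio
print(f"Divider error referred to the input: {error_at_input * 1e3:.1f} mV")
```

Under those assumptions the divider alone contributes roughly 60 mV referred to the input, while 180 µV of DAQ absolute accuracy scales to only 18 mV through the ×100 divider, so the divider tolerance can easily dominate.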

 

-AK2DM

~~~~~~~~~~~~~~~~~~~~~~~~~~
"It’s the questions that drive us.”
~~~~~~~~~~~~~~~~~~~~~~~~~~
Message 4 of 7

@Tonedope wrote:

What would be more constructive than your observation is an answer to the first part of the question: how to find the AI absolute accuracy of the PCIe-6353. If you don't know, simply do not reply. You provide no help to those of us looking for solutions when you needlessly attack an honest question.

 

Have a great day.


Access that information in the product specifications. I tried to say RTFM politely.

 

BUT WAIT! You really want someone to RTFM for you! Yet if anyone did RTFM, they would tell you that, with the device not having been seen by a calibration lab in over 5 years, there is NO absolute accuracy.


"Should be" isn't "Is" -Jay
Message 5 of 7

Sorry, this doesn't answer your question directly, but the 6353 contains an internal 5 V source, I believe. That can be used to do a self-calibration. Look in the DAQmx Advanced Calibration palette to find the VI. This may help improve your accuracy without an external calibration.
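If you're scripting rather than working in LabVIEW, here is a minimal sketch of the same self-calibration using the nidaqmx Python API; "Dev1" is a placeholder device name, and it assumes your installed nidaqmx version exposes Device.self_cal() (the underlying driver call is DAQmxSelfCal):

```python
import nidaqmx.system

# "Dev1" is a placeholder; substitute your device name from NI MAX.
# Assumption: this nidaqmx build exposes Device.self_cal(), which wraps
# the driver's DAQmxSelfCal. Let the device warm up before running it.
device = nidaqmx.system.Device("Dev1")
device.self_cal()
print("Self-calibration complete")
```

Self-calibration only corrects against the onboard reference; it is not a substitute for an external calibration.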

Lastly, you can do an Allan variance measurement. This won't tell you accuracy, but it will tell you important noise characteristics. For example, if you assume you have only white noise, then theoretically you can average forever and keep improving your measurement. However, there is always some low-frequency drift. The Allan variance measurement tells you the optimal averaging time, that is, the averaging window that gives you the best answer. Beyond that time, low-frequency noise takes over and a longer average actually makes the measurement worse. A rough sketch of the computation follows.
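As a sketch of that computation (assuming you've already acquired a long, evenly sampled voltage record into a NumPy array; the sample rate and tau values below are placeholders):

```python
import numpy as np

def allan_deviation(samples, sample_rate, taus):
    """Non-overlapping Allan deviation of an evenly sampled voltage record."""
    results = []
    for tau in taus:
        m = int(tau * sample_rate)            # samples per averaging bin
        if m < 1 or 2 * m > len(samples):
            continue                          # tau too short or too long
        n_bins = len(samples) // m
        bin_means = samples[: n_bins * m].reshape(n_bins, m).mean(axis=1)
        # Allan variance: half the mean squared difference of adjacent bin means
        avar = 0.5 * np.mean(np.diff(bin_means) ** 2)
        results.append((tau, np.sqrt(avar)))
    return results

# Usage sketch: the tau with the minimum deviation is the optimal averaging time.
# data = np.asarray(task.read(number_of_samples_per_channel=600_000))  # acquired elsewhere
# for tau, adev in allan_deviation(data, sample_rate=1000.0, taus=[0.01, 0.1, 1, 10, 100]):
#     print(f"tau = {tau:7.2f} s   Allan deviation = {adev:.3e} V")
```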

Message 6 of 7

The max and min for DAQmx Create Channel are in scaled units. Given your ×100 linear scale, setting the max/min magnitude anywhere between 50.001 and 100 (scaled volts) selects the device's ±1 V native range, since the next range down (±0.5 V native) only covers up to 50 scaled volts. You can use the DAQmx Channel property `Analog Input:General Properties:Advanced:Range:High` to read back the range in the native units of the device (i.e., volts, unscaled by your linear scale).

This post describes setting min and max when using scales:
https://forums.ni.com/t5/LabVIEW/DAQmx-custom-scale-how-do-you-set-MIN-and-MAX-values/td-p/1277028

DAQmx coerces to the smallest device range that accommodates your max and min; a minimal sketch of this behavior is below.
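This sketch uses the nidaqmx Python API and assumes the scale name "Div100", the placeholder channel "Dev1/ai0", and a ×100 slope matching your divider; it illustrates the coercion behavior rather than reproducing anything from this thread:

```python
import nidaqmx
from nidaqmx.scale import Scale
from nidaqmx.constants import VoltageUnits

# Assumed linear scale matching a divide-by-100 divider:
# scaled volts = 100 * device (prescaled) volts.
Scale.create_lin_scale("Div100", slope=100.0)

with nidaqmx.Task() as task:
    # min_val/max_val are in SCALED units. Anything in the 50.001..100
    # scaled-volt range requests more than 0.5 V and at most 1 V at the
    # device, so DAQmx coerces to the +/-1 V native range.
    task.ai_channels.add_ai_voltage_chan(
        "Dev1/ai0",                       # placeholder physical channel
        min_val=-100.0,
        max_val=100.0,
        units=VoltageUnits.FROM_CUSTOM_SCALE,
        custom_scale_name="Div100",
    )
    # Read back the coerced range in the device's native volts.
    print("Native range high:", task.ai_channels[0].ai_rng_high)  # expect  1.0
    print("Native range low: ", task.ai_channels[0].ai_rng_low)   # expect -1.0
```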

Drift specifications are provided in terms of temperature, not time. The device is outside the recommended calibration interval (2 years). 

[Image: dsbNI_0-1728615502895.png: temperature drift and calibration interval specification]

If you want confidence that the device is performing within its AI absolute accuracy specifications, have it calibrated at an accredited calibration lab.

 

Doug
NI Sound and Vibration
Message 7 of 7