10-09-2024 10:14 AM
Hello, I'm using a divide-by-100 voltage divider to reduce the voltage being fed to the AIs of a PCIe-6353 DAQ to millivolts. As shown below, I'm using a differential scale of +/- 40 volts.
My question is, how do I utilize a "nominal range" of +/- 1 Vdc, which would yield an absolute accuracy of 180 uV per the table below? Is it as simple as changing my custom scale to +/- 1 volt?
If not, how is the AI range of the PCIe-6353 DAQ selected?
Lastly, how much drift can I expect as it pertains to the accuracy of the PCIe DAQ? While we routinely "calibrate" the DAQ's response using the flow below, I've been told that this DAQ has been in use for over 5 years and has never been officially calibrated since it was purchased.
Thanks in advance,
Tone
10-09-2024 07:50 PM - edited 10-09-2024 07:58 PM
I will not restate information that is contained in the device specification. I WILL say that you need to RTFM! Then we will be happy to assist you with interpreting specific information about your system's capabilities.
What links have you followed to the appropriate documents? What problems have you had translating that information into answers?
I will say this clearly!! If your DAQ device has not been compared to a NIST-traceable standard (i.e., been calibrated), it is NOT calibrated! The output may as well be in ADC counts per flying doughnut hole!
Get the device to a calibration lab!
10-10-2024 08:32 AM
It would be more constructive to answer the first part of the question, which is how to access the AI Absolute Accuracy of the PCIe-6353. If you don't know, simply don't reply. You provide no help to those of us looking for solutions when you needlessly attack an honest question.
Have a great day.
10-10-2024 09:25 AM
Another item to consider is just how accurate your divide-by-100 voltage divider actually is; that could be a huge contributor to your error budget.
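To put a rough number on that, here is a back-of-the-envelope sketch in Python (the 0.1% resistor tolerance and the 990k/10k values are assumptions for illustration, not your actual parts):

```python
# Worst-case ratio error of a divide-by-100 divider built from two resistors.
# ASSUMPTION: 0.1% tolerance parts, R1 = 990k / R2 = 10k (hypothetical values).
r1_nom, r2_nom = 990e3, 10e3                       # nominal resistances (ohms)
tol = 0.001                                        # 0.1% resistor tolerance

ratio_nom = r2_nom / (r1_nom + r2_nom)             # ideal 1/100 ratio
# Worst case: R2 at +tol while R1 sits at -tol
ratio_wc = (r2_nom * (1 + tol)) / (r1_nom * (1 - tol) + r2_nom * (1 + tol))

v_in = 40.0                                        # full-scale input (V)
error_uV = v_in * (ratio_wc - ratio_nom) * 1e6     # error at the DAQ terminals
print(f"Divider error at full scale: ~{error_uV:.0f} uV")   # roughly 800 uV
```

That ~800 uV from resistor tolerance alone (before temperature coefficient and loading effects) already dwarfs the ~180 uV absolute accuracy of the +/- 1 V range, so the divider really can dominate the budget.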
-AK2DM
10-10-2024 10:28 AM - edited 10-10-2024 10:40 AM
@Tonedope wrote:
It would be more constructive to answer the first part of the question, which is how to access the AI Absolute Accuracy of the PCIe-6353. If you don't know, simply don't reply. You provide no help to those of us looking for solutions when you needlessly attack an honest question.
Have a great day.
Access that information in the product specifications. I tried to say RTFM politely.
BUT WAIT! You really want someone to RTFM for you! Yet, if anyone did RTFM, they would tell you that, without the device having been seen at a calibration lab in over 5 years, there is NO absolute accuracy.
10-10-2024 02:07 PM
Sorry this doesn't answer your question directly, but the 6353 contains an internal 5V source, I believe. That can be used to do a self-calibration. Look in the DAQmx Advanced Calibration Palette to find the VI. This may help improve your accuracy without an external calibration.
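In case a text example helps, here is a minimal sketch of triggering that same self-calibration from the nidaqmx Python package instead of the LabVIEW VI (the device name "Dev1" and the Device.self_cal() binding are assumptions; check what your driver and package version actually expose):

```python
# Minimal sketch: run a DAQmx self-calibration against the onboard reference.
# ASSUMPTION: the device shows up as "Dev1" in MAX and your nidaqmx version
# provides Device.self_cal() (the Python wrapper for DAQmx self-calibration).
import nidaqmx.system

dev = nidaqmx.system.Device("Dev1")   # replace with your device name
dev.self_cal()                        # adjusts gain/offset against the internal reference
print("Self-calibration complete")
```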
Lastly, you can do an Allan variance measurement. This won't tell you accuracy, but it will tell you important noise characteristics. For example, if you assume you have only white noise, then theoretically you can average forever and keep improving your measurement. However, there is always some low-frequency drift. The Allan variance measurement tells you the optimal averaging time, that is, the averaging time at which the averaged measurement gives you the best answer. Beyond that time, low-frequency noise takes over and the averaged measurement gets worse.
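If you want to try that, here is a rough non-overlapping Allan deviation sketch in Python (the sample rate, averaging times, and the `logged_volts` array are placeholders; this is a quick noise survey, not a calibrated analysis):

```python
import numpy as np

def allan_deviation(samples, fs, taus):
    """Non-overlapping Allan deviation of a 1-D voltage record sampled at fs (Hz),
    evaluated at each averaging time in taus (seconds)."""
    out = []
    for tau in taus:
        m = int(round(tau * fs))                 # samples per averaging bin
        n_bins = len(samples) // m if m >= 1 else 0
        if n_bins < 2:
            out.append(np.nan)                   # not enough data at this tau
            continue
        bins = samples[: n_bins * m].reshape(n_bins, m).mean(axis=1)
        out.append(np.sqrt(0.5 * np.mean(np.diff(bins) ** 2)))
    return np.array(out)

# Example use: log a few minutes of a steady voltage, then look for the minimum.
# fs = 1000.0                        # acquisition rate (Hz)
# taus = np.logspace(-2, 2, 30)      # averaging times from 10 ms to 100 s
# adev = allan_deviation(logged_volts, fs, taus)
# The tau where adev is smallest is the optimal averaging time; beyond it,
# drift (low-frequency noise) makes longer averages worse, as described above.
```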
10-10-2024 10:05 PM
The max and min for DAQmx Create Channel are in scaled units. Given your divide-by-100 linear scale, any max/min from ± 50.001 up to ± 100 (V, scaled) will land you on the ± 1 V native range, since the next range down corresponds to ± 50 V scaled. You can use the DAQmx Channel property `Analog Input:General Properties:Advanced:Range:High` to read back the range in the native units of the device (i.e., volts at the terminals, unscaled by your linear scale).
This post describes setting min and max when using scales:
https://forums.ni.com/t5/LabVIEW/DAQmx-custom-scale-how-do-you-set-MIN-and-MAX-values/td-p/1277028
DAQmx then chooses the smallest device range that still accommodates your requested max and min, as sketched below.
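For a concrete illustration of the same idea, here is a sketch using the nidaqmx Python package (the scale name "Div100", the channel "Dev1/ai0", and an exactly 1/100 divider slope are assumptions; the LabVIEW DAQmx Create Channel VI and property node behave the same way):

```python
import nidaqmx
from nidaqmx.constants import UnitsPreScaled, VoltageUnits

# Custom linear scale: volts at the DAQ terminals -> volts before the
# divide-by-100 divider, i.e. scaled = 100 * prescaled.
nidaqmx.Scale.create_lin_scale(
    "Div100", slope=100.0, y_intercept=0.0,
    pre_scaled_units=UnitsPreScaled.VOLTS, scaled_units="Volts")

with nidaqmx.Task() as task:
    ch = task.ai_channels.add_ai_voltage_chan(
        "Dev1/ai0",
        min_val=-100.0, max_val=100.0,           # scaled units; anything above
        units=VoltageUnits.FROM_CUSTOM_SCALE,    # +/-50 V picks the +/-1 V native range
        custom_scale_name="Div100")
    # Read back the range DAQmx actually chose, in native (unscaled) volts:
    print(ch.ai_rng_low, ch.ai_rng_high)         # expect -1.0, 1.0
```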
Drift specifications are provided in terms of temperature, not time. The device is outside the recommended calibration interval (2 years).
If you want confidence that the device is performing within its AI absolute accuracy specifications, have it calibrated at an accredited calibration lab.