Here's the short version of my question, with probably a few built-in assumptions:
I'm using an NI PCIe-6321 DAQ. I used a signal generator to send a 10 V square wave and measured through an AI channel on the DAQ. The result shows the signal going from very near zero (~0.07 V) to 10.75 V, not 10 V. Why?
Longer version: A while back I took measurements using a torque transducer with a Data Translation DAQ on a different computer. The measurements matched theory well, so I trust them. Fast forward: I repeat the measurements with this new NI DAQ and a new computer, and there's about a factor-of-two difference in the results. Since I have theory to compare to, I don't trust the NI results. I simultaneously recorded on the old setup just as above, and the DT DAQ shows a ~0.07 to 10 V square wave. In other words, same minimum, different maximum. What's going on?
Is the channel setup in Differential mode when it should be Referenced Single Ended? Are the signal returns connected to AI GND?
It is set as single ended and the signal includes a ground wire going to AI GND. I also ran the self-calibration in NI MAX just for good measure. I also recorded the DAQ's own 5 V signal and got an average of about 5.025 V, so pretty good. But I still see the same problem with the 10 V square wave signal.
When I put a multimeter right on the signal generator, it does in fact read 10.75 V, so it appears that the NI DAQ is correct. What's weirder is that when I put the multimeter on the screw terminal of the DT DAQ, it reads 10.73 V. My guess is that the DT DAQ is saturating at 10 V and that's why the two DAQs give different results. If this is true, that leaves a larger question that I'm still working on:
What change in my setup (see the torque transducer mentioned above) caused the results to differ by about a factor of two? Keep in mind, I can compare to theory, and the theory matches the results from the DT DAQ setup, not the NI DAQ setup. If anyone thinks further details on the two setups would help, let me know. Thanks.
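The saturation guess above is easy to model: an ADC driven past its configured input range just reports the range limit. A minimal sketch, assuming the DT DAQ's range tops out at ±10 V (that range value is my assumption, not something confirmed in the thread):

```python
def clip_to_range(v, v_min=-10.0, v_max=10.0):
    """Model an ADC saturating at its input range limits (±10 V assumed)."""
    return max(v_min, min(v_max, v))

print(clip_to_range(10.75))  # a 10.75 V input would be reported as 10.0 V
print(clip_to_range(0.07))   # within range, passes through unchanged
```

That would explain why both meters agree on ~10.75 V at the terminals while the DT DAQ records exactly 10 V.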
What is the output impedance of your generator?
What is the input impedance of the 'old' data translation?
(What is ... ) the 6321 ... >10 GOhm ... (one channel, DC 😉 )
Got the idea?
When not sending a signal, the output impedance of the generator is about 7.5 MOhm. The input impedance of the DT DAQ is 10 MOhm, and of the NI DAQ it is >10GOhm.
I don't get the idea, though. From what I understand, because the impedances involved in my torque transducer setup are relatively small (<100 Ohm), they would make minuscule differences, nothing like the factor of ~2 that I'm seeing.
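For what it's worth, the impedances quoted above can be plugged into a simple voltage divider to see how much each DAQ loads the source: V_measured = V_source · R_in / (R_in + R_src). A minimal sketch using the numbers from this thread (treating the generator's 7.5 MOhm off-state reading as its effective source impedance is an assumption, and the NI figure is only a lower bound):

```python
def divider_ratio(r_src, r_in):
    """Fraction of the source voltage that appears at the DAQ input."""
    return r_in / (r_in + r_src)

R_SRC = 7.5e6   # generator impedance measured with no signal (assumption: applies when driving)
R_DT  = 10e6    # DT DAQ input impedance, 10 MOhm
R_NI  = 10e9    # NI 6321 input impedance, >10 GOhm (lower bound)

for name, r_in in [("DT DAQ", R_DT), ("NI DAQ", R_NI)]:
    print(f"{name}: sees {divider_ratio(R_SRC, r_in):.3f} of the source voltage")
```

With those numbers the DT DAQ would see only about 0.57 of the source voltage while the NI DAQ sees essentially all of it, which is in the neighborhood of a factor of two. Whether that off-state impedance reading is representative of the generator while it is actually driving a signal is the open question.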