Signal Conditioning


analog input DAQ impedance matching phase shift

Hi,

 

I am using two analog inputs of a USB-6366 DAQ to measure the voltage and current signals across a DUT (an air capacitor with C = 100 pF). Sample rate = 2 MHz, number of samples = 1M. Frequencies 350 Hz - 1950 Hz.
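For reference, the acquisition corresponds to something like the following Python/nidaqmx sketch (the device name "Dev1" and the channel mapping are placeholders only):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

RATE = 2_000_000        # 2 MS/s sample clock
N_SAMPLES = 1_000_000   # 1 M samples per channel (0.5 s record)

with nidaqmx.Task() as task:
    # ai0 = voltage channel, ai1 = current-sense channel (placeholder mapping)
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    task.ai_channels.add_ai_voltage_chan("Dev1/ai1")
    task.timing.cfg_samp_clk_timing(RATE,
                                    sample_mode=AcquisitionType.FINITE,
                                    samps_per_chan=N_SAMPLES)
    v_samples, i_samples = task.read(number_of_samples_per_channel=N_SAMPLES)
```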

Because my voltage divider is currently out of order, I'm measuring the voltage with a 100:1 test probe (input impedance 100 MOhm // 5 pF), with a 1 MOhm resistor connected in parallel with the DAQ channel input to lower the DAQ's input impedance. I used the 1 MOhm resistor because the test probe worked well with my oscilloscope, which has a 1 MOhm input impedance, so I assumed it would be sufficient here as well.

I seem to get the correct RMS values, but I am measuring a phase shift between the two signals that is > 90°, whereas I'm expecting values around 89.89°. Does this mean that my impedance matching using the 1 MOhm resistor is incorrect? Are there other ways to improve this?
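For reference, a minimal sketch of how such a channel-to-channel phase can be extracted from the FFT bin at the excitation frequency (synthetic data only, just illustrating the expected ~89.89°; the helper name is made up):

```python
import numpy as np

def phase_between(a, b, fs, f0):
    """Phase of signal a relative to signal b at frequency f0, in degrees."""
    n = len(a)
    k = np.argmin(np.abs(np.fft.rfftfreq(n, d=1.0 / fs) - f0))  # bin nearest f0
    win = np.hanning(n)                                          # limits spectral leakage
    A = np.fft.rfft(a * win)[k]
    B = np.fft.rfft(b * win)[k]
    return np.degrees(np.angle(A) - np.angle(B))

# Synthetic check: two 1 kHz tones 89.89 deg apart, sampled at 2 MS/s
fs, f0, n = 2_000_000, 1_000, 1_000_000
t = np.arange(n) / fs
v = np.sin(2 * np.pi * f0 * t)                      # "voltage"
i = np.sin(2 * np.pi * f0 * t + np.radians(89.89))  # "current", leading
print(phase_between(i, v, fs, f0))                  # ~ 89.89
```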

 

Thanks

Message 1 of 2

Have you compensated/adjusted the probe capacitance to your input impedance?

 

Have a look at the schematic of your probe; you will find n 'unknown' parts.

 

Now draw all the parts (plus parasitic impedances?) into one schematic together with your DUT.
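As an illustration of what that schematic tells you, a quick sketch of the phase such a network produces; all component values below are assumptions, take the real ones from the probe schematic and the 6366 specs:

```python
import numpy as np

def zpar(*zs):
    """Parallel combination of impedances."""
    return 1.0 / sum(1.0 / z for z in zs)

def divider_phase(f, r1=99e6, c1=1.25e-12, r2=1e6, c2=120e-12):
    """Phase (deg) of the divider H = Z2 / (Z1 + Z2) at frequency f.
    r1, c1: probe series resistance and compensation capacitance (assumed)
    r2, c2: 1 MOhm termination in parallel with cable + DAQ input capacitance (assumed)
    The divider is frequency- and phase-flat only when r1*c1 == r2*c2."""
    w = 2 * np.pi * f
    z1 = zpar(r1, 1 / (1j * w * c1))
    z2 = zpar(r2, 1 / (1j * w * c2))
    return np.degrees(np.angle(z2 / (z1 + z2)))

for f in (350, 1000, 1950):
    print(f, divider_phase(f))   # fractions of a degree with these example values
```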

 

Do some measurements with known DUTs and calculate the unknown impedances.

 

....

Guess why serious LCR meters use four channels and open/short calibration 😉
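For completeness, the open/short correction itself is only a couple of lines once the fixture has been measured open and shorted (a sketch, assuming the usual series-residual plus parallel-stray fixture model; all numbers below are made up):

```python
import numpy as np

def open_short_corrected(z_meas, z_open, z_short):
    """Open/short-compensated DUT impedance for a fixture modelled as a
    residual series impedance plus a stray admittance in parallel with the DUT."""
    return (z_meas - z_short) / (1.0 - (z_meas - z_short) / (z_open - z_short))

# Toy check: a 100 pF DUT seen through 2 ohm residual series impedance
# and 10 pF stray parallel capacitance
f = 1_000.0
w = 2 * np.pi * f
z_dut = 1 / (1j * w * 100e-12)

def through_fixture(z):                       # what the instrument actually sees
    return 2.0 + 1 / (1 / z + 1j * w * 10e-12)

z_meas  = through_fixture(z_dut)
z_open  = 2.0 + 1 / (1j * w * 10e-12)         # fixture empty
z_short = 2.0                                 # fixture shorted at the DUT plane
print(open_short_corrected(z_meas, z_open, z_short))   # recovers z_dut
```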

 

Greetings from Germany
Henrik

LV since v3.1

“ground” is a convenient fantasy

'˙˙˙˙uıɐƃɐ lɐıp puɐ °06 ǝuoɥd ɹnoʎ uɹnʇ ǝsɐǝld 'ʎɹɐuıƃɐɯı sı pǝlɐıp ǝʌɐɥ noʎ ɹǝqɯnu ǝɥʇ'


Message 2 of 2