08-13-2024 06:59 AM
I am working through making a LabVIEW program to verify the phase reading on the 4 AI channels of the NI-9234 card. After perusing these forums for similar questions from over the years, I found a similar procedure and tried working with it. It works, sort of.
The issue I am having is that the phase reading is constantly changing between 0-360 degrees, while the test tolerance is less than 1 degree. What do I add in to snapshot the phase correctly? Thanks in advance for the help!
08-14-2024 08:00 AM
Phase related to what?
Usually you measure phase differences: one channel is your reference representing the excitation and another is the signal of interest, or you trigger your reading at a known time (phase) of your excitation.
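Since LabVIEW code can't be pasted as text, here is a minimal NumPy sketch of the idea: the absolute phase of each channel depends on when the acquisition started (which is why a single-channel phase readout wanders over 0-360°), but the *difference* between a channel and the reference is stable. The sample rate, tone frequency, block size, and simulated 0.05° skew below are all assumptions for illustration.

```python
import numpy as np

fs = 51_200          # sample rate in S/s (NI-9234 max rate; assumed here)
f0 = 1_000           # excitation frequency in Hz (assumed)
n = 4096             # block size chosen so f0 lands exactly on a DFT bin
t = np.arange(n) / fs

# Simulated data: reference channel and a channel of interest with a small skew
ref = np.sin(2 * np.pi * f0 * t)
sig = np.sin(2 * np.pi * f0 * t - np.deg2rad(0.05))

def phase_deg(x):
    """Phase (degrees) of the DFT bin nearest f0 (no window, coherent sampling)."""
    X = np.fft.rfft(x)
    k = round(f0 * len(x) / fs)
    return np.degrees(np.angle(X[k]))

# The per-channel phases depend on the acquisition start time,
# but the channel-to-channel difference does not.
diff = phase_deg(sig) - phase_deg(ref)
print(round(diff, 4))  # ≈ -0.05 degrees
```

In LabVIEW the same result comes out of the Extract Single Tone Information VI (or an FFT) applied to all channels of one simultaneous read, then subtracting the reference channel's phase.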
The spec says simultaneous sampling, with an interchannel phase mismatch of:
Phase (fin in kHz): fin * 0.045° + 0.04° maximum
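That spec line is just a linear function of the input frequency; a quick sketch of it as code (the function name is mine, not NI's):

```python
def phase_mismatch_spec(fin_khz: float) -> float:
    """Maximum NI-9234 interchannel phase mismatch in degrees, fin in kHz."""
    return fin_khz * 0.045 + 0.04

print(round(phase_mismatch_spec(1.0), 3))   # 0.085 degrees at a 1 kHz test tone
print(round(phase_mismatch_spec(10.0), 3))  # 0.49 degrees at 10 kHz
```

So a test tolerance tighter than the spec limit at your chosen test frequency is what the verification should compare against.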
I would use a common signal source with a low output impedance, drive all channels with the same signal, and check the phase differences (same cable length assumed).
If you want to drive it to some extent, you can do multiple measurements with swapped input channels and match the interchannel delays to some ns (or even fractions of a ns) ... (at 10 kHz, 1° is ~278 ns) and check if your 9234 has reproducible delays.
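The delay-to-phase conversion behind that ~278 ns figure is just delay × frequency × 360°; a two-line check:

```python
f = 10_000        # test frequency in Hz
delay = 278e-9    # interchannel delay in seconds

# One period at 10 kHz is 100 us, so 1 degree is 100 us / 360 ≈ 277.8 ns
phase = delay * f * 360
print(round(phase, 4))  # ≈ 1.0008 degrees
```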
08-14-2024 01:44 PM
Hi Henrik!
Yes, you are correct: the calibration manual asks to compare the difference between the 4 channels. I used an Agilent 33250 function generator paralleled into each channel for the source, and I modified the procedure (see attached) to allow for data collection on all 4 channels at once. I was able to snapshot a phase reading of all the channels at once (I think) and compare the difference to the spec (+/- 0.085°). Does that look like it would do it properly? I am very new to LabVIEW and learning it on the fly in our lab as new NI devices come in!