Is there any way to calibrate the 50 and 75 ohm outputs on the PXI-5402 to be closer to nominal?
I have measured the output impedance on an Agilent 3458A and the 50 ohm impedance is about 50.497 ohms. Using FGEN-SFP 2.4.5, when I connect the 5402 to the 50 ohm input on the Fluke 5790A AC Measurement Standard, I find that the 5402 does not meet manufacturer's specs for Amplitude Accuracy. I have the output impedance at 50 ohms and load impedance set to "same as output."
It does, however, meet manufacturer's spec if I set the Output Impedance to 50 ohms, the Load Impedance to ">1 MOhm", and connect to the High Impedance input on the 5790A.
It also meets the spec when I use a 50 ohm termination that is close to 50.497 ohms.
Note that I could not find a specification for the 50 ohm output impedance. Do I need to send this in for repair? Thanks for your help.
Could you be a little more specific about how the 5402 is not meeting spec? In particular:
1) At what frequency and amplitude are you measuring?
2) What is the amplitude indicated on the Fluke?
3) When you say it's meeting spec with a 50.497 ohm termination, what does that termination consist of and what are the numbers you read (and how are you measuring the voltage)? How do you know what the impedance of this termination is at your measuring frequency?
The reason I ask these questions is that it may be that error tolerances are piling up. The Fluke has a certain accuracy specification, and so does the 5402. Are the results outside the combination of the two limits?
Also, something more complex may be happening. Note that the Fluke has a specified accuracy given that the source is a perfect 50 ohms. Similarly, the 5402 has a specified accuracy given that the load is a perfect 50 ohms. If either instrument varies from a perfect 50 ohms, the other is likely to indicate incorrectly. And it's also possible that both deviate from 50 ohms. In RF we call this mismatch error, and that error is calculable if the worst-case SWRs of the source and load are known. We do have the SWR spec for the Fluke (1.05, indicating that the highest output impedance is 52.5 ohms and the lowest is 47.6 ohms); unfortunately there is no SWR or output impedance spec over frequency for the 5402. Nonetheless, we can take a stab at the error if we look at your numbers.
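The SWR arithmetic above can be sketched in a few lines. This is only an illustration of the relationships stated in the thread (Z0 = 50 ohms, the Fluke's worst-case SWR of 1.05), not a vendor-supplied formula:

```python
# Relate a worst-case SWR spec to impedance bounds and reflection coefficient.
# Assumes a 50 ohm reference impedance, per the thread.
Z0 = 50.0
swr = 1.05  # Fluke's specified worst-case SWR

z_high = Z0 * swr              # highest impedance consistent with the SWR
z_low = Z0 / swr               # lowest impedance consistent with the SWR
gamma = (swr - 1) / (swr + 1)  # worst-case reflection coefficient magnitude

print(f"Z range: {z_low:.1f} to {z_high:.1f} ohms, |Gamma| = {gamma:.4f}")
# -> Z range: 47.6 to 52.5 ohms, |Gamma| = 0.0244
```

This reproduces the 52.5 and 47.6 ohm bounds quoted above.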
Hope this helps,
I am trying to verify that the 5402 meets the published specifications based on the 5402/5406 Specification manual dated June 2008.
1) I am measuring 1 Vpp @ 50 kHz. The published specification is ±(1% of amplitude + 1 mV) = ±0.011 Vpp. In other words, the voltage present at the output terminal, Vo, should be: 0.989 Vpp ≤ Vo ≤ 1.011 Vpp
2) I have three different cases on the Fluke 5790A. Note that the Fluke measures in Vrms. I am including the manufacturer’s spec for the 5790A.
a) 5402 output impedance = 50 ohms; load impedance = “> 1 Mohm”:
The High Impedance input on the Fluke has a spec of ±(51 ppm + 2.0 uV)
Fluke reading = 0.3535 Vrms = 1.000 Vpp
Fluke specification = 20 uVrms
b) 5402 output impedance = 50 ohms; load impedance = “same as output”
The 50 ohm input on the Fluke has a spec of ±(0.4% + 300 uV)
Fluke reading = 0.3494 Vrms = 0.988 Vpp => OUT OF SPEC
Fluke specification = 1.7 mVrms
c) 5402 output impedance = 50 ohms; load impedance = “same as output”
A 50 ohm termination with a value of 50.589 ohms placed on the High Impedance input on the Fluke
Fluke reading = 0.3516 Vrms = 0.994 Vpp
Fluke specification = 20 uVrms
3) See 2c. At DC, the output impedance of the 5402 is 50.497 ohms (as measured on an HP 3458A DMM). I measured several 50 ohm terminations at DC and the closest I could come to that value was 50.589 ohms. I placed the 50.589 ohm termination across the Hi-Z input on the Fluke. I do not know what the impedance is at 50 kHz.
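For what it's worth, the three cases above can be checked mechanically by converting the Fluke's rms readings to Vpp (for a sine, Vpp = 2·√2·Vrms) and comparing against the ±(1% + 1 mV) window. A small sketch, using only the numbers quoted in this post:

```python
import math

# Convert the Fluke's rms readings to Vpp for a sine wave and compare
# against the 1 Vpp +/-(1% of amplitude + 1 mV) = +/-0.011 Vpp window.
def vpp_from_vrms(vrms):
    return vrms * 2 * math.sqrt(2)

set_vpp = 1.0
tol = 0.01 * set_vpp + 0.001           # 0.011 Vpp
lo, hi = set_vpp - tol, set_vpp + tol  # 0.989 .. 1.011 Vpp

cases = [("a) Hi-Z input", 0.3535),
         ("b) 50 ohm input", 0.3494),
         ("c) 50.589 ohm termination", 0.3516)]

for label, vrms in cases:
    vpp = vpp_from_vrms(vrms)
    verdict = "in spec" if lo <= vpp <= hi else "OUT OF SPEC"
    print(f"{label}: {vpp:.3f} Vpp -> {verdict}")
```

Only case b) falls below the 0.989 Vpp limit, matching the results above.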
Testing this at such a low frequency, I don’t see how the SWRs will come into play. Since neither the source nor the load is, or will ever be, a “perfect” 50 ohms, I question the 5402 specification to begin with. Perhaps the published specification for amplitude accuracy only applies to high impedance loads, whereas the flatness specification, for example, is applicable to any source/load impedance.
Thanks for your help,
Well, it sounds like the discrepancy you're seeing could be explained by a stack-up of tolerances. I'm not saying necessarily that is what's happening; just that it could be. Here's how I break it down:
1) 5402 accuracy spec: "±1.0% of amplitude ±1 mV." I'll say right off the top that this is a little vague, because it doesn't tell us whether the amplitude that we're adding 1 mV to is rms, peak, or peak-to-peak. I'm going to take the middle ground and say that it's peak. So the worst case on your 1 Vpp signal, which is a 0.500 Vp signal, would be 0.494 Vp on the low side, or 0.3493 Vrms.
2) 5790A accuracy spec: I'm assuming you're going with the 1 year absolute uncertainty spec for the 700 mV range, which is ±0.4% of reading ±300 uV. If we go with the assumption that all numbers are rms, then I get your number of ±1.7 mVrms.
3) Mismatch loss: Even though this is only 50 kHz, mismatch loss can very well be a factor. It can happen at any frequency, not just frequencies we think of as RF. The added uncertainty is the product of the magnitudes of the reflection coefficients for source and load. Unfortunately, the 5402 has no output SWR or output impedance spec, but we'll use your measurement of 50.497 ohms, which gives a SWR of 1.010 and a reflection coefficient of 0.004945. The 5790A has a specified SWR of 1.005 at 1 kHz, which we'll use at 50 kHz. That gives a reflection coefficient of 0.0025. The good news is that the product of these is very much negligible, at 12.3 ppm, or 4.4 uVrms out of our 0.3535 Vrms. But it's important to go through the exercise of these calculations whenever source and/or load impedances are uncertain.
Anyway, combine the above contributions, and the worst case on the low side comes to 0.3476 Vrms. Your reading of 0.3494 Vrms is within that, albeit only modestly so. I don't have an opinion as to whether that merits a recalibration. If you have another accurate AC instrument at hand (either generator or analyzer), you might compare its answers against the other instruments.
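The stack-up in 1)–3) above can be reproduced numerically. This is just a sketch of the arithmetic in this post, using the measured 50.497 ohm output impedance and the Fluke's SWR spec of 1.005:

```python
import math

# Worst-case low-side stack-up for a 1 Vpp (0.5 Vp) setting.
vp_set = 0.5
gen_low_vp = vp_set * 0.99 - 0.001         # 5402 spec: -1% - 1 mV -> 0.494 Vp
gen_low_rms = gen_low_vp / math.sqrt(2)    # ~0.34931 Vrms

v_nominal_rms = 0.3535
fluke_unc = 0.004 * v_nominal_rms + 300e-6  # 5790A: 0.4% + 300 uV ~= 1.7 mVrms

# Mismatch uncertainty: product of the reflection coefficient magnitudes.
g_src = (50.497 - 50) / (50.497 + 50)       # ~0.004945 (measured 5402 output Z)
g_load = (1.005 - 1) / (1.005 + 1)          # ~0.0025 (Fluke SWR spec of 1.005)
mismatch_unc = g_src * g_load * v_nominal_rms  # ~4.4 uVrms, negligible here

worst_low = gen_low_rms - fluke_unc - mismatch_unc
print(f"worst-case low = {worst_low:.4f} Vrms")  # ~0.3476; 0.3494 is inside
```

The result, 0.3476 Vrms, matches the limit quoted above, and the 0.3494 Vrms reading sits inside it.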
Hope this helps,
Thanks for all your help. I've used the 5790A to measure the same output from an HP 33120A sig gen, an Agilent 33250A sig gen, and Fluke 5800A scope calibrator. I'm confident that the 5790A is performing as expected. I've also measured the 5402 output on a Tek scope and found it to be out of spec. I understand that it really makes no sense to verify a 1% signal generator with a 2% scope. However, that gives me extra confidence that the 5402 is giving me problems, not my measurement system.
I believe I can shed some light on the cause of this issue. The AC accuracy specification is for a high impedance load. I understand that this is difficult to interpret based on how the specifications document was written; I have submitted a request to have it changed so that it is clearer. Based on the data you have provided, it looks like your card is functioning properly. If you would like to further discuss methods to measure and compensate for variances in output and load impedance, you can contact me at firstname.lastname@example.org and we can better address your specific needs. Otherwise, it appears that your card is within limits for a high Z load. I do apologize for this discrepancy and assure you that we are constantly working to improve both our products and documentation.