
Problems getting accurate power readings during sweep

OK, so I have an Agilent N1912A power meter with N1921A power sensors on Ch A and Ch B. RF power (from a signal generator/amplifier) goes into the input of a coupler, with Ch A connected to the coupled port and Ch B connected to the coupler output. I am trying to measure the coupling loss (nominally 20 dB) by simply taking the difference of the power readings (Ch B - Ch A). I have a VI that reads the power from both channels as the signal generator sweeps through different power levels. For the most part it works, except that I get a strange "kink" in the power reading whenever the power sensor reading is around -10 dBm (see the first image attached). Because of this, I get over 0.5 dB of change in the coupling loss I am trying to measure, which is unacceptable.
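Roughly, the VI does the equivalent of this (sketched here in Python/PyVISA for readability since I can't paste the block diagram; the GPIB addresses and the generator command are placeholders, and I'm assuming the meter is set to return dBm):

    import pyvisa

    rm = pyvisa.ResourceManager()
    siggen = rm.open_resource("GPIB0::19::INSTR")  # placeholder address
    meter = rm.open_resource("GPIB0::13::INSTR")   # placeholder address

    loss_db = []
    for i in range(81):                       # -30 to +10 dBm in 0.5 dB steps
        level = -30.0 + 0.5 * i
        siggen.write("POW %.2f DBM" % level)  # placeholder command for my generator
        ch_a = float(meter.query("READ1?"))   # coupled-port sensor (dBm assumed)
        ch_b = float(meter.query("READ2?"))   # coupler-output sensor (dBm assumed)
        loss_db.append(ch_b - ch_a)           # nominally +20 dB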

 

After extensive troubleshooting I have ruled out things like the amplifier having unstable gain or faulty sensors. One interesting thing I have noticed: the problem improves significantly when I make the sweep intervals much smaller (e.g., 0.1 dB instead of 0.5 dB), which makes me think it is something I'm not setting properly in my VI when I read the power from each channel. With 0.1 dB intervals the kink is much smaller, resulting in a more stable measurement of the coupling loss (see the second image attached).

 

Anyway, I'm wondering if anyone has ideas about the source of the issue. As you can see from the images, this always happens at -10 dBm; I don't know what's so special about -10 dBm (where's the shrug emoticon?? :)). I know I can greatly alleviate the problem by always using really small intervals of 0.05 dB or so, but that is just not practical for my application (it would be too slow). I want to be able to use 0.5 dB intervals and still get smooth power readings.

[Attached: plots of the readings/coupling loss vs. power for 0.5 dB and 0.1 dB sweep steps]
Message 1 of 6



So what are you trying to accomplish here?  I would be more interested in a coupler's response over frequency.  Usually if you're talking about gain/loss over power, you are measuring the characteristics of an amplifier.  (Which I understand you aren't doing, as the DUT is the coupler.)

Bill
CLD
(Mid-Level minion.)
My support system ensures that I don't look totally incompetent.
Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.
Humble author of the CLAD Nugget.
Message 2 of 6

Does anything in the system change gain or attenuation settings when you cross the -10 dBm point? For example, suppose the signal generator changes ranges there. If the calibration of one of the ranges is off by a small amount (possibly within specifications but not perfect), it could generate the kind of effect you see.

 

Try putting a 3 dB attenuator (or any other handy value other than 10 dB) in various ports to see if the kink moves with the source or the measurement.

 

Lynn

Message 3 of 6

Thanks for the replies!

 

As for what I'm trying to accomplish in the end: I'm just trying to obtain a calibration offset for when I put a DUT in there (at the output of the coupler). Basically, the loss, or difference, between what the DUT will see (Channel B) and what I will be measuring at the coupled port (Channel A). Then when I do a power sweep on the DUT, I can calculate the power going into the DUT by taking the Channel A reading and adding the offset. Hope that makes sense...
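Put another way, just sketching the arithmetic (the names here are mine, not from the VI):

    def power_into_dut_dbm(ch_a_dbm, offset_db):
        # offset_db comes from the calibration sweep: ch_b - ch_a (~ +20 dB here)
        return ch_a_dbm + offset_db

    # e.g. coupled port reads -12.0 dBm and the measured offset is 19.8 dB:
    print(power_into_dut_dbm(-12.0, 19.8))  # 7.8 dBm into the DUT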

 

Anyway, I too was thinking it could be something in the signal generator, but I tried putting a 6 dB attenuator in there and the kink moves; it always happens when the Channel A input sensor starts to go above -10 dBm. I suspect I don't see this problem with the Channel B input sensor because the power there is always much higher.

 

Again, what I don't get is that if I slow down the sweep measurement in LabVIEW (by putting 2.5 second delays in front of every Channel A power reading, by running the program with "Highlight Execution" on, or by using very small intervals, 0.1 dB instead of 0.5 dB), the problem disappears!! It is so puzzling. Right now I'm using that as a workaround (putting several seconds of wait time in front of each Ch A "READ?" query), but I'm wondering if there is anything else I can do. It will be a bit of a bummer if each power sweep takes over a minute...
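For reference, the workaround boils down to this (again sketched in Python/PyVISA rather than the actual VI; the address is a placeholder, and "READ1?" is the same query the VI already sends):

    import time
    import pyvisa

    rm = pyvisa.ResourceManager()
    meter = rm.open_resource("GPIB0::13::INSTR")  # placeholder address

    def settled_read_dbm(channel, settle_s=2.5):
        # Brute-force settling delay before each query; a triggered read
        # (INIT plus *OPC?) might accomplish the same thing faster.
        time.sleep(settle_s)
        return float(meter.query("READ%d?" % channel))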

 

 

Message 4 of 6

The reason I thought the signal generator was out of the question was that it wouldn't really matter if it was ranging to the next amplifier: the jump would show up on both sensors and the loss would be the same. Could it be that your power meter is autoranging at -10 dBm?
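A quick way to test that: hold both channels in manual range and re-run the sweep. Something like the following; the SCPI here follows the usual Agilent power-meter pattern and is an assumption, so double-check it against the N1912A programming guide:

    import pyvisa

    rm = pyvisa.ResourceManager()
    meter = rm.open_resource("GPIB0::13::INSTR")  # placeholder address

    # Hold each channel in one range so nothing switches near -10 dBm.
    # "RANGe 1" = upper range, "0" = lower is my assumption -- verify it.
    meter.write("SENS1:POW:AC:RANG:AUTO OFF")
    meter.write("SENS1:POW:AC:RANG 1")
    meter.write("SENS2:POW:AC:RANG:AUTO OFF")
    meter.write("SENS2:POW:AC:RANG 1")
    # Re-run the sweep: if the kink at -10 dBm goes away, autorange was it.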

 

What I don't get is why the reading would be so drastically off depending on the size of the power steps. Even if the setup was having major issues, the loss should still be the same since the source is the same. Unless, of course, one of your power heads was not behaving itself. In fact, I remember something about those sensors having funny issues that could be solved by putting an attenuator on the head; something about harmonics being generated by the power head itself.

Bill
CLD
Message 5 of 6
Why don't you attach an image of the block diagram or the actual VI?
Message 6 of 6