Hi everyone,
I'm using an NI USRP-2955 with LabVIEW 2021 and the built-in project code that comes with the USRP driver examples to observe the spectrum received on the four RX channels of the USRP. I have slightly modified the code so that the frequency, gain, and sampling rate of all four RX channels can be retuned at run time.
The orientation of the spectrum that appears after retuning the frequency at run time is random. At times it appears inverted (flipped about the center frequency): for example, when observing a band from 80 MHz to 120 MHz centered at an RF frequency of 100 MHz, a signal actually at 89 MHz shows up at 111 MHz, and the same holds for every other frequency in the band (i.e. f_displayed = 2 x f_center - f_actual).
The IF2 of the TwinRX daughterboard (USRP-2955) is 150 MHz and the digitizer sampling rate is 200 MS/s. This means that after downconversion the spectrum is centered at 150 MHz, which lies in the second Nyquist zone of the digitizer, so after digitization the same spectrum appears at 200 - 150 = 50 MHz but inverted. The DDC implemented in the FPGA (which ships with the niUsrpRio reference example code) should therefore apply an IF frequency shift of -50 MHz to move the spectrum to 0 Hz (a quick numpy sketch of this aliasing is included after the path below). This IF shift value is computed on the host by a DLL called inside the following VI to configure the LO:
C:\Program Files\National Instruments\LabVIEW 2021\instr.lib\niUsrpRio\Config\v1\Host\Public\Configure LO.vi
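
To sanity-check the inversion, here is a minimal numpy sketch (my own illustration, not the NI driver or FPGA code): a real tone near the 150 MHz IF2, sampled at 200 MS/s, aliases into the first Nyquist zone, and an offset above the IF center comes out below the aliased 50 MHz center, i.e. the spectrum is flipped.

    # Second-Nyquist-zone aliasing at fs = 200 MS/s: a real IF tone at
    # 150 MHz +/- delta lands at 50 MHz -/+ delta, i.e. inverted.
    import numpy as np

    fs = 200e6                      # digitizer sampling rate, 200 MS/s
    n = 4096
    t = np.arange(n) / fs

    for f_if in (149e6, 150e6, 151e6):
        x = np.cos(2 * np.pi * f_if * t)                # real IF signal
        spec = np.abs(np.fft.rfft(x * np.hanning(n)))
        f_alias = np.fft.rfftfreq(n, 1 / fs)[np.argmax(spec)]
        print(f"IF tone {f_if/1e6:.0f} MHz -> appears at {f_alias/1e6:.1f} MHz after sampling")

    # Prints roughly: 149 MHz -> 51.0 MHz, 150 MHz -> 50.0 MHz, 151 MHz -> 49.0 MHz,
    # confirming the inversion around the 50 MHz alias of the IF center.
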
This VI takes the desired tune frequency as input and returns the required IF frequency shift for the DDC. For the same tuned RF frequency, the returned shift is sometimes +50 MHz and sometimes -50 MHz. When it is +50 MHz the spectrum is inverted; when it is -50 MHz it appears correctly. I would like to know the reason for the sign change: shouldn't it always be a fixed -50 MHz, as explained above? And why is the behavior random, i.e. if the shift is -50 MHz and I retune to the same RF frequency again, it comes back as +50 MHz, which flips the spectrum? All four RX channels are enabled, and whenever a change in frequency or gain is required, all of them are retuned together.
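
To illustrate what the sign of the shift does, here is another small numpy model (again my own sketch; the actual FPGA DDC's sign convention may differ): the sampled real IF signal has conjugate-symmetric images around +50 MHz and -50 MHz, and the sign of the complex NCO shift selects which of the two images lands at DC. The two images are mirror copies of each other, which would explain why one sign shows the spectrum correctly and the other shows it flipped.

    # The sign of the DDC shift selects which conjugate image moves to DC.
    import numpy as np

    fs = 200e6
    n = 8192
    t = np.arange(n) / fs

    # Real sampled signal: content 1 MHz above the 150 MHz IF2 center has
    # already aliased to 49 MHz (see the sketch above).
    x = np.cos(2 * np.pi * 49e6 * t)

    freqs = np.fft.fftfreq(n, 1 / fs)
    passband = np.abs(freqs) < 25e6     # stand-in for the DDC's low-pass filter

    for shift in (+50e6, -50e6):
        y = x * np.exp(2j * np.pi * shift * t)   # complex NCO mixing in the DDC
        spec = np.abs(np.fft.fft(y * np.hanning(n)))
        f_bb = freqs[passband][np.argmax(spec[passband])]
        print(f"DDC shift {shift/1e6:+.0f} MHz -> tone lands at {f_bb/1e6:+.1f} MHz")

    # One shift sign puts the tone at +1 MHz, the other at -1 MHz: the same
    # RF content lands on opposite sides of DC, i.e. the spectrum is flipped.

My guess is that the random sign is somehow tied to how the LO is placed (high-side vs. low-side injection) on each retune, but that is exactly what I am hoping someone can confirm.
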
Thanks