Hi everyone, my name is Jorge Gomez, and I'm wondering if you can help me with the following problem.
We use a USRP 2954 connected to a NUC i7 with 64 GB of RAM and a 10 Gbps connection through a QNAP adapter to ensure sufficient throughput (please see the attached pictures). We use UHD 3.15, and I applied all the recommended Ethernet tuning (increased the MTU to 9000, etc.).
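For reference, the host tuning we applied looks roughly like the following (the interface name enp1s0 is just an example; the socket-buffer values are the ones commonly recommended for high-rate UHD streaming):

```shell
# Enable jumbo frames on the 10GbE interface (interface name is an example)
sudo ip link set enp1s0 mtu 9000

# Enlarge the kernel socket buffers for high-rate streaming
sudo sysctl -w net.core.rmem_max=33554432
sudo sysctl -w net.core.wmem_max=33554432
```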
USRP 2954
QNAP Adapter for 10Gbps connection
Our goal is to continuously transmit an OFDM signal of 25k samples at 50 MS/s on a 3.5 GHz carrier. We make sure the signal does not saturate the DAC (amplitude set to 0.7), and the USRP gain is set to 20. However, when we use our receiver to recover the signal, it is distorted, as observed in the following plot.
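For context, the settings above boil down to UHD calls like the following (a minimal sketch only; our actual code is in the attached tx_waveform4.cpp, and the device address here is just the default example):

```cpp
// Minimal sketch of the UHD configuration matching the settings above.
// Requires UHD and a connected device; "addr" is an example value.
#include <uhd/usrp/multi_usrp.hpp>

int main()
{
    auto usrp = uhd::usrp::multi_usrp::make("addr=192.168.10.2");
    usrp->set_tx_rate(50e6);                       // 50 MS/s
    usrp->set_tx_freq(uhd::tune_request_t(3.5e9)); // 3.5 GHz carrier
    usrp->set_tx_gain(20.0);                       // gain = 20

    uhd::stream_args_t stream_args("fc32");        // complex<float> samples
    auto tx_stream = usrp->get_tx_stream(stream_args);
    return 0;
}
```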
Captured signal
We do not see any underrun messages when running the code on our NUC controller.
Screenshot of the running code
What we are expecting is a signal similar to this one.
Desired signal
I tried modifying the sampling rate, but it did not help. I noticed that when the signal was reduced from 25k samples to 2,500, it was recovered correctly, so I tried increasing the buffer size, but no improvement was observed (original buffer size: 9960).
// allocate a buffer which we re-use for each channel
if (spb == 0) {
    spb = tx_stream->get_max_num_samps() * 10; // I tried changing this factor from 10 to 100; it did not help at all
}
std::vector<std::complex<float>> buff(spb);
std::vector<std::complex<float>*> buffs(channel_nums.size(), &buff.front());
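For what it's worth, the buffer is refilled by cycling through the precomputed waveform table, roughly like the self-contained toy below (a plain sine stands in for the OFDM samples loaded in wavetable2.hpp; names here are illustrative, not our actual code):

```cpp
#include <cmath>
#include <complex>
#include <cstddef>
#include <vector>

// Build a one-period lookup table (a plain sine stands in for the
// OFDM samples loaded from file in wavetable2.hpp).
std::vector<std::complex<float>> make_table(std::size_t len, float ampl)
{
    const float pi = std::acos(-1.0f);
    std::vector<std::complex<float>> table(len);
    for (std::size_t i = 0; i < len; ++i) {
        float phase = 2.0f * pi * float(i) / float(len);
        table[i] = ampl * std::complex<float>(std::cos(phase), std::sin(phase));
    }
    return table;
}

// Fill the streaming buffer by wrapping around the table, returning the
// advanced index so consecutive buffers continue the waveform seamlessly.
std::size_t fill_buffer(std::vector<std::complex<float>>& buff,
                        const std::vector<std::complex<float>>& table,
                        std::size_t index)
{
    for (auto& samp : buff) {
        samp = table[index];
        index = (index + 1) % table.size();
    }
    return index;
}
```

The persistent index is what keeps the waveform continuous across successive send() calls; if it were reset for every buffer, the transmitted signal would restart every spb samples.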
I have attached the code used to configure the USRP (tx_waveform4.cpp and wavetable2.hpp) and a copy of the signal we use for this experiment.
Any help to solve this issue would be much appreciated.
Cheers.