Hi, I'm new to LabVIEW 2018 and the USRP-2953R (CBX-120).
I'm communicating with the 2953R over 10 Gbps Ethernet, with the TX1/RX1 port directly connected to the RX2 port of RF0. Because I'm not using the PCIe interface, I'm unable to run the RIO examples.
I have been working through the example at https://forums.ni.com/t5/Software-Defined-Radio/Single-USRP-Synchronized-TX-and-RX-Finite/ta-p/35056... trying to get it running on the 2953R and to understand the code.
Straight off, I had to change the carrier frequency to 2 GHz, as the default 915 MHz is out of band for this USRP.
I'm seeking some understanding of:
1. Why is there an underflow error from the TX when the "end of data?" input to the TX Write VI is wired True? When probed, the data input to the TX Write VI was an array of 2000 elements.
2. (This sounds really stupid.) Where is the data for the TX coming from? In the block diagram, it's wired from a cluster named "delay-sinewave-delay", but for the life of me I can't seem to find how/where the sine wave is generated. Or is it a constant that is hard-coded into the VI?
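For anyone else reading along, here is my guess at what a "delay-sinewave-delay" burst is: a block of zero samples, a complex sine tone, then another block of zeros. This is only a sketch in Python/NumPy (I can't paste a LabVIEW block diagram here), and every parameter value below is an assumption for illustration, not taken from the example VI:

```python
import numpy as np

def make_burst(delay_samples=500, tone_samples=1000,
               tone_freq=100e3, sample_rate=1e6, amplitude=0.7):
    """Assumed structure of a 'delay-sinewave-delay' burst:
    zeros | complex baseband tone | zeros. All defaults are guesses."""
    t = np.arange(tone_samples) / sample_rate
    tone = amplitude * np.exp(2j * np.pi * tone_freq * t)  # complex tone
    pad = np.zeros(delay_samples, dtype=complex)           # "delay" = silence
    return np.concatenate([pad, tone, pad])

burst = make_burst()
print(len(burst))  # 500 + 1000 + 500 = 2000 samples
```

With these (assumed) defaults the burst happens to come out at 2000 samples, the same length I saw when probing the TX Write VI input, which is what makes me think the cluster is just describing a padded tone rather than data generated elsewhere.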
Thanks!