I’m currently developing an application where I have to emit single-frequency signals for a given period of time. Let’s say I have to transmit a sinusoidal signal of frequency 21 MHz for 10 seconds. My setup is a LimeSDR Mini V1.2 with up-to-date firmware, and I’m developing in C++ using the LMS API as the interface. I based my code on the basicTX.cpp example that can be found on GitHub.
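For reference, the way I generate the baseband tone follows the same idea as basicTX.cpp: fill an interleaved I/Q float buffer with a complex sinusoid. This is a minimal sketch of that step (the helper name and parameters are my own, not from the example):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical helper (not from basicTX.cpp itself): fills an interleaved
// I/Q float buffer with a complex tone at toneHz, sampled at sampleRateHz.
std::vector<float> make_tone(double toneHz, double sampleRateHz, std::size_t nSamples) {
    const double pi = 3.14159265358979323846;
    const double w = 2.0 * pi * toneHz / sampleRateHz;  // phase step per sample
    std::vector<float> iq(2 * nSamples);
    for (std::size_t n = 0; n < nSamples; ++n) {
        iq[2 * n]     = static_cast<float>(std::cos(w * n));  // I
        iq[2 * n + 1] = static_cast<float>(std::sin(w * n));  // Q
    }
    return iq;
}
```

The resulting buffer is what gets handed to LMS_SendStream in a loop for the transmit duration.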
I could configure it without a problem to transmit at 100 MHz. I could also use the function to set the NCO frequency in order to shift this center frequency by, say, 10 MHz. The output was just beautiful.
My problem started when I set the center frequency to 21 MHz. Without setting the NCO frequency myself, I saw that it was set automatically to -9 MHz, which makes sense to me. But on every run I would get a TX calibration error and no decent output.
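For context, this is roughly the sequence of calls I’m making at this step (a sketch only, assuming the standard LMS API signatures from LimeSuite; device opening, init, and stream setup are omitted):

```cpp
#include <iostream>
#include "lime/LimeSuite.h"

// Sketch only: 'device' is assumed to be an already-opened and initialized
// lms_device_t* (LMS_Open + LMS_Init, as in basicTX.cpp).
void configure_tx_21MHz(lms_device_t* device) {
    // Request a 21 MHz TX center frequency; below the LO minimum the API
    // appears to fall back to an LO + NCO combination internally.
    if (LMS_SetLOFrequency(device, LMS_CH_TX, 0, 21e6) != 0)
        std::cerr << "LMS_SetLOFrequency failed: " << LMS_GetLastErrorMessage() << "\n";

    // Read back the NCO table to see what was chosen (-9 MHz in my runs).
    double nco[16] = {0};
    double pho = 0;
    if (LMS_GetNCOFrequency(device, LMS_CH_TX, 0, nco, &pho) == 0)
        std::cerr << "NCO[0] = " << nco[0] << " Hz\n";

    // This is the call that fails for me when the target is 21 MHz:
    if (LMS_Calibrate(device, LMS_CH_TX, 0, 5e6, 0) != 0)
        std::cerr << "TX calibration failed: " << LMS_GetLastErrorMessage() << "\n";
}
```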
Next I tried setting the center frequency to 36 MHz and the NCO frequency to -15 MHz. This way the program was able to finish the TX calibration successfully, but there was still no output signal, no matter what I did with all the other parameters. The TX gain was also set to 1.
I finally tried observing the output for signals at frequencies in between, without setting any NCO frequency myself. The last frequency that gave me a decent output was around 60 MHz. At 50 MHz the signal was getting deformed and smaller. At 36 MHz it was no longer sinusoidal but looked like a superposition of several components.
Any idea why this is happening? I double-checked all the parameters and the code, and it should be working; the output is just not what I’m expecting. Attached are my outputs @ 100 MHz and @ 36 MHz.
In all the cases mentioned where the signal frequency was set to 21 MHz, the output was a nearly constant 0 V (no output at all).