I’m working on a project where I need the frequency tuned by the LimeSDR (USB) to be known exactly, that is, exact with respect to a 10 MHz reference signal provided on the REF CLK In connector.
I’ve encountered a couple of unexpected hurdles. One of these has a straightforward explanation, the other is mysterious.
10 MHz sample clock isn’t 10 MHz. If I request a sample rate of 10 MHz with LMS_SetSampleRate, and then use LMS_GetSampleRate, the response is not 10 MHz, and indeed measurement with an oscilloscope shows that the sample clock is not quite at the same frequency as the 10 MHz reference. The “problem” applies broadly to round-number sample rates. The reason can be found in LMS7002M.cpp, in SetFrequencyCGEN. In that function iHdiv is chosen to be in the middle of the range where the VCO can lock. However, because the reference clock runs at 30.72 MHz, which contains a factor of 3, you can’t produce clocks of 5, 10, 20 MHz, etc., unless (iHdiv + 1) is a multiple of three. Indeed, modifying the code so that multiples of 3 are preferred (when they are in range) solves this problem.
The second problem is much stranger. It seems that the frequencies produced by the NCO are not exactly those that are advertised.
I expect frequencies given by N * decimation * sample_rate / 2^32, and indeed using LMS_SetNCOFrequency and LMS_GetNCOFrequency gives responses consistent with this. But testing in the real world seems to suggest small errors in the NCO frequency. I’m testing this by connecting an external analog synthesizer to the LimeSDR RX, and measuring the progression of the signal phase over time. This is a little tricky, as you have to carefully choose which frequencies can work. My analog synthesizer tunes integer frequencies. The LimeSDR LO can tune integer frequencies that are multiples of 1875 Hz. If the input and the LO match exactly, it’s necessary to turn off the RX DC corrector to see the signal for more than a few hundred ms.
With the NCO frequency set to 0, the LO and the external synthesizer stay completely in phase (after they’ve both warmed up, anyway). But as soon as I start using the NCO, some strange things happen. With a 20 MHz rf clock, the NCO frequency resolution is about 4.7 mHz (20e6 / 2^32 ≈ 4.66 mHz), and integer frequencies can be tuned every 78,125 Hz. It’s safest to pick frequencies that are multiples of 234,375 Hz, which both the LO and NCO can tune. What happens depends on which FPGA firmware revision I have. I started with an older firmware, rev 12, and got nearly consistent results where the NCO would tune a frequency that was off by 1/2 of the smallest tunable step size (i.e., off by ~2.3 mHz), which, honestly, seemed impossible. It was quite consistent: about 80% of the time I got frequencies that were off by half the smallest tuning increment. The other 20% of the time it wasn’t off.
I upgraded to the most recent firmware, and things are less consistent: the frequency seems to always be off by a few mHz, but the exact amount is different depending on what frequency I try to tune.
I’m working around this right now by doing all the NCO frequency shifting (and filtering) in software, but it would be lovely to figure out why it’s not doing the right thing.