Fast tuning requests leading to some sample flow interruptions

Hello, I posted a topic this morning about some issues when trying to Doppler-correct satellites, but I deleted it since the issues can be mitigated in software and the details are thus irrelevant.

Nevertheless, I have a question for the Lime developers: with Lime hardware I get some very short sample flow interruptions, and artefacts that manifest as audio pops, whenever a device center frequency tuning request is made. This only happens on my Lime hardware (LimeSDR Mini and LimeNET Micro).
Is this a known software issue? I don’t think it’s caused by something in hardware. I’m using the latest liblimesuite and the SoapyLMS7 plugin.

(I’m not a Lime developer, but) it happens with all SDR hardware when you retune the frequency; it is caused by invalid input and by discontinuities in the samples. For simplicity, imagine the samples from a single real ADC are 12, 13, 15, 17, 18 and then you retune the frequency. There is a delay while the PLL(s) lock from the old frequency to the new one (during which time the ADC is not sampling a valid input), and then, due to the change in frequency, the samples could be 128, 130, 131, 133, … The invalid samples and that discontinuity in the time domain can manifest as a temporary rise in the noise floor across all frequencies in the frequency domain, resulting in a loud click or pop.
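To make that concrete, here is a quick numpy sketch (entirely my own illustration, not Lime code; all the numbers are arbitrary) that injects a short stretch of garbage and a frequency/phase jump into a tone, then compares the spectral noise floor against a continuous tone:

```python
# Show how a time-domain discontinuity spreads energy across all
# frequencies, which is heard as a click/pop after demodulation.
import numpy as np

fs = 48000                      # sample rate, Hz (arbitrary for the demo)
n = 4096
t = np.arange(n) / fs

clean = np.sin(2 * np.pi * 1000 * t)          # continuous 1 kHz tone

# Simulate a retune: a short stretch of garbage while the PLL is out of
# lock, then the tone continues at a new frequency and phase.
retuned = clean.copy()
retuned[n // 2 : n // 2 + 64] = np.random.uniform(-1, 1, 64)  # unlocked PLL
retuned[n // 2 + 64 :] = np.sin(2 * np.pi * 3000 * t[n // 2 + 64 :] + 1.3)

for name, x in (("clean", clean), ("retuned", retuned)):
    spectrum = 20 * np.log10(np.abs(np.fft.rfft(x * np.hanning(n))) + 1e-12)
    # The median bin level approximates the noise floor.
    print(f"{name}: noise floor ~ {np.median(spectrum):.1f} dB")
```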

[images: FFT of a valid sine wave vector vs. the same vector with one bad sample]

The sine wave vector is just the values of sine(x), where x steps every 30 degrees until it repeats, i.e. (0.0, 0.5, 0.866, 1, 0.866, 0.5, 0.0, -0.5, -0.866, -1, -0.866, -0.5). In the one-bad-sample vector I changed one value in the valid sine wave and then padded the vector out with 15 more valid sine wave periods (I should have used more). It is not perfect at showing the effect, but it is good enough (you are seeing far too much of the actual signal and not enough noise).
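For reference, here is a small numpy sketch that reconstructs those two vectors as described (my own reconstruction; which sample is corrupted, and to what value, are guesses) and measures the off-bin energy the glitch adds:

```python
# A 12-point sine wave (30-degree steps) tiled to 16 periods, and a copy
# with one corrupted sample, as described above.
import numpy as np

period = np.sin(np.deg2rad(np.arange(0, 360, 30)))   # 12 samples per cycle
clean = np.tile(period, 16)                          # 16 periods total

bad = clean.copy()
bad[3] = -1.0                                        # the one bad sample

for name, x in (("clean", clean), ("one bad sample", bad)):
    mag = np.abs(np.fft.rfft(x))
    tone_bin = 16                    # 16 cycles in 192 samples -> bin 16
    noise = np.delete(mag, [0, tone_bin])
    # Everything outside the tone bin is noise raised by the glitch.
    print(f"{name}: tone {mag[tone_bin]:.1f}, max off-bin {noise.max():.3f}")
```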

What some software does when you tell it to retune is slowly attenuate the incoming samples to zero, zero-pad during the retune, and then, after confirming that all the PLLs (phase-locked loops) have locked to the new frequency and that the ADC is returning valid samples, slowly ramp the signal level back up from zero to avoid any discontinuity.
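A minimal sketch of that fade-out / zero-pad / fade-in idea, assuming the application buffers samples around the retune (the function name, ramp length, and gap length are all my own placeholders):

```python
# Splice sample blocks across a retune without a discontinuity.
import numpy as np

def retune_with_fade(pre, post, ramp_len=256, gap_len=2400):
    """pre/post: 1-D sample blocks captured before/after the retune."""
    fade_out = np.linspace(1.0, 0.0, ramp_len)
    fade_in = np.linspace(0.0, 1.0, ramp_len)
    head = pre.copy()
    head[-ramp_len:] *= fade_out           # attenuate to zero
    gap = np.zeros(gap_len)                # zero-pad while the PLLs lock
    tail = post.copy()
    tail[:ramp_len] *= fade_in             # ramp back up from zero
    return np.concatenate([head, gap, tail])
```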

What you may be able to do with the LimeSDR is retune digitally in the TSP and avoid some of the delay and discontinuity, provided the analogue RX mixer does not need to be retuned to keep tracking the signal.
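Something like the following, assuming SoapyLMS7 exposes the analogue and digital tuning stages as separate "RF" and "BB" tune components (worth verifying with listFrequencies() on your device), would keep the Doppler tracking entirely in the NCO:

```python
# Tune the analogue LO once, then track Doppler digitally (TSP/NCO only).
import SoapySDR
from SoapySDR import SOAPY_SDR_RX

sdr = SoapySDR.Device(dict(driver="lime"))
print(sdr.listFrequencies(SOAPY_SDR_RX, 0))       # e.g. ['RF', 'BB']

sdr.setFrequency(SOAPY_SDR_RX, 0, "RF", 437.5e6)  # analogue LO, set once
# Doppler corrections: small, frequent, digital-only retunes.
for offset in (-9e3, -8.5e3, -8e3):
    sdr.setFrequency(SOAPY_SDR_RX, 0, "BB", offset)
```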


Thanks, this is a very good explanation. However, just checking the numbers and doing some calculations, it seems to me that this effect is rather more pronounced than it should be?
According to the datasheet, the PLL settling time of the LMS7002 is between 50 and 150 microseconds, with 50 being the more typical value. At a sample rate of 48 Msps, that corresponds to a window of roughly 2400 to 7200 samples with abnormal values. I would have expected these samples to be spread out quite a bit in the frequency domain once the FFT has been performed. Furthermore, I would have expected downsampling with filtering to a final sample rate of 8 ksps to reduce the number of erroneous audio samples to something barely observable (even a 150 µs gap is only about one audio sample at 8 ksps).
Instead, the audio artefacts suggest a longer interruption in valid samples, on the order of 1 or 2 milliseconds, long enough to be discernible. That looks like roughly an order of magnitude more than expected.
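For what it's worth, here is the arithmetic spelled out (Python used purely as a calculator; the settling times are the datasheet figures quoted above):

```python
# Expected vs. observed window of bad samples around a retune.
pll_settle_s = (50e-6, 150e-6)        # LMS7002 PLL settling time range
fs_in = 48e6                          # capture sample rate
fs_audio = 8e3                        # final audio sample rate

for t in pll_settle_s:
    bad_in = t * fs_in                # invalid raw input samples
    bad_audio = t * fs_audio          # audio samples affected after decimation
    print(f"{t*1e6:.0f} us -> {bad_in:.0f} raw samples, ~{bad_audio:.1f} audio samples")

# The pops suggest 1-2 ms of bad samples instead:
print(2e-3 * fs_audio, "audio samples for a 2 ms gap")
```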
If my calculations are wrong and this is within a reasonable operation range, it might be nice to have official confirmation.

This does not impact my application scenario much, since I can tune in software and issue tune requests only once every 20-50 seconds, but it is still puzzling to me why the PlutoSDR, for example, exhibits no such discernible demodulation artefacts in the same scenario (observed with a real signal being demodulated while retuning every 100 ms). I'd also add that while software mitigations exist, in some cases tuning requests may be unavoidable, and I can't see this working very well with digital signals. Hopefully this is just an easily fixed software issue.

Is it possible that what I am seeing is the result of the calibration procedure running at every tune request, irrespective of the frequency delta? And if so, is there a way to avoid it through a high-level API like GNU Radio or SoapySDR?