For pretty much anything cellular-related, a highly stable clock is required. The VCTCXO on the LimeSDR products is not sufficient to meet the relevant 3GPP specs, but it should be good enough for practical experiments without having to permanently attach a bulky external OCXO or GPS-DO.
I’m currently looking for best practices on how people perform the actual calibration procedure, and on how they manage the resulting correction value.
So what is the designated procedure for the initial calibration of the clock error? A random LimeSDR mini here shows about 1 kHz of carrier frequency error after LMS_Init(). The hardware has a VCTCXO DAC that can be used to compensate for this error. However:
- what’s the standard software that people use to calibrate the error?
I’m looking for something that can be used, e.g., with an external reference RF generator, or even something like https://github.com/ttsou/kalibrate, which uses a public GSM carrier as reference to calibrate against.
- how and where is such a clock correction value handled by LimeSuite? After all, it would be rather ugly if each and every application program had to maintain some application-specific configuration about where to store the correction value for each given unit.
In the ideal world [tm], I would hope there is some non-volatile storage on each LimeSDR board (like an I2C EEPROM) where the per-unit clock calibration value can be stored. This way, one could swap one LimeSDR board for another and the unit-specific clock correction value would be applied automatically, completely transparently to any application.