Something wrong with calibration procedure

Hello everyone!
This is my first post on this forum. I wrote a C++ application using the SoapySDR API, working with a LimeSDR. I use both channels, 2RX+2TX, at 20 MSps. I transmit a chirp signal from TX1 (TX2 sends zeros). The output of TX1 is attenuated, split in two (using a power splitter) and connected to RX1 and RX2.

When everything is fine, the result is like the one in this picture:

The top right window is the spectrum for RX1 and RX2. The bottom is the time domain for RX1 (real + imag).
However, sometimes after RX calibration the result is like this:

Although it is the RX calibration that causes the problems, the issue seems to be with TX. When I change the TX center frequency it looks like this:

Additionally, RX calibration breaks the DC calibration for TX, and I need to recalibrate TX after RX.
I use BAND1 for TX and LNAL for RX, but the results are reproducible with any configuration.
To perform calibration I deactivate the streams and reactivate them after calibration is complete.
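For reference, the deactivate/calibrate/reactivate order described above can be sketched like this. The functions here are stubs that only record call order, so the sequence can be shown without hardware; on a real device they would be the corresponding stream-deactivate, calibrate, and stream-activate calls of the driver API:

```cpp
#include <string>
#include <vector>

// Stub bookkeeping so the sequence can be shown without hardware.
// On a real device these would be the actual stream-deactivate,
// calibrate and stream-activate calls of the driver API.
static std::vector<std::string> callLog;

void stopStream(int /*ch*/)         { callLog.push_back("stop"); }
void calibrate(bool tx, int /*ch*/) { callLog.push_back(tx ? "calTx" : "calRx"); }
void startStream(int /*ch*/)        { callLog.push_back("start"); }

// Deactivate both streams, calibrate RX then TX (TX is re-done after RX,
// since RX calibration was observed to break the TX DC calibration),
// and only then reactivate the streams.
void recalibrate()
{
    for (int ch = 0; ch < 2; ++ch) stopStream(ch);
    for (int ch = 0; ch < 2; ++ch) calibrate(false, ch); // RX calibration
    for (int ch = 0; ch < 2; ++ch) calibrate(true, ch);  // TX calibration afterwards
    for (int ch = 0; ch < 2; ++ch) startStream(ch);
}
```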

Has anyone else noticed problems with calibration? What could be the cause? How can I overcome this?

Thanks, Rafal


Good job! Now why don't you share the software with us so that we can all test it?

Why not, although be warned: the project is a bit of a mess.

It is a project for Qt Creator, dependent on ALGLIB and FFTW (FFTW can be removed).

The strange thing is that the real and imag amplitudes vary in a very bizarre way. Considering imag and real as separate signals, the frequency on both sides of the DC crossing is the same, but the amplitude is different.

1 Like

I compared all the LMS registers and they are identical. Is there any other configuration that I can compare?

I recreated the test in PothosGui. Can anyone try it out?
Procedure to recreate (try multiple times):
1. Stop button
2. Cal button
3. Start button
4. Connect TX1_1 with RX1_L

When it’s all good it looks like this:

And when it's bad it looks like this:

The spectrum is choppy.

Project file:
test.pothos file


Hi @modimo, just to let you know that we're currently reviewing calibration and there should be an update on this soon. Thanks also for sharing your tests!


Hi @andrewback, any news about the calibration review and update?

Like other users, I'd like to express my interest in it, so I'd really appreciate it if an official, full "LimeSDR Calibration and Test Procedure" were prepared and described.

Thanks to @modimo for the shared work, LimeSDRTest by modimo; it is much appreciated.


Hi @BelmY,

Just pinged @IgnasJ and @Zack for an update.

1 Like

Hello @modimo,

We are checking the calibrateAll() function. Could you explain why you save the LNA, TIA, PGA and PAD settings, then set your own, calibrate, and restore the saved settings? This will break your calibration. If you calibrate for some specific LNA, TIA, PGA, PAD settings, then you cannot change these settings after calibration and expect to stay calibrated.
What are the values of RXDCOffsetMode and TXDCOffsetMode?

1 Like

Well, I don't think changing gains should break the calibration. If you look at the calibration script (the one that stores all calibrated values for every frequency and takes 8 hours), it does not store every possible combination of gains. The corrections are (to the best of my knowledge) only IQ imbalance and leakage corrections plus DC offset calibration. I can't run tests this week, but from what I remember, either the calibration procedure changed my gain settings, or the results were somewhat more repeatable with fixed gain settings.
The best results I obtained were when I removed the calibration cache before initialization of the Lime and performed only one calibration at start. That gave a 50% chance of a successful calibration (without the effect described in previous posts).
I even compared every single register value after a good and a bad calibration, and they were identical.
Next week I will probably get my hands on a brand new LimeSDR and test whether my unit is broken or something.
I don't store gain values in the Pothos example and still reproduce the result.

1 Like

Hi @modimo,

Gain changes put your transceiver chain into an undefined condition. Can you be sure that it is not saturated at these settings? You want to perform DC and IQ calibration at the exact gain settings you are using, since changes in these values can affect the optimum calibration values.

What LimeSDR-USB board version are you using right now?
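The point about gain and calibration validity can be captured in a minimal sketch (hypothetical names, not the LimeSuite API): correction values are computed at one specific gain, and any later gain change leaves the chain running with stale corrections until you calibrate again:

```cpp
// Minimal sketch (hypothetical names, not the LimeSuite API) of the point
// above: DC/IQ correction values are computed for one specific gain setting,
// so any later gain change leaves the chain with stale corrections.
struct ChannelCal {
    int gainDb = 0;
    int calibratedAtGainDb = -1;  // gain at which calibration last ran

    void setGain(int db) { gainDb = db; }        // does NOT update corrections
    void calibrate()     { calibratedAtGainDb = gainDb; }
    bool calibrationValid() const { return gainDb == calibratedAtGainDb; }
};
```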

I don't have access to the board now. I got my board in the second batch from Crowd Supply. I think it's 1.4; I'll check on Monday to be sure.

What kind of radio is the LimeSDR then? If you cannot change the gain, how do I implement AGC? I know from the spectrum shape that the receiver is not saturated. When I go into the saturation zone, a lot of IM products come up.


I have the same issue, or a similar one.
My boards are v1.4s, LimeSuite is the latest git master, running on Ubuntu on a good PC.
We tried several boards and several PCs.

The issue is: without changing anything, several consecutive runs of the same program, with the same parameters, give very different results.

From time to time the calibration is good; from time to time it is actually bad.

Also, we can reach the bad situation in 100% of cases if the .ini file or the gain makes it impossible to calibrate correctly.
Nevertheless, the calibration function always answers "OK".

I use a new R&S analyzer to verify the output (an older Agilent was giving the same results).
My modem generates an LTE signal, no MIMO, with 5 MHz, 10 MHz or 20 MHz bandwidth.

I made a lot of trials with several .ini files, supposedly optimized per frequency band.
EVM is better near 800 MHz and acceptable near 2.6 GHz, but the calibration is not working.

I can confirm that changing the gain makes the issue more frequent, or even makes the system always fail to calibrate.
But the function LMS_Calibrate() always returns 'OK'.

The sequence of code I use:
// note: arguments that got mangled by the forum formatting are restored
// here as placeholder variables (sample_rate, rx_freq, tx_freq, rx_gain, tx_gain)
LMS_LoadConfig(lms_device, <my_.ini_file>);
LMS_SetSampleRate(lms_device, sample_rate, 4); // rate is the LTE standard one, i.e. a division of 30.72 MS/s
LMS_SetLOFrequency(lms_device, LMS_CH_RX, 0, rx_freq); // several freqs tested, 700 MHz - 3600 MHz
LMS_SetLOFrequency(lms_device, LMS_CH_TX, 0, tx_freq); // Tx/Rx freqs are different, gap defined by 3GPP
LMS_SetGaindB(lms_device, LMS_CH_TX, 0, openair0_cfg->tx_gain[0]);
LMS_SetGaindB(lms_device, LMS_CH_RX, 0, rx_gain);
// value is 0..70; we saw the current limitation: it is not actually in dB, but still the range is 0..70
LMS_SetGaindB(lms_device, LMS_CH_TX, 0, tx_gain);
LMS_Calibrate(lms_device, LMS_CH_TX, i, device->openair0_cfg->tx_bw, 0);

rx_stream.channel = 0;
rx_stream.fifoSize = 256*1024;
rx_stream.throughputVsLatency = 0.1;
rx_stream.dataFmt = lms_stream_t::LMS_FMT_I12;
rx_stream.isTx = false;
assertLMS(!LMS_SetupStream(lms_device, &rx_stream), "");

tx_stream.channel = 0;
tx_stream.fifoSize = 256*1024;
tx_stream.throughputVsLatency = 0.1;
tx_stream.dataFmt = lms_stream_t::LMS_FMT_I12;
tx_stream.isTx = true;
LMS_SetupStream(lms_device, &tx_stream);

Typical output of the Lime functions above:

Connecting to device: LimeSDR-USB, media=USB 3.0, module=STREAM, addr=1d50:6108, serial=0009060B00462F14
[INFO] Estimated reference clock 30.7195 MHz
[INFO] Selected reference clock 30.720 MHz
[INFO] LMS7002M cache /home/laurent/.limesuite/LMS7002M_cache_values.db
M=204, N=3, Fvco=1044.480 MHz
16: 00 A8 AA
4760: AA 5A 5D
phase: min 23.8; max 190.6; selected 107.2)
M=204, N=3, Fvco=1044.480 MHz
M=204, N=3, Fvco=1044.480 MHz
16: 66 DC 3D
16: AA 52 D5
phase: min 11.9; max 178.7; selected 95.3)
M=204, N=3, Fvco=1044.480 MHz
Start with samplerate=30720000.000000, tx freq= 2630000000.000000
CGEN: Freq=491.52 MHz, VCO=1.96608 GHz, INT=63, FRAC=0, DIV_OUTCH_CGEN=1
CGEN ICT_VCO_CGEN changed to 31
M=204, N=12, Fvco=1044.480 MHz
16: 55 A5 AA
16: 55 A5 AA
16: 55 AD 2A
16: 55 AD 2A
16: F5 BB 5C
16: F7 DB 5D
19: AB 5A 55
phase: min 95.3; max 243.5; selected 169.4)
M=204, N=12, Fvco=1044.480 MHz
M=204, N=12, Fvco=1044.480 MHz
16: 55 AD 2A
16: 55 AD 2A
16: 55 AD 2A
16: 55 AD 2A
16: 57 AD 3B
16: 77 9D 39
16: 77 DD 3D
16: 66 DC 3D
16: AA 5C 75
16: AA 58 75
31: AA 5A 75
16: AA 5A D5
phase: min 63.5; max 174.7; selected 119.1)
M=204, N=12, Fvco=1044.480 MHz
Call set gain: Tx= 20.000000, Rx= -5.000000
band=30720000.000000 (30720000.000000)
MCU algorithm time: 10 ms
MCU programming : 16384/16384
MCU Programming finished, 740 ms
MCU Ref. clock: 30.72 MHz
MCU algorithm time: 173 ms
MCU algorithm time: 1 ms
MCU Ref. clock: 30.72 MHz
MCU algorithm time: 110 ms
Rx calibration using RSSI INTERNAL ON BOARD loopback
Rx ch.A @ 2510 MHz, BW: 30.72 MHz, RF input: LNAH, PGA: 31, LNA: 15, TIA: 3
Performed by: MCU

MCU algorithm time: 1 ms
Current MCU firmware: 3, DC/IQ calibration full
MCU Ref. clock: 30.72 MHz
MCU algorithm time: 223 ms
Tx calibration using RSSI MCU INTERNAL ON BOARD loopback
Tx ch.A @ 2630 MHz, BW: 30.72 MHz, RF output: BAND2, Gain: 20
Performed by: MCU

MCU algorithm time: 1 ms
Current MCU firmware: 3, DC/IQ calibration full
MCU Ref. clock: 30.72 MHz
MCU algorithm time: 312 ms
init done

Example of outputs
Good initialization result:

Bad initialization result:

Best result, still not perfect, at 2.68 GHz:

LMS7002M is a direct conversion transceiver, hence one of its main drawbacks is IQ imperfection, which depends on a lot of factors, including gain settings. The calibration routine is static in the sense that it optimizes only for the currently active set of settings. It does not predict optimal values at other settings, since most of them are chip and environment dependent and hence hard to predict. It would be possible to calibrate over a range of settings, but the values would then have to be updated according to the new gain (for example) settings by the user or baseband. Since the calibration is done by the internal MCU, which is a slave device, it cannot automatically update the calibration values for different user settings by itself. Hence, for AGC purposes, a batch of calibrations (possibly in fixed steps) should be performed prior to use, and the settings then uploaded dynamically by the baseband.

Please note that not every setting change requires a new calibration procedure. For example, the optimal calibration settings at a PGA gain of 0 dB and 1 dB will be very similar, but not identical, hence a decrease in image or DC level suppression might be observed.
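The batch-calibration idea above could be organized roughly like this (a sketch with hypothetical names, not a LimeSuite API): calibrate at fixed gain steps beforehand, store the resulting correction values, and have the baseband upload the entry nearest to the gain the AGC has just chosen:

```cpp
#include <iterator>
#include <map>

// Sketch (hypothetical structure, not a LimeSuite API): correction values
// obtained by calibrating beforehand at fixed gain steps, looked up by the
// baseband when the AGC changes gain. The table must be non-empty.
struct Corrections { int dcI, dcQ, phase, gainImb; }; // example correction set

using CalTable = std::map<int, Corrections>;          // key: gain in dB

// Pick the calibrated gain step closest to the requested gain.
const Corrections& lookup(const CalTable& table, int gainDb)
{
    auto hi = table.lower_bound(gainDb);
    if (hi == table.begin()) return hi->second;            // below lowest step
    if (hi == table.end())   return std::prev(hi)->second; // above highest step
    auto lo = std::prev(hi);
    return (gainDb - lo->first <= hi->first - gainDb) ? lo->second : hi->second;
}
```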

We are going a little off topic. The main question is why RX calibration breaks the TX signal.
My wild guess would be something bad happening in the transmit interpolator.
Can you guys at LimeSDR reproduce this effect?

I got access to a spectrum analyser today and I can confirm that transmit gets broken after RX calibration.
Correct spectrum (signal transmitted on the LSB from -5 to 0 MHz, LO = 900 MHz). No signal at the USB.

After RX calibration, heavy leakage to the USB and LSB distortion are visible.

Hello @modimo,

We are checking it. There are no issues when we use LimeSuiteGUI to calibrate. It looks like there are some issues in the upper software layers. We will update you later.

I was able to reproduce the problem and have made a fix.
You can check whether it is fixed for you by compiling the latest LimeSuite from GitHub (master branch).

Great job, @IgnasJ! It fixed the problem for me.
Thank you.

1 Like