TX DC calibration not working

I’ve noticed that TX calibration doesn’t include DC offset calibration.
Manual adjustment of the offsets works fine.

Spectrum after automatic calibration (DC I=0 , DC Q=0):

Spectrum after manual calibration (DC I=-39, DC Q=-100):

Is there any way to perform automatic DC calibration?

Hi, Tx calibration does include DC offset calibration. The calibration sets the analog DC offsets, which are located in:
0x05C3[10:0] DCWR_TXAI
0x05C4[10:0] DCWR_TXAQ
0x05C5[10:0] DCWR_TXBI
0x05C6[10:0] DCWR_TXBQ

What you are changing is the digital DC offset:
0x0204[15:8] DCCORI
0x0204[7:0] DCCORQ

Still, I have to adjust manually to remove the DC spike from the spectrum every time.

What causes the need to adjust the digital DC offset, if the calibration should already remove the DC spike with the analog offsets?

I wonder about this too. Oftentimes I cannot remove the DC spike.

Digital offsets are not necessary; the analog offsets are enough. The thing is, you are calibrating at a 2.5 GHz center frequency. The calibration is done using the chip’s internal loopback, and as the frequency goes toward the high end of the spectrum, at some point the path characteristics start to differ between the chip’s internal loopback and the board’s external path. So the calibration works fine, but the conditions are slightly different during the calibration than after it has finished. At these frequencies, running the calibration with an external loopback should yield better results.

Ok, I’ve checked “useExtLoopback” (“PC” method in LimeSuite).
Basically nothing changed.

Still, levels are:
-34 dBm peak with “Onetone” signal without calibration
-60 dBm peak with “zero” signal, after “MCU” calibration
-58 dBm peak with “zero” signal, after “PC” calibration (even worse?)
-92 dBm peak with “zero” signal, after manual DC adjustment (I=-54 Q=74)

Tried to do calibrations with and without “RF loopback ch.A” enabled.

I believe @ricardas meant that you should actually use external RF loopback SMA cable (possibly with at least 20 dB attenuator between TX and RX port) to do this. Is that what you did?

I’ve tried both using on-board RF switch (“RF loopback ch.A”) and external cables with 40 dB attenuator. Same results.

Using the LimeSDR receiver and the LimeSuite FFT, I got these levels:

-35 dBFS - onetone signal
-65 dBFS - zero signal, after “MCU” calibration
-64 dBFS - zero signal, after “PC” calibration
-79 dBFS - zero signal, after manual DC offset calibration (I=-1 , Q=-18)

For the example setup, I:

  1. set TX frequency to 2400.0 MHz
  2. set RX frequency to 2399.9 MHz
  3. set sample rate to 1 MHz
  4. set up gains, antennas, etc. (BAND2, LNAH)
  5. enable on-board (not on-chip!) channel loopback
  6. run FFT
  7. run different signals. The single tone is built into LimeSuite; the zero signal is a 2048-byte file filled with zero bytes.

Observe the level at +100 kHz on the receiver.

An external high-quality spectrum analyzer shows essentially the same spike-to-noise relationship.

Are you running the calibration from your own code?
From the LimeSuite code, I see it’s not using the external loopback when the “PC” method is selected in the GUI.

What are the RxTSP digital RSSI values under these conditions? The calibrations are not using an FFT; they rely on RSSI values, so the measured value is more susceptible to being affected by nearby signals/noise as the measured Tx DC signal level drops close to the noise floor.

The RSSI values are fine.
For example, with SXT=2400.100, SXR=2400.000 (to remove the RX DC influence) I got these results:

Automatic calibration:
DCCMP_TXA = -492
RSSI = 0x00101
peak = -54 dBFS (using LimeSDR FFT viewer)

After manual adjustments (analog offsets this time, not digital):
DCCMP_TXA = -408
RSSI = 0x00049
peak = -72 (using LimeSDR FFT viewer)

RSSI dropped by almost 3×, and the signal is about 20 dB lower.

Every clue hints that something is wrong with the calibration itself.

The “PC” method is using external calibration:

Look at function:
int LMS7002M::CalibrateTx(float_type bandwidth_Hz, bool useExtLoopback)

( https://github.com/myriadrf/LimeSuite/blob/master/src/lms7002m/LMS7002M_RxTxCalibrations.cpp )

Argument “useExtLoopback” is 0 for MCU and 1 for PC.
On-board loopback is automatically switched ON on LimeSDR.

Correct about that function, but currently the GUI uses the API function LMS_Calibrate(), which always passes false for the useExtLoopback parameter.

Isn’t this one used?

Yes, but looking further inside it calls:

which calls the original calibration function, and the flag is interpreted basically just as whether to use the MCU or not; the external loopback parameter is always false.

Anyway, that external loopback calibration was experimental and used with an older chip version, with the measurements done using an FFT on the PC. I’m not sure whether it is still usable with the latest chip version.

I should get the board in a couple of days, and will check out what’s going on in there.

Can we print out the values this function is called with, for example in debug mode?
I have a board near me ready for tests; I can test the results.

As @ricardas said, the external loopback is not used.

But the problem remains.

I’ve created my own calibration procedure, which almost eliminates DC using any loopback.
I’ve been able to get DC down to -110 dBm (as measured by a spectrum analyzer), while manual adjustments give another -10 dB decrease (up to -120 dBm). For now, LimeSDR can’t even measure such a low signal, so going from -110 down to -120 automatically isn’t possible yet (some filtering is required; I’ll do it later). So, in the future, I think I’ll be able to suppress DC to almost undetectable levels.
Haven’t tested and implemented gain and phase calibrations yet.

The point is, the stock calibration doesn’t calibrate properly even though doing so is technically possible, even using the internal loopback.

Another point is that the stock calibration always uses the internal loopback, while technically nothing prevents it from switching to the external on-board one.

Good to hear that you managed to craft your own calibration procedure that actually works :wink: Hope you’ll share it when you finish it, coz frankly, I’ve got a feeling that the present automatic calibration is a mess. For some reason no one from Lime is looking into it, though (it may be a SoapyLMS7 issue - recently it was discovered that it does not make use of some calibration functions, as noted here).


Did you look at the RX calibration routine results in a similar manner? Don’t you have the impression that the situation with RX calibration is close to the TX one? I’m trying to use the LimeSDR RX to measure just an empty carrier level from an external generator, and I can’t succeed because of the instability of the measured level, which I suspect might derive from IQ imbalance after calibration. The DC level isn’t eliminated after RX calibration either, which is why I’m interested in whether you checked RX as well.

Can you share your calibration routine code? Having read quite a number of topics here on calibration problems (including ones about the broken cal caching system) that remain unanswered, it looks like this is not on Lime’s priority list. Which, to be honest, looks very strange to me…