What about IQ imbalance - DC offset calibration?

I don't know …

But I think there is a convention that a certain signal level into a receiver should show a certain S-meter level.

And that level display would be calibrated in the computer software … I think the available software isn't mature enough to have that – maybe SdrRadio – but I have always felt that a signal level reading is somewhat irrelevant: if it can be heard, good. If not, then improve the signal-to-noise ratio or the gain of the antenna.

-73 dBm is equal to S9 below 30 MHz (the HF world)
-93 dBm is equal to S9 on 6M and up.
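To make that convention concrete, here is a rough sketch of the conversion in Python (my own illustration, assuming the usual 6 dB per S-unit and the S9 reference levels above):

# Rough sketch of the S-meter convention: 6 dB per S-unit,
# S9 = -73 dBm below 30 MHz, S9 = -93 dBm on VHF/UHF.
def dbm_to_s_units(dbm, hf=True):
    s9_ref = -73.0 if hf else -93.0       # S9 reference level in dBm
    if dbm >= s9_ref:
        return "S9+{:.0f}dB".format(dbm - s9_ref)
    s = 9 + (dbm - s9_ref) / 6.0          # one S-unit is 6 dB
    return "S{:.1f}".format(max(s, 0))

print(dbm_to_s_units(-73))    # S9 on HF
print(dbm_to_s_units(-121))   # about S1 on HF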

I ran this yesterday after discovering it:

./LimeSuiteCalibrate.py --freqStart 1e6 --freqStop 2e9

and later my cell phones would not register with a known-working SIM! It had worked previously, so I backed out ~/.limesuite/LMS7002M_cache_values.db and created a new one with LimeUtil --make, and then it worked again. This is a stock installation of OpenBTS with only the GSM.Radio.C0 value changed, and running the calibration breaks it.


@cswiger thanks for spotting this! Looks like the LimeSuiteCalibrate method of doing a sweep to pre-populate the DB may now be incompatible / sub-optimal. We’re looking into this and plan to have a verified, supported solution to doing sweeps and advance populating the database soon.

Is there any description of what exactly this calibration procedure performs? It seems that many users here do not understand the goal and the methods involved.

I'm wondering: once the calibration routine is done, are the values used for all three LNA paths? The calibration is done for the internal paths, but what about external parts?

I would think that calibration is needed to compensate for phase/amplitude error from the I/Q transformer down into the ADC/DAC … Would it not be better to have a coaxial jumper from Rx1_H to Tx1_W, etc., to account for all the parts?

When I look at a signal around 3.4 GHz, the display (gqrx) does not make it look like the calibration is being applied, or at least it could be improved upon.

I don't know where the shorting path that is incorporated into the LimeSDR is.

Any news to report on this?

I think sub-optimal is optimistic. But it’s possible that I am not clear on what to expect. Or in other words my expectations are incorrect. But I can only blame myself since no one else set them.

However, after running LimeSuiteCalibrate.py I could no longer rx where I was rxing fine before I ran it. I just deleted ~/.limesuite/LMS7002M_cache_values.db and I guess SoapyLMS7 must have recreated it. Now I can rx again.

Hi @hTo137, hopefully @joshblum and/or @Zack can provide an update.

Are you saying that there is a crash, or that gqrx (or whatever) no longer likes your settings?

If the computer has no I/Q values to correct for … it seems reasonable that it also has a lower CPU load, which would allow an Rx to operate.

I'm still wondering how to manually do an I/Q calibration in LimeSuiteGUI that would be amended to the values created/saved by the automatic I/Q Python program …

The values generated for frequencies above 2 GHz are not all that great and should be improved upon.

Here are the steps to reproduce what I reported:

  1. rm ~/.limesuite/LMS7002M_cache_values.db
  2. rx and see lots of the data I want to see
  3. run LimeSuiteCalibrate.py
  4. rx and see much much less data and sometimes none at all
  5. rm ~/.limesuite/LMS7002M_cache_values.db
  6. rx and see lots of the data I want to see

I don’t understand this.

The lack of I/Q correction values means that your CPU(s) have less work to do … FFT values are not combined in the LimeSDR-USB's FPGA.

What you describe sounds like what is normal – a combination of gain settings and a lack of image rejection (due to no I/Q correction values) – this always lends itself to a higher noise floor, which gives the software more to show on the display.
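If it helps, here is a tiny numpy sketch (nothing LimeSDR-specific, purely an illustration) of how a gain/phase error between I and Q puts a mirror image of a tone on the other side of the centre frequency:

# Illustration only: a ~5% gain and ~3 degree phase error between I and Q
# produces an image tone mirrored about 0 Hz, roughly 29 dB below the wanted tone.
import numpy as np

fs, f0, n = 1e6, 100e3, 4096
t = np.arange(n) / fs
i = np.cos(2 * np.pi * f0 * t)                   # ideal I
q = 1.05 * np.sin(2 * np.pi * f0 * t + 0.05)     # Q with gain and phase error
x = (i + 1j * q) * np.hanning(n)

spec = 20 * np.log10(np.abs(np.fft.fftshift(np.fft.fft(x))) + 1e-12)
freqs = np.fft.fftshift(np.fft.fftfreq(n, 1 / fs))

wanted = spec[np.argmin(np.abs(freqs - f0))]     # tone at +100 kHz
image = spec[np.argmin(np.abs(freqs + f0))]      # image at -100 kHz
print("image rejection ~ {:.1f} dB".format(wanted - image))

With correction applied those image products drop well below the noise, which is part of why a calibrated display can look "emptier" than an uncalibrated one.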

I hope you are using shielding for your Lime and a BPF for the range you are receiving in …

The Lime ADC has a ~70 dB theoretical dynamic sampling range … the analogue gain is best adjusted to maximise this … you can get a lot of false signals (images, or a mixture of shielding/LO harmonics) with no input filtering, no shielding, or out-of-range gain settings.
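For what it's worth, that ~70 dB figure lines up with what you'd expect from the LMS7002M's 12-bit converters; a quick back-of-the-envelope check (my own arithmetic, not a measured number):

# Ideal quantisation SNR for an N-bit converter: 6.02*N + 1.76 dB.
# For a 12-bit ADC/DAC this gives ~74 dB; real-world dynamic range is a bit less.
bits = 12
ideal_snr_db = 6.02 * bits + 1.76
print("ideal {}-bit SNR ~ {:.1f} dB".format(bits, ideal_snr_db))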

@Kc7noa

What you're saying then is that LimeSuiteCalibrate.py provides I/Q correction values and stores them in the SQLite DB?

Those values are then used by the LimeSDR and those corrections are making it so that I can no longer rx because the LimeSDR is now more finely tuned?

But if I had a BPF (Band Pass Filter) and better shielding I would probably be able to rx after calibration?

Somewhere … that is used … at least it looks as though gqrx/SoapySDR does …

I don't think you need shielding as much as a band-pass filter or a low-pass – it's hard to decipher from my location whether you have FM station mixing or AM broadcast … or a combination of both …

AND … I can't discern whether you're getting some configuration issue that's being caught and mitigated in an unknown way … it helps a lot to run gqrx from a terminal so you can see most of the communication between gqrx and SoapySDR …

So I'm 75% confident that with a filter for the frequency you're trying to Rx – and adjustments of the 3 gain levels – you should achieve your goal.

As far as I can tell, we're not even talking about the same thing.

I experienced a similar problem. I removed ~/.limesuite/LMS7002M_cache_values.db and it reappeared when I ran GRC. I guess a new one is created by SoapyLMS7 if it doesn't exist.

So are you saying that GRC created a default corrections table (with no corrections)?

and adjustments of the 3 gain levels

Any docs on adjusting the 3 gain levels? What are the acceptable ranges? What are you looking for when adjusting each of the 3?

Also, I think I am seeing RX quality issues with my OAI install. I see it sourcing the calibration db.

Connecting to device: LimeSDR-USB, media=USB 3.0, module=STREAM, addr=1d50:6108, serial=0009060B00472726
[INFO] LMS7002M cache /home/enb/.limesuite/LMS7002M_cache_values.db

Should I have run the calibration script? I am using a BPF/duplexer for the band I'm using.

Not that I'm aware of …

LNA gain is somewhat dependent on frequency … below 10 MHz it's best kept to 3 dB or less, while up around 3 GHz I almost max out the LNA … TIA around 12 dB (adjust by sight of the noise and the noise floor – same with the LNA) … the last stage is pretty much a digital gain, so set it wherever you want to place the min/max within the ~70 dB dynamic range of the ADC/DAC …

I try to adjust all 3 to the point where each one barely raises the noise floor …
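If you want to experiment with the three stages from a script rather than the gqrx sliders, something along these lines should work with the SoapySDR Python bindings (a sketch only – "LNA", "TIA" and "PGA" are the element names the SoapyLMS7 driver exposes as far as I know, and the values just follow the rule of thumb above, not calibrated figures):

# Sketch: set the three RX gain stages individually through SoapySDR.
import SoapySDR
from SoapySDR import SOAPY_SDR_RX

sdr = SoapySDR.Device(dict(driver="lime"))
sdr.setFrequency(SOAPY_SDR_RX, 0, 3.4e9)

print(sdr.listGains(SOAPY_SDR_RX, 0))       # confirm the element names on your install

sdr.setGain(SOAPY_SDR_RX, 0, "LNA", 30.0)   # near max up at 3.4 GHz, much lower on HF
sdr.setGain(SOAPY_SDR_RX, 0, "TIA", 12.0)   # adjust while watching the noise floor
sdr.setGain(SOAPY_SDR_RX, 0, "PGA", 0.0)    # final trim for the ADC's dynamic range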


Not sure what OAI is …

The I/Q calibration will lower the observed noise floor … but not by much … with uncalibrated I/Q there are image signals – the 90°/270° outcomes of the FFT – that also get shown, added to or subtracted from the 0°/180° outcome of the FFT …
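To put a rough number on that, here is the standard relation between gain/phase error and image rejection (again just an illustration, not something read back from the board):

# Image-rejection ratio for a gain error g (linear) and phase error phi (radians).
import math

def image_rejection_db(g, phi):
    num = 1 + 2 * g * math.cos(phi) + g * g
    den = 1 - 2 * g * math.cos(phi) + g * g
    return 10 * math.log10(num / den)

print("{:.1f} dB".format(image_rejection_db(1.05, math.radians(3))))   # ~29 dB for 5% / 3 deg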

I have noticed that there are large spikes at certain intervals … like when I have my system set for 250000 samples/sec there are spikes at 6 MHz, 12 MHz, 18 MHz … on up … and when I change the sample rate to 3000000 they move elsewhere … not sure if it's a bug, but I need to investigate more and find out whether it's something with gqrx or SoapySDR – I don't have a working Windows environment …

So … how does someone do a manual calibration at a single frequency?