But I think there is a convention that a certain signal level into a receiver should show a certain VU meter level.
And that level display would be calibrated in computer software … I think that the available software is not mature enough to have that – maybe SdrRadio, but I have always felt that a signal level is somewhat irrelevant. If it can be heard, good. If not, then improve the signal-to-noise ratio or the gain of the antenna.
And later my cell phones would not register with a known-working SIM! It worked previously, so I backed out ~/.limesuite/LMS7002M_cache_values.db, created a new one with LimeUtil --make, and then it worked again. Stock installation of OpenBTS, only changed the GSM.Radio.C0 value, and Calibrate breaks it.
@cswiger thanks for spotting this! It looks like the LimeSuiteCalibrate method of doing a sweep to pre-populate the DB may now be incompatible / sub-optimal. We’re looking into this and plan to have a verified, supported solution for doing sweeps and pre-populating the database soon.
Is there any description of what exactly this calibration procedure performs? It seems that many users here do not understand the goal and methods involved.
I’m wondering: once the calibration routine is done, are the values used for all three LNA paths? … The calibration is done for internal paths, but what about external parts?
I would think that calibration is needed to compensate for phase/amplitude error from the I/Q transformer down into the ADC/DAC … Would it not be better to have a coaxial jumper from Rx1_H to Tx1_W etc. to account for all the parts?
When I look at a signal around 3.4 GHz, the display (gqrx) does not look like the calibration is being applied, or it could be improved upon.
I don’t know where the shorting path that is incorporated into the LimeSDR is.
I think sub-optimal is optimistic. But it’s possible that I am not clear on what to expect. Or in other words my expectations are incorrect. But I can only blame myself since no one else set them.
However, after running LimeSuiteCalibrate.py I could no longer rx where I was rxing fine before I ran it. I just deleted the ~/.limesuite/LMS7002M_cache_values.db and I guess soapylms must have recreated it. Now I can rx again.
I’m still wondering how to manually do an I/Q calibration in LimeSuiteGUI that would be amended to the values created/saved by the automatic I/Q Python program …
The values generated for frequencies above 2 GHz are not all that great and should be improved upon.
The lack of I/Q correction values means that your CPU(s) have less work to do … FFT values are not combined in the LimeSDR-USB’s FPGA.
What you describe sounds like what is normal – a combination of gain settings and a lack of image rejection (due to no I/Q correction values) – this always lends itself to a higher noise floor, which gives the software more to show on the display.
I hope you are using shielding for your Lime and a BPF for the area of RX …
The Lime ADC has a ~70 dB theoretical dynamic sampling range … the analogue gain is best adjusted to maximise this … you can get a lot of false signals (images, or mixing products from poor shielding / LO harmonics) with no input filtering, no shielding, or out-of-range gain settings.
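For reference, that ~70 dB figure is close to the textbook dynamic range of an ideal 12-bit converter (the LMS7002M uses 12-bit ADCs/DACs). A quick sketch of the math:

```python
# Ideal N-bit ADC signal-to-quantisation-noise ratio: SNR = 6.02*N + 1.76 dB.
# For 12-bit converters this gives 74 dB; real hardware lands a few dB
# below that, in line with the ~70 dB figure quoted above.
def ideal_adc_snr_db(bits: int) -> float:
    return 6.02 * bits + 1.76

print(round(ideal_adc_snr_db(12), 2))  # 74.0
```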
Somewhere … that is used … at least it looks as though gqrx/soapysdr does …
I don’t think you need shielding as much as a bandpass filter or a low-pass – hard to decipher from my location whether you have FM stations mixing or AM broadcast … or a combination of both …
AND … I can’t discern if you’re getting some configuration issue that’s being caught and mitigated in an unknown way … it helps a lot to run gqrx from a terminal so you can see most of the communication between gqrx and soapysdr …
So I’m 75% confident that with a filter for the frequency you’re trying to RX – and adjustments of the three gain levels – you should achieve your goal.
I experienced a similar problem. I removed ~/.limesuite/LMS7002M_cache_values.db and it reappeared when I ran GRC. I guess a new one is created by SoapyLMS7 if it doesn’t exist.
LNA gain is kinda dependent on frequency … below 10 MHz it is best kept to 3 dB or less; up around 3 GHz I almost max the LNA … TIA around 12 dB (adjust by sight of the noise and noise floor – same with the LNA); then the last is pretty much a digital gain, so use it anywhere you want to adjust the min/max for the 70 dB dynamic range of the ADC/DAC …
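To make those starting points concrete, here is a minimal sketch. The element names ("LNA", "TIA", "PGA") are what SoapyLMS7 exposes, but the dB ranges hard-coded below are assumptions; on real hardware you would query them with getGainRange() rather than trusting these numbers:

```python
# Assumed per-element RX gain ranges (dB) for the LMS7002M chain.
# These are illustrative values, not authoritative; query
# sdr.getGainRange() on real hardware.
GAIN_RANGES = {"LNA": (0.0, 30.0), "TIA": (0.0, 12.0), "PGA": (-12.0, 19.0)}

def clamp_gains(requested):
    """Clamp each requested gain (dB) into its element's allowed range."""
    clamped = {}
    for name, value in requested.items():
        lo, hi = GAIN_RANGES[name]
        clamped[name] = min(max(float(value), lo), hi)
    return clamped

# Below-10 MHz starting point from the advice above: LNA low, TIA ~12 dB,
# PGA wherever it barely raises the noise floor (25 gets clamped to 19).
print(clamp_gains({"LNA": 3, "TIA": 12, "PGA": 25}))
```

On actual hardware the clamped values would then be applied per element via SoapySDR, e.g. sdr.setGain(SOAPY_SDR_RX, 0, "TIA", 12.0).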
I try to adjust all three to a point where each barely raises the noise floor …
Not sure what AIO is …
The I/Q calibration will lower the observed noise floor … but not by much … with uncalibrated I/Q there are image signals – the 90°/270° outcomes of the FFT – that are also shown, and get added to or subtracted from the FFT of the 0°/180° outcome …
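A minimal NumPy sketch of that effect (just the math, nothing Lime-specific): give Q a small gain and phase error relative to I, and the FFT grows a mirror "image" of the tone on the opposite side of 0 Hz, which is exactly what the I/Q correction values are there to suppress.

```python
import numpy as np

fs, n = 1_000_000, 4096          # sample rate (Hz) and FFT size
bin_idx = 256
f = fs * bin_idx / n             # place the tone exactly on an FFT bin
t = np.arange(n) / fs

# Ideal I/Q would be cos + j*sin, a tone at +f only. Here Q carries a
# 10% gain error and 5 degrees of phase error, which leaks energy into
# an image at -f.
i = np.cos(2 * np.pi * f * t)
q = 1.10 * np.sin(2 * np.pi * f * t + np.deg2rad(5.0))
spectrum = np.abs(np.fft.fft(i + 1j * q))

tone = spectrum[bin_idx]         # the wanted tone at +f
image = spectrum[n - bin_idx]    # the unwanted image at -f
print(f"image rejection: {20 * np.log10(tone / image):.1f} dB")
```

With those error values the image sits only a couple of tens of dB below the tone, i.e. easily visible on a waterfall; calibration drives it much further down.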
I have noticed that there are large spikes at certain intervals … like when I have my system set for 250000 samples per second there are spikes at 6 MHz, 12 MHz, 18 … on up … and if I change the sample rate to 3000000 they move elsewhere … not sure if it’s a bug, but I need to investigate more and try to find out if it’s something with gqrx or soapysdr – I don’t have a working Windows environment …