After getting my LimeSDR, as a first step I experimented a little with the Lime Suite GUI using the “LimeSDR-USB Quick Test” document. When generating a test signal as described in Section 5, the signal looks completely uncalibrated, with a large carrier and image (just like the picture in the document). So I tried TX calibration with the MCU from the calibration tab, with no improvement. Each calibration was reported as successful and the TX gain, phase and DC corrections were updated, but the results showed no big improvement. Manually, I managed to get very good carrier and image rejection, but after pressing TX calibration it was as bad as before.
In the figure below I used the loopback test to illustrate the effect. The red signal was shifted by 3 MHz in the TxTSP tab in order to make the image and the carrier visible; the image rejection is only about 25 dB. In the blue signal the image and carrier overlap with the intended signal and create self-interference. In addition, there is coupling between the two signals that creates even more interference (at least via the loopback path): between +3 and +5 MHz the coupling of the red signal into the blue signal raises the noise floor of the blue signal.
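For reference, an image only ~25 dB down is consistent with a small residual gain/phase imbalance. A quick back-of-envelope sketch (generic narrowband IQ-imbalance theory, nothing LimeSDR-specific) using the standard image rejection ratio formula for a gain ratio g = 1 + gain_err and quadrature error phi:

```python
import numpy as np

def irr_db(gain_err, phase_deg):
    """Image rejection ratio (dB) for a gain error (e.g. 0.05 = 5 %) and a
    quadrature phase error in degrees, standard narrowband formula."""
    g = 1.0 + gain_err
    phi = np.deg2rad(phase_deg)
    return 10.0 * np.log10((1 + 2 * g * np.cos(phi) + g ** 2)
                           / (1 - 2 * g * np.cos(phi) + g ** 2))

print(f"{irr_db(0.05, 3.0):.1f} dB")  # a 5 % / 3 degree error gives ~29 dB
```

So an image rejection around 25 dB corresponds to an imbalance on the order of a few percent in gain or a few degrees in phase, which is exactly what the TX gain/phase/DC correctors are supposed to remove.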
RX calibration resulted in “TuneVCO(SXR) – VCO too high”. Calibrate All worked, which puzzles me, since I assume Calibrate All includes the RX calibration.
Using calibration with external loopback resulted in the message “external loopback calibration requires ENABLE_CALIBRATION USING FFT”.
In the options of the Lime Suite GUI it is possible to switch “Cache Calibration Values” on and off, but what exactly does this do?
Regarding calibration there are a lot of open questions, and I would expect much more support from Lime here. It is clearly not effective if we all try to reverse engineer the calibration by reading code and observing the effects of the various switches in the Lime Suite GUI. In addition, the MCU code is probably not available at all.
The main questions are the following:
Is there any documentation about the implementation of the DC offset and IQ imbalance calibration, apart from the one page without much detail in the LMS7002M data sheet?
Do we need to enable special settings (loopback, LO offset, test signal) before pushing the calibrate button, or does the MCU take care of that?
Does the caching of calibration values work? Are the values stored so that they survive a power-off/power-on cycle? How many calibrations can be stored in the cache? Is there a program for making an initial calibration and storing calibrations in the cache at regular frequency intervals? Can I store manual calibrations in the cache? And what happens if I set an LO frequency that has not been calibrated yet: TX without calibration, a new calibration, or interpolation between available calibrations?
How do I “ENABLE_CALIBRATION USING FFT”? Is it possible to achieve here the 50–90 dB image rejection that is state of the art?
Is the use of calibration transparent for external programs like SDR Console, Pothos, GNU Radio and GQRX? Do they need extra code, or do they not use calibration at all?
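On the last question: as far as I understand, an application is not forced to use the chip’s hardware correctors at all; it could also remove a known imbalance in software on the samples themselves. A minimal sketch of that idea (generic DSP with a simple gain/phase imbalance model, not any particular program’s actual implementation):

```python
import numpy as np

def correct_iq(x, gain_err, phase_err):
    """Undo an imbalance of the form q = (1+gain_err)*sin(wt+phase_err)
    while i = cos(wt), given known error values."""
    i, q = x.real, x.imag
    q = (q / (1.0 + gain_err) - i * np.sin(phase_err)) / np.cos(phase_err)
    return i + 1j * q

def image_rejection_db(x, fs, f0):
    """Peak-to-image ratio (dB) of a tone at +f0, from a windowed FFT."""
    spec = np.abs(np.fft.fft(x * np.hanning(len(x))))
    freqs = np.fft.fftfreq(len(x), 1.0 / fs)
    tone = spec[np.argmin(np.abs(freqs - f0))]
    image = spec[np.argmin(np.abs(freqs + f0))]
    return 20.0 * np.log10(tone / image)

# Demo on a synthetic tone with a 5 % gain and 3 degree phase error:
fs, f0, n = 10e6, 1e6, 4096
t = np.arange(n) / fs
g, p = 0.05, np.deg2rad(3.0)
bad = np.cos(2 * np.pi * f0 * t) + 1j * (1 + g) * np.sin(2 * np.pi * f0 * t + p)
fixed = correct_iq(bad, g, p)
print(image_rejection_db(bad, fs, f0), image_rejection_db(fixed, fs, f0))
```

Whether the programs above do anything like this, or rely purely on the LMS7002M’s correctors, is exactly what I would like to know.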
If anybody can give some answers I would appreciate it!
Thanks Andrew
No, I have read most of the available documentation but I don’t remember reading about this. I will have a look at it. If it is essential for generating TX signals with better quality, then maybe it should be mentioned in the LimeSDR-USB Quick Test document.
I just received my LimeSDR and still have to set everything up, but I’m collecting some info to try and start in the best possible way.
I will mostly run FFTs for amateur radio astronomy usage and I will certainly use LimeSuiteCalibrate before starting.
Is it necessary to purchase a calibrated noise source to ensure the FFT output values are meaningful, or is there simply no need for calibration from this point of view?
I’m not sure what you mean … I think the program is calibrating the I/Q values for image rejection on both RX and TX. The TX path is internally coupled to the RX path, and then the calibration is done.
Could someone please shed some light on all the questions asked in the first post? For example, question 5 about how these calibration settings are persisted and used by other programs like PothosSDR.
6. How can calibration be performed in Windows? Is that script supported, and are calibration settings persisted so that Pothos and SDR Console can use them afterwards?
I believe I can say something about this. I have managed to start the script on Windows 10. It seems to require Python 2.7; with Python 3.5 I was getting module import errors.
In addition, I was missing scipy in my Python 2.7 distribution, so I installed it using pip and a scipy .whl file available for download from the internet.
Then I ran “python .\LimeSuiteCalibrate.py --freqStart 500000 --freqStop 3800000000” and the script is working now. I have not checked in detail what it does, but it seems to scan the whole frequency range in 500 kHz steps. It has been running for 30 minutes and is up to 500 MHz now, so I think it will run for hours.
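For anyone wondering how long to expect, a quick extrapolation from those numbers (assuming a fixed 500 kHz step and a constant time per step, which may well not hold in practice):

```python
# Assumptions: 500 kHz step across the whole sweep, constant time per step,
# extrapolating from the observed progress of 500 MHz in 30 minutes.
start, stop, step = 500e3, 3800e6, 500e3
total_steps = (stop - start) / step + 1   # 7600 calibration points
done_steps = (500e6 - start) / step + 1   # points reached so far ...
rate = done_steps / 30.0                  # ... in 30 minutes
hours = total_steps / rate / 60.0
print(f"{total_steps:.0f} steps, roughly {hours:.1f} h for the full sweep")
# -> 7600 steps, roughly 3.8 h for the full sweep
```

So “hours” indeed: close to four of them at that rate.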
I’m afraid that I didn’t … but I did see images before, and now I don’t …
I wasn’t really looking for them before, and I’m still not “LOOKING” for them, since I’m used to ignoring them with my UHFSDR. I keep changing things with my UHFSDR and upsetting its I/Q balance. I can’t change any internal stuff with the LimeSDR, so it should stay put. It might be possible to make the image rejection better in code … I have heard of others getting better single-frequency image rejection by doing a manual adjustment … I can’t say I know how to do that in the software; I have not looked for it.
Hi @M0GLO
Thank you for sharing the link, I will certainly have a look.
I’m always looking around for cheaper alternatives and I’ve collected a small set of circuit diagrams from various sources/books.
All of them actually need calibration.
This is one from the book “Elementi di tecnica radioastronomica”, 1977, by Gianfranco Sinigaglia I4BBE:
Hi @Kc7noa
Sorry, I’m not an expert in this field. I realize that this thread is about calibration of the I/Q imbalance only, so I was hoping to broaden it to “signal level calibration” as well, if needed.
Let me try asking in a different way: if I compared a waterfall with dB levels coming from the LimeSDR against the waterfall of a commercial receiver (ICOM, YAESU, …), could I expect the same dB levels to be shown?
Or should I perform some calibration, setting an offset value in the LimeSDR settings?
If this is the case, since I don’t have another radio with a calibrated output available, I should probably consider calibration using a calibrated noise source at the antenna input of the LimeSDR.
Am I understanding this correctly ?
Does the LimeSDR have some internal calibration source for this purpose?
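In case it helps frame the question: FFT values computed from the raw samples are inherently relative (dBFS, dB below the converter’s full scale), and turning them into absolute power needs exactly one known reference level on the same settings, such as a calibrated noise source. A small sketch of the idea (generic DSP with made-up reference numbers, nothing LimeSDR-specific):

```python
import numpy as np

fs, n = 1e6, 1000
t = np.arange(n) / fs
x = 0.1 * np.exp(2j * np.pi * 100e3 * t)   # tone at 10 % of full scale

win = np.hanning(n)
spec = np.abs(np.fft.fft(x * win)) / np.sum(win)  # coherent-gain normalized
p_dbfs = 20.0 * np.log10(np.max(spec))
print(f"tone level: {p_dbfs:.1f} dBFS")           # -> -20.0 dBFS

# With one reference of known absolute power p_ref_dbm that reads
# p_ref_dbfs on the same gain/bandwidth settings, any other reading
# converts as: p_dbm = p_dbfs + (p_ref_dbm - p_ref_dbfs)
p_ref_dbm, p_ref_dbfs = -50.0, -30.0              # made-up reference point
print(f"absolute: {p_dbfs + (p_ref_dbm - p_ref_dbfs):.1f} dBm")  # -> -40.0 dBm
```

The relative shape of the spectrum is fine without any of this; only the absolute dBm labels need the reference, which is why a calibrated noise source (or any source of known level) is the usual answer.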
Thank you for pointing me in the right direction,
mario