Output power LimeSDR

Hi Karolis,

I am new to LimeSDR. I was wondering if you could guide me as to how you created the max output power graph in the above thread? Do you have formulas in Excel that you could share?

I am trying to understand what the max output power is for the LimeSDR. Using an Anritsu spectrum analyzer I have been able to receive a -33 dBm signal. I saw that other users using an analyzer received about the same power, -35 dBm in their case (from googling this topic). I know the specs say 0 dBm and other docs +10 dBm, but I don’t think those are achievable without an amplifier and high-gain antennas. Could you share the *.ini file, if you have one, that you use to configure LimeSuite GUI, if that is the tool you used to transmit the signals recorded in the graph?

Thank you in advance for any feedback you may provide,

Joel

Hello Joel,

To measure the max output power of the LimeSDR, please use the state file (provided in the link below) with LimeSuiteGUI.

TX output power measurement state file

Please note:

  1. The state file is configured for channel A operation only;
  2. You have to change the active output band selection in the TRF tab (enable either Band 1 or Band 2);
  3. When you load the .ini file, please calibrate all PLLs (press Calculate in the CLKGEN, SXR and SXT tabs);
  4. When you load the .ini file, please go to the TxTSP tab (with channel A selected), write “7fff” in the field called “DC_REG:” and press “Load to DC I” and “Load to DC Q”.

You can then get the frequency response by sweeping the SXT value in the desired range.
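
For anyone who prefers to script this rather than click through the GUI, the same idea maps roughly onto the LimeSuite C API as in the sketch below. The .ini path is a placeholder, and LMS_SetTestSignal with a full-scale DC value is used here as an approximate stand-in for the manual DC_REG / "Load to DC I/Q" step.

```cpp
// Sketch only: the GUI procedure above, expressed with the LimeSuite C API.
// "tx_power_measurement.ini" is a placeholder path for the state file.
#include <cstdio>
#include "lime/LimeSuite.h"

int main()
{
    lms_device_t *dev = nullptr;
    if (LMS_Open(&dev, nullptr, nullptr) != 0)   // open the first available board
        return 1;

    // Load the provided register state (channel A only, band selected in TRF).
    LMS_LoadConfig(dev, "tx_power_measurement.ini");

    // Full-scale DC into the TX test-signal generator - roughly what writing
    // "7fff" to DC_REG and pressing "Load to DC I" / "Load to DC Q" does.
    LMS_SetTestSignal(dev, LMS_CH_TX, 0, LMS_TESTSIG_DC, 0x7FFF, 0x7FFF);

    // Sweep the TX LO (the SXT frequency) and note the CW level on the analyzer.
    for (double freq = 500e6; freq <= 3500e6; freq += 100e6)
    {
        LMS_SetLOFrequency(dev, LMS_CH_TX, 0, freq);
        printf("LO at %.0f MHz - record the analyzer reading\n", freq / 1e6);
    }

    LMS_Close(dev);
    return 0;
}
```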

Hope this helps,
Karolis

Hi Karolis,

Thank you so much!! I was able to get really close to your transmit output power chart, but I am baffled by an increasing delta between your chart and mine in the received power as frequency decreases. I put your output power series in Excel for TX port 1 and compared it to mine; the results are below:

My setup is the LimeSDR with your .ini file; I made some minor tweaks to increase the power received by a few dB on my Anritsu spectrum analyzer. The tweaks were minor in that I was only able to gain about 2 dB over your .ini file, so I got slightly closer to your received power. I changed things like the “Bias at gate of mixer” value to 31 in the TRF tab in LimeSuite.

I have a 25 ft LMR-400 cable connected from the LimeSDR to the Anritsu analyzer, so about 1 dB of cable loss. There is a ruggedized adapter at the input port of the analyzer with about 0.1 dB of loss, plus an SMA-to-N converter from the LimeSDR to the LMR cable, and then I gave myself an extra dB of system loss, so about 3 dB in all. The LMR cable is connected directly to the input port of the analyzer. I had to do this because previously I was transmitting with a telescopic antenna, which had some gain; with it set close to the input port of the analyzer I was getting some distortion from the magnet at the bottom of the antenna, plus free-space loss and other variables I had not factored in, so I just didn’t trust the -7 dBm max power I was getting.
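
As a sanity check on that budget, here is a trivial sketch of adding the assumed losses back onto the analyzer reading; the individual figures are the rough ones quoted above and the analyzer reading is just an example value.

```cpp
// Trivial loss-budget sketch: add the estimated losses back onto the analyzer
// reading to estimate the power at the LimeSDR connector. Figures are the
// approximate ones quoted above (they sum to ~2 dB, rounded up to ~3 dB).
#include <cstdio>

int main()
{
    const double cable_loss_dB   = 1.0;  // 25 ft of LMR-400
    const double adapter_loss_dB = 0.1;  // ruggedized adapter at the analyzer input
    const double margin_dB       = 1.0;  // SMA-to-N converter plus general margin
    const double total_loss_dB   = cable_loss_dB + adapter_loss_dB + margin_dB;

    const double analyzer_dBm = 10.0;    // example analyzer reading
    printf("Loss budget: %.1f dB, power at the SDR port ~ %.1f dBm\n",
           total_loss_dB, analyzer_dBm + total_loss_dB);
    return 0;
}
```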

With the controlled setup above, transmitting via the LMR cable, I finally got results close to yours. However, as you can see, at the lower frequencies there is a delta with what you get: at 500 MHz our difference is 12 dB, at 1000 MHz it drops to 5 dB, at 2.5 GHz it drops to 3 dB, and from 3.5 GHz up we are exactly the same.

From these readings, I am led to believe that the spec sheet max output value of +10 dBm is correct, if my setup is correct, as I maxed out at +10 dBm at 1 GHz. But you were able to record received power values of up to 15 dBm. Is that possible with the max being 10 dBm? I am wondering if in your setup you had antenna gain or amplifier gain that was causing the higher readings at the lower frequencies, and perhaps it was out of its operating range above 3 GHz, hence we match after that frequency? Or, most likely, perhaps I am still doing something wrong and not capturing the max power.

Now, I am using the LimeSDR v1.4 motherboard.

Any feedback you may have is greatly appreciated,

Thanks again,

Joel

Hello Joel,

The setup I used was fairly simple:

  1. Use the .ini file and instructions provided in the earlier post;
  2. Connect the LimeSDR to a spectrum analyzer via cable (in the tested case a U.FL-to-SMA converter was used);
  3. Do a simple power measurement.

There were no tweaks on the RF matching side whatsoever - the board was at the default configuration, hence the difference is quite strange.

  1. Did you try to measure lower frequency values (your chart only shows down to 500 MHz)?
  2. Did you try measuring TX output band 2? Is the difference there similar?

I did not use any external amplifiers or other equipment.

Regards,
Karolis

Hi Karolis,

Thanks again. I played around with the settings in the analyzer and I do get a curve closer to yours now by changing the RBW manually:

Can you confirm what RF bandwidth in MHz you used in the TBB tab? I noticed that loading the .ini file doesn’t update this field; the value remains whatever I set it to prior to loading your file.

I used 5 MHz as a reference, then hit the Tune button, and with the RBW in the table above I got a curve similar to yours; I also set my span to 5 MHz in the analyzer GUI. Additionally, I set my reference level to 30 dBm. I also used an input attenuation of 5 dB, which got rid of some signal leakage from the LimeSDR past the 5 MHz LPF, but it didn’t change the channel power level I was recording. I used the Anritsu spectrum analyzer for this. Can you confirm what RBW, reference level and span you used for your recordings?

I borrowed the analyzer and didn’t have time to do the TX output band 2 measurements yet. I will do those when I get it back tomorrow. Thanks again for your feedback, which has been very helpful.

Regards,

Joel

Hi Karolis,

I was able to get the analyzer back and finish the TX2 received power chart. It is similar to yours, but it shows about the same pattern: we differ at the low frequencies, meet around the middle, then differ at the high frequencies. Here’s the picture:

[image: TX2 output power chart]

Hi Karolis,

I wanted to update you on more tests that I did today. I now get almost exactly the same curve as you for transmit power vs. frequency by using a span of 20 MHz in the Anritsu spectrum analyzer. Since that changes how the channel power is calculated, it actually matches yours more closely:
[image: transmit power vs. frequency comparison chart]

Actually, for frequencies of 100-500 MHz I changed the span to 40 MHz to match the values you got; this made sense, since at those frequencies the signal bandwidth spread out more than at the higher frequencies.
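
For what it’s worth, the span dependence makes sense if you think of the channel-power readout as an integration of the per-bin powers across the selected span; a span narrower than the signal leaves energy out of the sum. A minimal sketch of that calculation is below, with made-up bin powers and frequencies as example inputs.

```cpp
// Illustration of why the span matters: channel power is the sum (in linear
// units) of the per-bin powers inside the selected span. A span narrower than
// the transmitted signal leaves part of the power out of the sum, lowering the
// reported level. Bin powers/frequencies here are assumed example inputs.
#include <cmath>
#include <cstdio>
#include <vector>

double channelPower_dBm(const std::vector<double> &binPower_dBm,
                        const std::vector<double> &binFreq_Hz,
                        double center_Hz, double span_Hz)
{
    double sum_mW = 0.0;
    for (size_t i = 0; i < binPower_dBm.size(); ++i)
        if (std::fabs(binFreq_Hz[i] - center_Hz) <= span_Hz / 2.0)
            sum_mW += std::pow(10.0, binPower_dBm[i] / 10.0);   // dBm -> mW
    return 10.0 * std::log10(sum_mW);                           // mW -> dBm
}

int main()
{
    // Toy spectrum: five bins around 1 GHz carrying most of the signal energy.
    std::vector<double> p = {-20, -5, 0, -5, -20};              // dBm per bin
    std::vector<double> f = {0.99e9, 0.995e9, 1.0e9, 1.005e9, 1.01e9};
    printf("5 MHz span:  %.1f dBm\n", channelPower_dBm(p, f, 1e9, 5e6));
    printf("40 MHz span: %.1f dBm\n", channelPower_dBm(p, f, 1e9, 40e6));
    return 0;
}
```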

I think, since I am in the ballpark, I will leave this alone now. However, I would love it if you could confirm the RBW, span, reference level and any input attenuation you used in your analyzer, just to confirm that my setup was correct. Oh, and also your bandwidth in the RBB tab; I tuned your .ini signal to 20 MHz. I also went back to using your .ini file without making changes to it, and that is also how I got the light green curve to match your orange curve in the graph above; the blue one is my previous one, which had some differences from yours. As a side note, by checking the Bias checkboxes and all the power controls in the LDO tab I was able to increase the output power by about 3 dB on top of what you got with your .ini file, so my max output power is +20 dBm with these settings at the lower frequencies instead of your 17 dBm.

Now, I am still baffled as to why the specs for the LimeSDR give a max output of +10 dBm when we are getting much more than that. I plan on making a separate post for the community to clear this doubt up, since from the forums I see a lot of users are confused about this.

Thank you very much for all the help you provided!!!

Joel

Hello Joel,

Sorry for the delayed response. Glad to hear you managed to duplicate the results. I used a span of 10 MHz for the test, where the RBW-to-VBW ratio was 100. This should not affect your results, though, since the signal level should still be far above the noise floor of your analyzer (and no close-proximity signals should be present that would need to be separated). The reference level was 20 dBm with 30 dB attenuation (wrong settings for these could indeed affect your signal level).

The RBB tab settings are not important in your case, since they actually control the analog filter for the receive chain (not transmit).

Tweaking bias settings can indeed help to push the output power higher, but bear in mind that you are only increasing the DC value at this point - in normal operation you would be oversaturated, driving the chain into its non-linear range.

+10 dBm is the realistic output power for a modulated signal (not CW) that can be sent without distortion.
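
A rough way to see that CW vs. modulated distinction is as a peak-to-average backoff; the numbers in the sketch below are illustrative assumptions, not LimeSDR specifications.

```cpp
// Illustrative numbers only: a modulated signal must be backed off from the
// saturation point by roughly its peak-to-average power ratio (PAPR) to stay
// linear, so its usable average power is below what a CW tone can reach.
#include <cstdio>

int main()
{
    const double cw_limit_dBm = 17.0;  // e.g. a CW level seen in the sweeps above
    const double papr_dB      = 7.0;   // assumed PAPR of some multi-carrier signal
    printf("Usable average power for that signal ~ %.1f dBm\n",
           cw_limit_dBm - papr_dB);
    return 0;
}
```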

Regards,
Karolis

Hi,

Can the TX_power_measurement.ini file be attached to this topic?
It seems to be gone from Dropbox. If it is attached, at least all the state is preserved in this forum.

Thanks
Job

Hi Karolis,

I am new to LimeSDR. I am trying to do a similar measurement of the LimeSDR (output power against frequency).
I saw that you uploaded to this post the “TX_power_measurement.ini” file that you used to make this measurement. Unfortunately, this file is no longer available. Can you upload it again?

You also wrote in this post several notes about how you configured some other registers/parameters of the LimeSDR in order to reproduce your measurement.
I don’t see that you recommend performing calibration on the TX path before doing the measurement. For this kind of measurement, is calibration not needed?

Thanks in advance
Mauro

Hello Mauro,

Welcome to the LimeSDR community. The removed file(s) will be uploaded to a more permanent location on a server by the end of Friday so that they can be accessed for a longer time. I will notify you separately on this. Please note that the provided state is meant only for maximum CW power measurement of the LimeSDR board - it should not be used for communication setups. For that, use the default settings and fine-tune the RF chip to the wanted level of performance.

Calibration is not needed (and in fact not recommended) for this type of measurement, since the TX chain is over-saturated. Under normal working conditions, you would want to use the calibrations (TX side: BB filter bandwidth tune -> gain tune -> DC and IQ imbalance calibration; RX side: BB filter bandwidth tune -> DC and IQ imbalance calibration).
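
For anyone scripting this later, that normal-operation sequence maps roughly onto the LimeSuite C API as sketched below; the 5 MHz bandwidth and the 0.7 gain are placeholders, and the device is assumed to be otherwise configured (sample rate, LO, etc.).

```cpp
// Sketch of the normal (non-saturated) calibration flow via the LimeSuite C API.
// Assumes the device is already configured (sample rate, LO, etc.); the 5 MHz
// bandwidth and 0.7 gain are placeholders for your actual signal settings.
#include "lime/LimeSuite.h"

static int calibrateNormalOperation(lms_device_t *dev, double bw_Hz)
{
    // TX side: BB filter bandwidth tune -> set the wanted gain -> DC/IQ calibration.
    if (LMS_SetLPFBW(dev, LMS_CH_TX, 0, bw_Hz) != 0) return -1;
    if (LMS_SetNormalizedGain(dev, LMS_CH_TX, 0, 0.7) != 0) return -1;
    if (LMS_Calibrate(dev, LMS_CH_TX, 0, bw_Hz, 0) != 0) return -1;

    // RX side: BB filter bandwidth tune -> DC/IQ calibration.
    if (LMS_SetLPFBW(dev, LMS_CH_RX, 0, bw_Hz) != 0) return -1;
    if (LMS_Calibrate(dev, LMS_CH_RX, 0, bw_Hz, 0) != 0) return -1;
    return 0;
}

int main()
{
    lms_device_t *dev = nullptr;
    if (LMS_Open(&dev, nullptr, nullptr) != 0) return 1;
    const int rc = calibrateNormalOperation(dev, 5e6);
    LMS_Close(dev);
    return rc == 0 ? 0 : 1;
}
```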

Regards,
Karolis

Hello Mauro,

As I wrote earlier, here is the link to the mentioned files:

http://downloads.myriadrf.org/project/limesuite/helpers/

Regards,
Karolis

Hi Karolis,
Thanks for the information and the help!

Regards,
Mauro

@Karolis (or @andrewback?): could you explain what specifically is set by your .ini file in order to get that much output power in LimeSuite? More importantly for me, are these power levels achievable in SoapySDR or osmosdr?

With GRC or Pothos (so osmosdr or SoapySDR), when I set the max gain value, I reach max output powers between -33 and -35 dBm like @rodrigo7x mentioned (with a cable connection from the LimeSDR USB to an analyzer or another Lime).

@Karolis, @andrewback, any idea on this? Can we achieve a 0 dBm output with Soapy or osmosdr?

I would have thought so, unless Soapy is somehow restricting the gain range.

More generally, I would recommend using the gr-limesdr blocks instead of gr-osmosdr, since these provide a more direct route via a lower-level API, with additional features as well.

@andrewback: Max gain in Soapy for the Lime is set by this method: https://github.com/myriadrf/LimeSuite/blob/master/src/API/lms7_device.cpp#L994. Soapy actually calls LMS7_Device::SetGain from here (https://github.com/pothosware/SoapySDR/blob/master/lib/Device.cpp#L276 and here https://github.com/myriadrf/LimeSuite/blob/master/SoapyLMS7/Settings.cpp) with a maximum value of 64 for TX.

  • Are these values the max values for the Lime? Could it be that the call to LMS7_Device::SetGain for TX is wrong?

  • If I want to maximize the output without distorting my signal, what should the max PAD and IAMP values be? If you also have a link to documentation on the choice of these values, I’d appreciate it.

Thanks for the help!

This is one for @Zack or @IgnasJ.

The maximum TX gain value from Soapy is 64. Using it should cause distortion because of saturation if you are utilizing close to the full DAC range.

If you are talking about values in Soapy, then PAD should be 52 (max), and IAMP should be 0 to guarantee that there is no saturation in the IAMP stage at max DAC output. However, for a lot of signals, IAMP values of 3-6 can be used without noticeable distortion if higher gain is needed.
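
For reference, below is a minimal SoapySDR (C++) sketch of applying those per-element values instead of the single overall TX gain; "driver=lime" assumes the SoapyLMS7 module is installed, and the frequency is just an example.

```cpp
// Sketch: set the LimeSDR TX gain per element through SoapySDR rather than the
// single overall gain. "PAD" and "IAMP" are the TX gain elements exposed by the
// SoapyLMS7 driver; values follow the advice above (PAD 52 = max, IAMP 0 for
// guaranteed linearity at full DAC swing, 3-6 if a bit more gain is needed).
#include <SoapySDR/Device.hpp>
#include <SoapySDR/Constants.h>

int main()
{
    // make() throws if no matching device is found.
    SoapySDR::Device *dev = SoapySDR::Device::make("driver=lime");

    const size_t chan = 0;
    dev->setFrequency(SOAPY_SDR_TX, chan, 1e9);      // example: 1 GHz
    dev->setGain(SOAPY_SDR_TX, chan, "PAD", 52.0);   // max PAD value
    dev->setGain(SOAPY_SDR_TX, chan, "IAMP", 0.0);   // keep IAMP out of saturation

    // The single overall TX gain (0..64) maps onto the same elements internally:
    // dev->setGain(SOAPY_SDR_TX, chan, 64.0);

    SoapySDR::Device::unmake(dev);
    return 0;
}
```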