Max gain - LimeSDR Mini

Hello, I’m trying to set the max gain on my LimeSDR Mini but I can’t go over 0.8421052631578947. Is that normal?
I’m using the native library.

print(cyLimeLib.set_normalized_gain(1))
print(cyLimeLib.get_normalized_gain())

0.8421052631578947

(This is for TX)

Changing the centre frequency or the LPF bandwidth doesn’t change the result.

This is the code:
import cyLimeLib

cyLimeLib.open()
cyLimeLib.init()

cyLimeLib.set_normalized_gain(1)
cyLimeLib.disable_all_channels()
cyLimeLib.set_center_frequency(2000e6)
cyLimeLib.set_lpf_bandwidth(20e6)

print("LPF Band", cyLimeLib.get_lpf_bandwidth())
print("Centre Freq", cyLimeLib.get_center_frequency())
print(cyLimeLib.get_normalized_gain())

cyLimeLib.close()

Output:
Reference clock 40.00 MHz
Filter calibrated. Filter order-4th, filter bandwidth set to 20 MHz.Real pole 1st order filter set to 2.5 MHz. Preemphasis filter not active
TX LPF configured
LPF Band 20000000.0
Centre Freq 2000000000.0
0.8421052631578947

Hi @BlackF,

Let us check.

Hey @BlackF,

I don’t see the return value of the set_normalized_gain(1) call, which I assume actually failed in your case. A normalized gain reading of ~0.84 would mean the TRFPAD gain was set to 52 (the maximum value) but setting the TBBIAMP gain failed.

Can you post the results of LimeQuickTest?

Sometime between the 19.04 and 20.01 releases the TX functions changed. I had fully working TX code with the 19.04 DLL, but with 20.01 the output signal level is lower. The TX NCO settings also have no effect, and the TX LPF bandwidth is broken (increasing the bandwidth decreases the output level!). On the positive side, CPU usage is reduced.
My OS is Win10, the hardware is a LimeSDR-USB and the frequency is 144 MHz. I did not notice any change in the function definitions. Any ideas?

@Garmus may be able to advise.

Hello,

This is the output of my script

Python script

import cyLimeLib
cyLimeLib.open()
cyLimeLib.init()

print(cyLimeLib.set_normalized_gain(1))
cyLimeLib.disable_all_channels()
cyLimeLib.set_center_frequency(2000e6)
cyLimeLib.set_lpf_bandwidth(20e6)
print("LPF Band", cyLimeLib.get_lpf_bandwidth())
print("Centre Freq", cyLimeLib.get_center_frequency())
print(cyLimeLib.get_normalized_gain())

cyLimeLib.close()

Output

Reference clock 40.00 MHz
0
Filter calibrated. Filter order-4th, filter bandwidth set to 20 MHz.Real pole 1st order filter set to 2.5 MHz. Preemphasis filter not active
TX LPF configured
LPF Band 20000000.0
Centre Freq 2000000000.0
0.8421052631578947

The set_normalized_gain definition

 cpdef int set_normalized_gain(float_type gain):
     return LMS_SetNormalizedGain(_c_device, IS_TX, CHANNEL, gain)

LimeQuickTest

[ TESTING STARTED ]
->Start time: Fri May 29 02:10:19 2020

->Device: LimeSDR Mini, media=USB 3.0, module=FT601, addr=24607:1027, serial=1D588F01B6A2F2
  Serial Number: 1D588F01B6A2F2

[ Clock Network Test ]
->REF clock test
  Test results: 49150; 62347; 10008 - PASSED
->VCTCXO test
  Results : 6711054 (min); 6711215 (max) - PASSED
->Clock Network Test PASSED

[ FPGA EEPROM Test ]
->Read EEPROM
->Read data: 13 0A 16 13 0A 16 02
->FPGA EEPROM Test PASSED

[ LMS7002M Test ]
->Perform Registers Test
->External Reset line test
  Reg 0x20: Write value 0xFFFD, Read value 0xFFFD
  Reg 0x20: value after reset 0x0FFFF
->LMS7002M Test PASSED

[ RF Loopback Test ]
->Configure LMS
->Run Tests (TX_2 -> LNA_W):
  CH0 (SXR=1000.0MHz, SXT=1005.0MHz): Result:(-12.8 dBFS, 5.00 MHz) - PASSED
->Run Tests (TX_1 -> LNA_H):
  CH0 (SXR=2100.0MHz, SXT=2105.0MHz): Result:(-13.7 dBFS, 5.00 MHz) - PASSED
->RF Loopback Test PASSED

=> Board tests PASSED <=

Elapsed time: 2.47 seconds

Could you please test with the new 20.07.1 release and let me know if you still encounter these issues.

Hello,
The new 20.07.2 version does help with the intermittent, temperature-sensitive LimeQuickTest results.

As for my initial post, I still get
0.8421052631578947

set_normalized_gain(1) returns “0” as per my last comment above.

I tried running some more tests with the LMS_GetGaindB call, with similar results.

Python Code (configured for TX):

#! /usr/bin/python3

import cyLimeLib

cyLimeLib.open()
cyLimeLib.init()

cyLimeLib.disable_all_channels()
cyLimeLib.set_center_frequency(2000e6)
cyLimeLib.set_lpf_bandwidth(20e6)

print("LPF Band", cyLimeLib.get_lpf_bandwidth())
print("Centre Freq", cyLimeLib.get_center_frequency())

print("Setting new normalised GAIN to 1, status: ", cyLimeLib.set_normalized_gain(1))
print("Printing Current Normalised GAIN: ", cyLimeLib.get_normalized_gain())
print("Printing Current GAINdB value: ", cyLimeLib.get_gaindB())
print("Setting new GAINdB to 70, status: ", cyLimeLib.set_gaindB(70))
print("Printing Current GAINdB value: ", cyLimeLib.get_gaindB())


cyLimeLib.close()

Output

Reference clock 40.00 MHz
Filter calibrated. Filter order-4th, filter bandwidth set to 20 MHz.Real pole 1st order filter set to 2.5 MHz. Preemphasis filter not active
TX LPF configured
LPF Band 20000000.0
Centre Freq 2000000000.0
Setting new normalised GAIN to 1, status:  0
Printing Current Normalised GAIN:  0.8421052631578947
Printing Current GAINdB value:  64
Setting new GAINdB to 70, status:  0
Printing Current GAINdB value:  64
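
Those two readings are consistent with each other under the assumption (mine, not taken from the API docs) that the normalized value is simply gain_dB divided by the maximum gain_dB:

```python
# Assumption (inferred, not from the docs): normalized gain = gain_dB / max_gain_dB.
reported_db = 64                     # the LMS_GetGaindB reading above
normalized = 0.8421052631578947      # the stuck normalized value

max_db = reported_db / normalized
print(max_db)                        # ~76.0 -> the gain sits 12 dB below the apparent maximum

# With TRFPAD already at its 52 dB maximum (per the earlier reply), the
# missing 12 dB matches the TBBIAMP range, pointing at the IAMP stage.
```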

@Garmus, could you take a look, please?

Just to assist with the investigation, I added some extra logging to the following functions and recompiled master

-> LMS7002M::SetTRFPAD_dB
-> LMS7002M::GetTRFPAD_dB
-> LMS7002M::SetTBBIAMP_dB
-> LMS7002M::GetTBBIAMP_dB

As follows:

LMS7002M::SetTRFPAD_dB

int LMS7002M::SetTRFPAD_dB(const float_type value)
{
    const double pmax = 52;
    int loss_int = (pmax-value)+0.5;

    //different scaling realm
    if (loss_int > 10) loss_int = (loss_int+10)/2;

    //clip
    if (loss_int > 31) loss_int = 31;
    if (loss_int < 0) loss_int = 0;

    int ret = 0;
    ret |= this->Modify_SPI_Reg_bits(LMS7param(LOSS_LIN_TXPAD_TRF), loss_int);
    ret |= this->Modify_SPI_Reg_bits(LMS7param(LOSS_MAIN_TXPAD_TRF), loss_int);

    Log(LOG_INFO, "SetTRFPAD_dB: loss_int: %i", loss_int);

    return ret;
}

LMS7002M::GetTRFPAD_dB

float_type LMS7002M::GetTRFPAD_dB(void)
{
    const double pmax = 52;
    auto loss_int = this->Get_SPI_Reg_bits(LMS7param(LOSS_LIN_TXPAD_TRF));
    Log(LOG_INFO, "GetTRFPAD_dB: loss_int: %i return: %g return2: %g", loss_int, pmax-10-2*(loss_int-10), pmax-loss_int);
    if (loss_int > 10) return pmax-10-2*(loss_int-10);
    return pmax-loss_int;
}
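
For reference, the dB-to-register mapping in the two functions above can be mirrored in Python (a sketch of the same arithmetic; the branch at loss_int > 10 switches to coarser 2 dB steps, which is also why the set/get pair is not exactly invertible in that region):

```python
def set_trfpad_loss_int(value_db, pmax=52):
    """Mirror of the register computation in SetTRFPAD_dB above."""
    loss_int = int((pmax - value_db) + 0.5)   # C-style truncation
    if loss_int > 10:                         # coarser 2 dB steps past 10 dB of loss
        loss_int = (loss_int + 10) // 2
    return max(0, min(31, loss_int))          # clip to the 5-bit field

def get_trfpad_db(loss_int, pmax=52):
    """Mirror of GetTRFPAD_dB above."""
    if loss_int > 10:
        return pmax - 10 - 2 * (loss_int - 10)
    return pmax - loss_int

print(set_trfpad_loss_int(52))   # 0  -> maximum gain, as in the set_normalized_gain(1) log
print(set_trfpad_loss_int(0))    # 31 -> minimum gain, as in the set_normalized_gain(0) log
print(get_trfpad_db(0))          # 52 -> the PAD value SoapySDR reports at max gain
```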

LMS7002M::SetTBBIAMP_dB

int LMS7002M::SetTBBIAMP_dB(const float_type gain)
{
    int ind = this->GetActiveChannelIndex()%2;
    if (opt_gain_tbb[ind] <= 0)
    {
        if (CalibrateTxGain(0,nullptr)!=0) //set optimal BB gain
            return -1;
        if (std::fabs(gain) < 0.2) // optimal gain = ~0dB
            return 0;
    }

    int g_iamp = (float_type)opt_gain_tbb[ind]*pow(10.0,gain/20.0)+0.4;
    Log(LOG_INFO, "SetTBBIAMP_dB: g_iamp: %i opt_gain_tbb[ind]: %i gain: %g", g_iamp, opt_gain_tbb[ind], gain);
    int modify_return = Modify_SPI_Reg_bits(LMS7param(CG_IAMP_TBB),g_iamp > 63 ? 63 : g_iamp<1 ? 1 :g_iamp , true);
    Log(LOG_INFO, "SetTBBIAMP_dB: Modify_SPI_Reg_bits: %i", modify_return);

    return 0;
}

LMS7002M::GetTBBIAMP_dB

float_type LMS7002M::GetTBBIAMP_dB(void)
{
    int g_current = Get_SPI_Reg_bits(LMS7param(CG_IAMP_TBB),true);
    int ind = this->GetActiveChannelIndex()%2;

    if (opt_gain_tbb[ind] <= 0)
    {
        if (CalibrateTxGain(0,nullptr)!=0)
            return 0.0;
        Modify_SPI_Reg_bits(LMS7param(CG_IAMP_TBB),g_current, true); //restore
    }
    Log(LOG_INFO, "GetTBBIAMP_dB: g_current: %g", (float_type)g_current);
    Log(LOG_INFO, "GetTBBIAMP_dB: opt_gain_tbb: %g", (float_type)opt_gain_tbb[ind]);
    Log(LOG_INFO, "GetTBBIAMP_dB: return: %g", 20.0*log10((float_type)g_current / (float_type) opt_gain_tbb[ind]));

    return 20.0*log10((float_type)g_current / (float_type) opt_gain_tbb[ind]);
}
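
And the IAMP side can be mirrored the same way (a sketch of the register arithmetic in SetTBBIAMP_dB/GetTBBIAMP_dB above, omitting the calibration branch): the requested dB offset scales the calibrated opt_gain_tbb code.

```python
import math

def iamp_code(gain_db, opt_gain_tbb):
    """Mirror of the CG_IAMP_TBB computation in SetTBBIAMP_dB above
    (calibration branch omitted)."""
    g_iamp = int(opt_gain_tbb * 10.0 ** (gain_db / 20.0) + 0.4)
    return min(63, max(1, g_iamp))           # clip to the 6-bit field

def iamp_db(g_current, opt_gain_tbb):
    """Mirror of the return expression in GetTBBIAMP_dB above."""
    return 20.0 * math.log10(g_current / opt_gain_tbb)

# With a sane calibration (opt_gain_tbb = 2, as in the SoapySDR logs),
# a +12 dB request lands on code 8 and reads back as ~12 dB:
print(iamp_code(12, 2))              # 8
print(round(iamp_db(8, 2), 4))       # 12.0412 (the SoapySDR "IAMP" reading)

# With the miscalibrated opt_gain_tbb = 63 from the native-library run,
# any positive request clips at code 63, and reading back code 16 gives
# the -11.9044 dB seen in the logs:
print(round(iamp_db(16, 63), 4))     # -11.9044
```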

When I run set_normalized_gain(0) I get:

SetTRFPAD_dB: loss_int: 31
GetTRFPAD_dB: loss_int: 0 return: 21 return2: 0
SetTBBIAMP_dB: g_iamp: 16 opt_gain_tbb[ind]: -12 gain: 63
SetTBBIAMP_dB: Modify_SPI_Reg_bits: 0

When I run get_normalized_gain() I get:

GetTRFPAD_dB: loss_int: 0 return: 21 return2: 0
GetTBBIAMP_dB: g_current: 16
GetTBBIAMP_dB: opt_gain_tbb: 63
GetTBBIAMP_dB: return: -11.9044

When I run set_normalized_gain(1) I get:

SetTRFPAD_dB: loss_int: 0
GetTRFPAD_dB: loss_int: 62 return: 52 return2: 62
SetTBBIAMP_dB: g_iamp: 251 opt_gain_tbb[ind]: 12 gain: 63
SetTBBIAMP_dB: Modify_SPI_Reg_bits: 0

When I run get_normalized_gain() I get:

GetTRFPAD_dB: loss_int: 62 return: 52 return2: 62
GetTBBIAMP_dB: g_current: 63
GetTBBIAMP_dB: opt_gain_tbb: 63
GetTBBIAMP_dB: return: 0

I tried emulating the same test with SoapySDR but Soapy seems to work as expected.

Soapy python code:

import SoapySDR
from SoapySDR import SOAPY_SDR_TX

sdr = SoapySDR.Device(dict(driver="lime"))  # device setup was not shown in the original

print("Setting TX gain to maximum 64")
sdr.setGain(SOAPY_SDR_TX, 0, 64)
print("Gain after ", sdr.getGain(SOAPY_SDR_TX, 0))
print("PAD: ", sdr.getGain(SOAPY_SDR_TX, 0, "PAD"))
print("IAMP: ", sdr.getGain(SOAPY_SDR_TX, 0, "IAMP"))

Output:

Setting TX gain to maximum 64
[INFO] SetTRFPAD_dB: loss_int: 0
[INFO] GetTRFPAD_dB: loss_int: 62 return: 52 return2: 62
[INFO] SetTBBIAMP_dB: g_iamp: 8 opt_gain_tbb[ind]: 12 gain: 2
[INFO] SetTBBIAMP_dB: Modify_SPI_Reg_bits: 0
[INFO] GetTRFPAD_dB: loss_int: 62 return: 52 return2: 62
[INFO] GetTBBIAMP_dB: g_current: 8
[INFO] GetTBBIAMP_dB: opt_gain_tbb: 2
[INFO] GetTBBIAMP_dB: return: 12.0412
[INFO] GetTRFPAD_dB: loss_int: 62 return: 52 return2: 62
[INFO] GetTBBIAMP_dB: g_current: 8
[INFO] GetTBBIAMP_dB: opt_gain_tbb: 2
[INFO] GetTBBIAMP_dB: return: 12.0412
Gain after  64.04119982655925
[INFO] GetTRFPAD_dB: loss_int: 62 return: 52 return2: 62
PAD:  52.0
[INFO] GetTBBIAMP_dB: g_current: 8
[INFO] GetTBBIAMP_dB: opt_gain_tbb: 2
[INFO] GetTBBIAMP_dB: return: 12.0412
IAMP:  12.041199826559248

I get the correct value of 12 for IAMP.

Hope this helps, even a bit.

The discrepancy between the native library and SoapySDR seems to happen in LMS7002M::CalibrateTxGain

In the native library the value of “opt_gain_tbb[ind]” is 63, while in Soapy it is 2. This seems to come from the while(GetRSSI() < 0x7FFF) loop.

I don’t think I can go any deeper than that.

Edit:
FOUND THE ISSUE!!!
I was forgetting to call LMS_EnableChannel after initialisation, and it looks like this was messing up the TX calibration. After adding the call, I get the correct value of 1!
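
For anyone hitting the same thing, the corrected ordering looks roughly like this (a sketch; enable_channel is a hypothetical cyLimeLib wrapper around LMS_EnableChannel, since only disable_all_channels appears in the snippets above):

```python
import cyLimeLib  # private Cython wrapper used throughout this thread

cyLimeLib.open()
cyLimeLib.init()

# Enable the TX channel BEFORE setting gain, so that the TX gain
# calibration runs against an active channel.
cyLimeLib.enable_channel()   # hypothetical wrapper around LMS_EnableChannel

cyLimeLib.set_center_frequency(2000e6)
cyLimeLib.set_lpf_bandwidth(20e6)
cyLimeLib.set_normalized_gain(1)
print(cyLimeLib.get_normalized_gain())   # now reads 1.0 instead of ~0.84

cyLimeLib.close()
```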

The problem was solved before the latest release. I was experimenting with the timeout value in the receive callback function and forgot about it. The exotic problems started with 20.01, but after I returned to the default value of 1000 ms everything went back to normal.
p.s. I'm not sure if I'm the first to use a LimeSDR for EME, but yesterday I managed to make a two-way contact with RX1AS at moonrise (the distance was about 1700 km).
My setup:

  • radio: LimeSDR-USB, RX1_L unmodified input, TX0_1 output
  • antenna: 11 el 3WL YU7EF design, rotated in azimuth only
  • LNA: PGA103+ 0.7dB NF
  • PA: first stage RA30H1317M1 (30W module), final stage SSPA BLF578 (about 500W output)
  • software: LimeSDR# x64 + LimeSuite 20.07 + MSHV 2.42

The LimeSDR worked all day without problems.
