srsRAN with LimeSDR Mini 2.0 has poor, unexpected RX sensitivity

I'm characterizing the RX sensitivity of an eNodeB based on srsRAN and a LimeSDR Mini 2.0.
I have an RFFE that includes a SAW duplexer for UL/DL isolation. I use a reference UE (an LTE module EVK) and a step attenuator to tune the path loss to the point where throughput drops to 95% of its maximum.

Comparing the eNodeB (UL) sensitivity achieved with the LimeSDR Mini 2.0 against that achieved with another SDR, I measure a sensitivity degradation of about 10 dB. The srsRAN settings are the same (i.e. Band 5, 850 MHz, 25 RB (5 MHz), rx_gain=40) and the TX (DL) power level is adjusted to the same value.
The estimated NF is around 15 dB, which is much higher than the roughly 2 dB of the LMS7002M transceiver.
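
For reference, a typical way to back out the NF from a measured sensitivity is the link-budget relation below (a sketch; SNR_req is the SNR needed at the tested throughput point):

$$\mathrm{NF} \approx P_{\mathrm{sens}} - \left(-174~\mathrm{dBm/Hz} + 10\log_{10} B\right) - \mathrm{SNR}_{\mathrm{req}}$$

where B is the occupied UL bandwidth (about 4.5 MHz for 25 RB, i.e. a thermal floor near -107.5 dBm).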
I'm wondering whether there is a particular setting in the srsRAN enb.conf file that I should apply for the LimeSDR Mini 2.0. Maybe the default IF filtering is not optimized for the Lime?
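
For context, the RF part of the enb.conf in this setup is essentially the stock Soapy configuration; a minimal sketch (rx_gain is the value mentioned above, the driver=lime device key is an assumption, and any further device_args keys would need checking against srsRAN's Soapy RF module):

```
[rf]
# SDR front-end selection: use the SoapySDR wrapper with the lime driver
device_name = soapy
device_args = driver=lime
# Generic RX gain in dB; how this maps onto the LMS7002M LNA/TIA/PGA stages
# is decided by the Soapy lime driver, not by srsRAN itself
rx_gain = 40
```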

Thanks in advance for your support

Firstly, which version of srsRAN are you using? There is a fork with native LMS API integration, but I must admit that I’ve not tested it with LimeSDR Mini 2.0.

For the associated documentation, see:

https://librecellular.org/user/software

I'm not sure the gain settings translate to the same thing across SDRs. I'm also not sure which RX port you are using, LNAH or LNAW.
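
One way to see what srsRAN's generic rx_gain actually maps to on the Lime is to query the Soapy driver directly. A rough sketch using the SoapySDR Python bindings (assumes the lime Soapy module is installed and that driver=lime selects the device):

```python
# Query the LimeSDR's RX antenna ports and gain stages via SoapySDR,
# to see what srsRAN's generic rx_gain value is being mapped onto.
import SoapySDR
from SoapySDR import SOAPY_SDR_RX

sdr = SoapySDR.Device(dict(driver="lime"))

print("RX antennas:", sdr.listAntennas(SOAPY_SDR_RX, 0))
print("Selected antenna:", sdr.getAntenna(SOAPY_SDR_RX, 0))

# Per-stage gains (typically LNA/TIA/PGA on the LMS7002M) and their ranges
for name in sdr.listGains(SOAPY_SDR_RX, 0):
    rng = sdr.getGainRange(SOAPY_SDR_RX, 0, name)
    print(f"{name}: {sdr.getGain(SOAPY_SDR_RX, 0, name)} dB "
          f"(range {rng.minimum()}..{rng.maximum()} dB)")

print("Overall RX gain range:", sdr.getGainRange(SOAPY_SDR_RX, 0))
```

Comparing the selected antenna and per-stage gains at rx_gain=40 against what the other SDR reports should show whether the gain distribution, rather than the filtering, accounts for part of the difference.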

It’s possible. Tagging @ricardas and @Karolis for further comment.

The port is set correctly automatically (same results when forcing LNAW for Band 5).

I'm using version 22.04-r0, commit f2dff0b.

I'd like to stick with it if the updates aren't related to this issue, thanks.

I'm afraid we've not tested with stock srsRAN using SoapySDR, so I can't really help there. We can only provide support for the fork with native LMS API integration.