LMS_SetLPFBW strange behavior

As I understand it, LMS_SetLPFBW configures the analog low-pass filter (LPF) bandwidth, which should be independent of the sample rate.

However, calling this function appears to change the sample rate.
See the output below and the code extract that follows.

SR=2000000.000000 Decimation=16
SR 2000000.000000 DAC 32000000.000000
Valid BP=1400000.000000-130000000.000000 by 1.000000 step
RX LPF configured
Bandwidth=2000000.000000
SR 798125.011580 DAC 25540000.370571

if (LMS_SetSampleRate(device, m_sr, OVERSAMPLE) != 0)
    fprintf(stderr, "SR Not Set with decimation %d\n", OVERSAMPLE);
else
    fprintf(stderr, "SR=%f Decimation=%d\n", m_sr, OVERSAMPLE);

float_type HostSR, DacSR;
LMS_GetSampleRate(device, LMS_CH_RX, 0, &HostSR, &DacSR);
fprintf(stderr, "SR %f DAC %f\n", HostSR, DacSR);

lms_range_t RangeBP;
LMS_GetLPFBWRange(device, LMS_CH_RX, &RangeBP);
fprintf(stderr, "Valid BP=%f-%f by %f step\n", RangeBP.min, RangeBP.max, RangeBP.step);

/* Clamp the requested bandwidth to the valid LPF range */
m_Bandwidth = m_sr;
m_Bandwidth = (m_sr < RangeBP.min) ? RangeBP.min : m_Bandwidth;
m_Bandwidth = (m_sr > RangeBP.max) ? RangeBP.max : m_Bandwidth;
LMS_SetLPFBW(device, LMS_CH_RX, 0, m_Bandwidth);

float_type BandWidth;
LMS_GetLPFBW(device, LMS_CH_RX, 0, &BandWidth);
fprintf(stderr, "Bandwidth=%f\n", BandWidth);

/* Read the sample rate again: it has changed after LMS_SetLPFBW */
LMS_GetSampleRate(device, LMS_CH_RX, 0, &HostSR, &DacSR);
fprintf(stderr, "SR %f DAC %f\n", HostSR, DacSR);

Check whether LMS_SetLPFBW succeeds or fails. There was an issue where, upon failure, not all parameters were restored; it is fixed in the CalibrationUpdate branch, but not yet verified for release.
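
A minimal check might look like this (a sketch only; LMS_GetLastErrorMessage() is the LimeSuite call that returns the reason for the last failure):

if (LMS_SetLPFBW(device, LMS_CH_RX, 0, m_Bandwidth) != 0)
{
    /* The LPF calibration failed; report why before deciding how to recover */
    fprintf(stderr, "SetLPF with %f bandwidth failed: %s\n",
            m_Bandwidth, LMS_GetLastErrorMessage());
}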

The return of LMS_SetLPFBW is successful with a 2 MHz bandwidth.
However, it fails with the range minimum (1.4 MHz):

Valid BP=1400000.000000-130000000.000000 by 1.000000 step
MCU working too long 127
SetLPF with 1400000 bandwidth failed
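
Until the fixed branch is verified for release, a possible workaround (only a sketch, assuming the failed call mainly leaves the sample rate inconsistent, as the logs above suggest) is to re-apply the sample rate after a failed LMS_SetLPFBW:

if (LMS_SetLPFBW(device, LMS_CH_RX, 0, RangeBP.min) != 0)
{
    fprintf(stderr, "SetLPF with %f bandwidth failed\n", RangeBP.min);
    /* Assumption: re-applying the sample rate restores the host/DAC
       rates that the failed calibration left modified */
    if (LMS_SetSampleRate(device, m_sr, OVERSAMPLE) != 0)
        fprintf(stderr, "Could not restore sample rate\n");
}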