[LMS API] - LMS_SetSampleRate

Hello everyone, how are you? I’m sorry if this topic doesn’t fit in this category, but I have some (noob) questions about this function of the LMS API (I’m working with the LimeSDR).

From the Documentation:

API_EXPORT int CALL_CONV LMS_SetSampleRate ( lms_device_t * device, float_type rate, size_t oversample )
Set sampling rate for all RX/TX channels. Sample rate is in complex samples (1 sample = I + Q). The function sets sampling rate that is used for data exchange with the host. It also allows to specify higher sampling rate to be used in RF by setting oversampling ratio. Valid oversampling values are 1, 2, 4, 8, 16, 32 or 0 (use device default oversampling value).

  1. What are the meanings of “Sampling Rate” and “Sampling Rate in RF”? I think that the difference may arise depending on which domain (analog or digital) I manage the samples in, but I don’t know.
  2. Let’s suppose that I set the sampling rate to 2 MSPS and the oversampling to 1. This means that the sample rate for the I and Q paths is 1 MSPS, right? What is the difference between that and setting the sampling rate to 1 MSPS with the oversampling at 2? Is there a decimation process involved? I think this question may be answered together with question (1).
  3. Must sampling rate * oversampling be less than the maximum bandwidth of the LimeSDR? This question arises from the fact that in the LMS API quick start guide, in the SingleRX example, the resulting rate in RF works out to 8 MHz * 8 = 64 MHz:

//Set sample rate to 8 MHz, preferred oversampling in RF 8x. This sets the sampling rate for all channels.
if (LMS_SetSampleRate(device, 8e6, 8) != 0) error();

To make my question clearer, here is an example. Suppose that I have a signal whose carrier is at 100 MHz and whose bandwidth is 2 MHz (1 MHz in each sideband). Setting the LO frequency to 100 MHz, Nyquist says that the minimum sampling rate to successfully digitize the signal is 2 MHz.
So far I have worked with LimeSuiteGUI, so I know that I need to set the CLK_H parameter in CLKGEN to 8 times the (minimum) sampling frequency needed (i.e. 2 MHz * 8 = 16 MHz = sample_needed*(I+Q)*2_Rx_Channel+2_Tx_Channel).
So, using LMS_SetSampleRate, which values of rate and oversample do I need to use? A sketch of what I have in mind follows below.
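In code, this is roughly what I am trying to do (device setup and error() as in the quick start; the rate and oversample arguments are exactly the values I am unsure about, so the ones below are only placeholders):

#include "lime/LimeSuite.h"

// "device" was already opened with LMS_Open() and initialized with LMS_Init()

// Enable RX channel 0 and put the LO on the 100 MHz carrier
if (LMS_EnableChannel(device, LMS_CH_RX, 0, true) != 0) error();
if (LMS_SetLOFrequency(device, LMS_CH_RX, 0, 100e6) != 0) error();

// Placeholder values: is rate = 2e6 (the Nyquist rate of my 2 MHz signal) correct here,
// and what should oversample be to end up with CLK_H = 16 MHz?
// (0 means "use the device default oversampling", per the documentation)
if (LMS_SetSampleRate(device, 2e6, /*oversample=*/0) != 0) error();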

Thanks in advance!
Lucas.

If CLK_H is 8 times your sample rate, then oversampling should be 2. So in your example the ADC runs at 4 MHz (CLK_H/4) and then the HW decimator reduces the sample rate to 2 MHz.
Basically, it works like this: ADC rate (RF sample rate) → HW decimator → final sample rate
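For your 100 MHz / 2 MHz example that would be something like this (a minimal sketch, assuming the same device setup and error() helper as the quick start):

#include "lime/LimeSuite.h"

// "device" already opened and initialized as in the quick start
// Host rate 2 MSPS with 2x oversampling:
//   ADC (RF) rate = 2 MHz * 2 = 4 MHz  (= CLK_H / 4, with CLK_H = 16 MHz)
//   the HW decimator then brings it back down to the 2 MHz delivered to the host
if (LMS_SetSampleRate(device, 2e6, 2) != 0) error();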


Ok! Thanks @IgnasJ