Generate chirp signal using Lime SDR

I am new to the LimeSDR. I want to generate a band of frequencies (a chirp signal of 60 MHz baseband bandwidth, mixed with a 2 GHz VCO). The SDR should generate the frequencies independently, without any commands from the PC, so I tried to program the FPGA to generate them. The data samples (generated using MATLAB) are stored in the FPGA.
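
For reference, a minimal sketch of how such a chirp table can be generated (in Python rather than MATLAB; the sample rate, duration, and 12-bit scaling are my assumptions, not details from the original setup):

```python
# Sketch: baseband chirp samples for an FPGA lookup table.
# Assumed parameters: 160 MSps sample rate, 10 us duration, 12-bit
# two's-complement I/Q; only the 60 MHz bandwidth comes from the post.
import numpy as np

FS = 160e6   # sample rate (assumed)
BW = 60e6    # chirp bandwidth (from the post)
T = 10e-6    # chirp duration (assumed)

n = int(round(FS * T))
t = np.arange(n) / FS
k = BW / T   # sweep rate, Hz per second
# Linear chirp sweeping from -BW/2 to +BW/2 around DC.
phase = 2 * np.pi * (-BW / 2 * t + 0.5 * k * t ** 2)
iq = np.exp(1j * phase)

scale = 2 ** 11 - 1  # 12-bit signed full scale
i_samples = np.round(iq.real * scale).astype(np.int16)
q_samples = np.round(iq.imag * scale).astype(np.int16)
```

These arrays could then be dumped as a memory-initialisation file for the FPGA ROM.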

I have yet to get a result. Please help me.

I am able to generate the chirp signal as expected, but the VCO is still tuned via the GUI. Can anybody help me program the FPGA so that as soon as it is powered on, it tunes the necessary PLLs and the VCO to the required frequency, and also configures the LMS7002M as required?

I would like to loop this output back to the receiver and do some processing, so I need help setting the clock rate for the RX path (the settings should come from the FPGA itself, not from the GUI --> FPGA).

Now there's a piece of kit I haven't seen in a while!

I was able to develop FPGA firmware such that the FPGA is totally in control of the LMS7002M; I don't need Lime Suite for operation except for flashing the FPGA firmware. I can generate different bands of frequencies and see the spectrum on the spectrum analyzer. Right now I have fixed my centre frequency at 2200 MHz with a band of 100 MHz (chirp signal).

Now I am trying to receive the data, which is the loopback of the signal generated above. I am receiving the signal and sending the data to the PC via USB (the USB communication was verified by sending various test data from the FPGA to the PC).
However, I am not able to receive valid data: whether or not the loopback is connected to the receiver, I get similar data, which indicates the received data is not correct. I am not sure whether the configuration of the LMS device for reception is wrong, or whether the protocol I am using to read the data from the LMS device is wrong.
Can someone please help me with the correct configuration of the LMS device for proper reception of the signal? I am using Port 1 for transmit (TX2_2) and Port 2 for receive (RX1_H). I would prefer a solution for MIMO mode, but a solution for SISO mode would also be very helpful.

I am using LimeSDR-USB v1.4

Thanks in advance.

Hello @chandu,

Are you using the LimeSDR-USB gateware as a basis for your project, or are you doing everything from scratch?
Just in case, here is the gateware repo:

Hello @Zack,

Thank you for the response. I developed the FPGA firmware from scratch; I felt this was simpler than taking the gateware as a basis and modifying it to suit my requirements.

I suspect the registers I have configured on the LMS device may not be correct. Can you please provide the correct register configuration?

I have also noticed that MCLK2 is not generated if 'RxTSPCLKA' is selected as the source for MCLK2. If I select 'RxTSPCLKA after divider' as the source, MCLK2 is generated. Currently I am working with the latter option, with the divider value set to zero (the actual division, per the datasheet, is 2 * (divider value + 1) = 2).
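
For what it's worth, the divider relation quoted above works out as follows (the 160 MHz input is only an assumed example value):

```python
# MCLK2 frequency out of the 'RxTSPCLKA after divider' path, using the
# relation quoted above: division ratio = 2 * (divider_value + 1).
def mclk2_freq(rxtspclka_hz: float, divider_value: int) -> float:
    return rxtspclka_hz / (2 * (divider_value + 1))

print(mclk2_freq(160e6, 0))  # divider value 0 -> divide by 2 -> 80 MHz
```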

Hi @chandu,

OK, but this is still a good source for checking how to implement the interface between the FPGA and the LMS7002M, I think.

Could you clarify what digital functions you need, what SXT, SXR and CGEN frequencies, and what interpolation/decimation ratios, please?

Do you apply reset to LMS7002M initially?

Yes, I started with that, but I got confused initially and decided to do it myself from scratch. Now that I am able to talk to the LMS7002M via the FPGA, I have continued with that.

I need to generate SXT and SXR frequencies of 2100 MHz. I want the ADC and DAC to work at 160 MHz. On the transmit side the signal is interpolated by 2, and on the receive side it is decimated by 2. If you need more details, please let me know.
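
Spelled out, the rate plan in this reply is as follows (a simple arithmetic sketch, nothing device-specific):

```python
# TX side: the FPGA feeds samples that are interpolated by 2 before the
# DAC; RX side: the ADC stream is decimated by 2 before reaching the
# FPGA. Rates as stated in the post above.
dac_rate = 160e6
interp = 2
fpga_tx_rate = dac_rate / interp   # samples/s the FPGA must supply

adc_rate = 160e6
decim = 2
fpga_rx_rate = adc_rate / decim    # samples/s delivered to the FPGA
```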

Yes. After power-on, before the LMS7002M gets configured, it is reset by the FPGA.
I am now getting MCLK2 under both conditions; it was my mistake in a calculation.

That means the MCLK/FCLK frequencies would have to be 160 MHz, which is too high.

Can you please elaborate on that? What would be the optimal speed for MCLK/FCLK?

The optimum would be up to 125 MHz.

If that is the case, then I would not use interpolation and decimation.

The RF settings I have used are as follows:

       TX1_2_LB_L  = '1';
       TX1_2_LB_SH = '0';
       TX1_2_LB_H  = '0';
       TX1_2_LB_AT = '0';
       TX2_2_LB_L  = '1';
       TX2_2_LB_SH = '0';
       TX2_2_LB_H  = '0';
       TX2_2_LB_AT = '0';

I think these settings are right.

I will reduce the MCLK/FCLK clocks to 80 MHz and check the operation again. Meanwhile, can you please provide me the settings for the reduced speed?

Hi @chandu,

If you want to bypass decimation/interpolation, then you still have to generate 160 MHz, just externally. The explanation is as follows:

  1. You have 4 ADCs running at 80 MHz in parallel; this is 4 samples * 80 MHz = 320 MSps (mega-samples per second);
  2. We have one external IQ data bus. You have to transfer 320 MSps of data through this single bus, which translates to a 320 MHz MCLK/FCLK frequency in SDR mode and 160 MHz in DDR mode.
  3. As described in item 1, the ADCs, and hence the TSPs as well, run at 80 MHz, and this frequency is used to construct the MCLK frequency. But from item 2 we see that we need 160 MHz, which is twice the ADC/TSP frequency. There is no facility to double the clock in the LMS7002M, hence you have to generate this frequency externally, and it is anyway too high for the interface.
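
The arithmetic in the three steps above can be sketched as:

```python
# Bus-rate arithmetic from the explanation above.
adc_rate = 80e6                    # each ADC (item 1)
num_adcs = 4                       # 2 channels x I/Q
total_rate = num_adcs * adc_rate   # 320 MSps over the shared IQ bus

mclk_sdr = total_rate              # SDR interface: 320 MHz
mclk_ddr = total_rate / 2          # DDR interface: 160 MHz
```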

Thank you for the explanation.
Instead of feeding the clock externally, I would like to reduce the sampling rate of the ADCs, and hence the TSPs, so that I can generate a proper baseband signal of 60 MHz bandwidth with LO = 2100 MHz, loop it back to the receiver, and verify it.

Meanwhile, I tried with the original gateware and the LimeSuite GUI with the following settings:
CLK_H = 640 MHz, divider = /4, FCLK_H fed to the ADC. The GUI showed RxTSP freq = 160 MHz and TxTSP freq = 160 MHz. I tweaked only the test-signal data in the gateware to generate the chirp signal.

With these settings I monitored the externally looped-back signal using the FFT viewer in Lime Suite, and I could see the chirp signal.

Now I will try using the register settings from this session (saved as an *.ini file) in my FPGA firmware.

Sorry, but I do not believe the signal is correct :slight_smile: A chirp signal just masks interface failures at a 160 MHz interface frequency. Try checking with a CW tone instead of a chirp and you will see.
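
Replacing the chirp table with a CW tone is a small change; a sketch (tone frequency, table length, and sample rate are assumed values):

```python
# Single CW tone table for interface debugging: a fixed tone makes
# interface errors show up as spurs instead of being masked by a chirp.
import numpy as np

FS = 80e6      # TSP sample rate (assumed)
F_TONE = 1e6   # tone offset from the LO (assumed)
N = 80         # table length; chosen so the tone is periodic in N samples

n = np.arange(N)
tone = np.exp(2j * np.pi * F_TONE * n / FS)
scale = 2 ** 11 - 1  # 12-bit signed full scale
i_cw = np.round(tone.real * scale).astype(np.int16)
q_cw = np.round(tone.imag * scale).astype(np.int16)
```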


Thank you. I will now modify my design to work at a lower speed of 80 MHz for both TSPs; the ADCs and DACs will also run at 80 MHz. I have looped back from TX1_2 to RX1_H and made the following settings; please correct me if I am wrong:
TX1_2_LB_L  = '1';
TX1_2_LB_SH = '0';
TX1_2_LB_H  = '0';
TX1_2_LB_AT = '0';
TX2_2_LB_L  = '1';
TX2_2_LB_SH = '0';
TX2_2_LB_H  = '0';
TX2_2_LB_AT = '0';

Hi @chandu,
Could you clarify, please, what kind of loopback you are implementing? LMS7002M internal RF, onboard RF, or simply using a cable?

I am using an external loopback with a cable.

Then it should be as follows (onboard RF loopback disabled):

TX1_2_LB_L	= 1
TX1_2_LB_H	= 0
TX1_2_LB_AT	= 0
TX1_2_LB_SH	= 1
TX2_2_LB_L	= 1
TX2_2_LB_H	= 0
TX2_2_LB_AT	= 0
TX2_2_LB_SH	= 1
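
Compared with the values posted earlier in the thread, only the *_LB_SH lines change; a quick sanity check of the two constant sets (names copied from the thread):

```python
# Compare the loopback switch settings posted earlier with the corrected
# ones above; only the *_LB_SH lines differ.
earlier = {"TX1_2_LB_L": 1, "TX1_2_LB_SH": 0, "TX1_2_LB_H": 0, "TX1_2_LB_AT": 0,
           "TX2_2_LB_L": 1, "TX2_2_LB_SH": 0, "TX2_2_LB_H": 0, "TX2_2_LB_AT": 0}
corrected = {"TX1_2_LB_L": 1, "TX1_2_LB_SH": 1, "TX1_2_LB_H": 0, "TX1_2_LB_AT": 0,
             "TX2_2_LB_L": 1, "TX2_2_LB_SH": 1, "TX2_2_LB_H": 0, "TX2_2_LB_AT": 0}
diff = {k for k in corrected if corrected[k] != earlier[k]}
print(sorted(diff))  # ['TX1_2_LB_SH', 'TX2_2_LB_SH']
```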

You may check in Quartus project how the control logic is implemented:

Thanks for the information.
Unfortunately, I could not work on the board for quite a while. I have now modified the settings as you mentioned, and I will let you know the result.