LimeSDR USB inconsistent configuration and sample rate

I have set up what I believe to be one of the most basic flowgraphs possible (using gr-limesdr in GNU Radio 3.8 on Ubuntu 18.04), but I am getting some pretty wonky behavior. I have two issues:

  1. The RX and TX signal frequencies do not match. I am sending an 800 Hz signal and am receiving a 1600 Hz signal! Yet both the LimeSDR Sink and Source are set to the same sample rate. I can “fix” this by upsampling by a factor of two, of course, but I shouldn’t have to.
  2. Each time I run the flowgraph, the RX looks (and sounds) different. Sometimes the amplitude is higher, lower, or somewhere in between; sometimes it’s a little more distorted than other times. I am loading the same .ini file every time, so I expect the radio to configure itself the same way each time. Interestingly, RX is always consistent within each run, but changes when I stop the flowgraph and start another run.

The .ini file is almost identical to self_test.ini, but with the RX and TX amplifier gains boosted a little and the center frequency set to 900 MHz.

Here is my flowgraph and some examples of the RX signal varying across different runs.
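In rough Python terms, the flowgraph amounts to the following. This is a paraphrase of the GRC-generated top block, not a copy of it: the gr-limesdr constructor arguments are my reading of the 3.8 GRC bindings, and the serials and .ini path are placeholders.

```python
# Sketch only: an 800 Hz complex tone into the LimeSDR Sink, and the
# LimeSDR Source into a magnitude probe standing in for the GUI sinks.
from gnuradio import analog, blocks, gr
import limesdr

class fg(gr.top_block):
    def __init__(self):
        gr.top_block.__init__(self)
        samp_rate = 2e6  # placeholder; Sink and Source use the same value
        tone = analog.sig_source_c(samp_rate, analog.GR_COS_WAVE, 800, 1, 0)
        tx = limesdr.sink('', 0, 'my_config.ini', '')  # serial, ch. mode, file, tag
        rx = limesdr.source('', 0, 'my_config.ini')    # serial, ch. mode, file
        mag = blocks.complex_to_mag(1)
        self.connect(tone, tx)
        self.connect(rx, mag, blocks.null_sink(gr.sizeof_float))
```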



Doesn’t it seem like the LimeSDR is not using the sample rate I give it (at TX, RX, or both) and is configuring itself slightly differently on each run? Anyone have any ideas what could be causing this?

Tagging @garmus.

Found something interesting looking through the gr-limesdr source block code: set_sample_rate is only called if filename == "". Since I’m loading a .ini file, the sample rate I set in GNU Radio isn’t being used; the block uses the one defined in the config file instead. That seems like a reasonable design choice, but it would be good to mention this behavior in the block’s documentation.
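In effect (my Python paraphrase of the C++ block code; the identifiers here are illustrative, not the actual ones):

```python
# How the source block appears to decide which sample rate to apply:
if filename == "":
    device.set_sample_rate(grc_sample_rate)  # GRC value is used
else:
    device.load_config(filename)             # .ini value wins; GRC value ignored
```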

After learning this, I re-examined my .ini config and found that the RX sampling rate was half that of TX, which explains issue 1. That seems to just be how self_test.ini is set up. Is that supposed to be the case?
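For the record, the doubling matches that 2:1 ratio exactly. Only the ratio matters; the rates below are just an example:

```python
# A tone captured at half the rate the flowgraph assumes is displayed at
# twice its true frequency (example rates are hypothetical).
rx_rate_actual = 1e6      # what the .ini actually configured
rx_rate_assumed = 2e6     # what the rest of the flowgraph believes
f_sent = 800.0
print(f_sent * rx_rate_assumed / rx_rate_actual)   # -> 1600.0
```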

Still not sure why the amplitude varies each time I re-run this flowgraph. Is the LimeSDR dynamically scaling some of its amplifier gains to try to bring the signal to an optimal level?


Thanks, added an issue:

@Garmus should be able to comment on the other points.

Hello,

Regarding the self_test.ini file, I cannot answer this, as I’m not the one who made it. It might have been made that way to test different sampling-rate functionality.

There isn’t any dynamic scaling of the signal in either LimeSuite or gr-limesdr, and, as you mentioned, the signal should stay relatively the same between runs. RX signal amplitude can vary wildly just from moving the antennas or from other signal interference. Is it possible for you to connect an RF cable with an attenuator, lower the gain values, and test again?
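If it’s easier than editing the .ini, the gains can also be lowered at runtime from Python. A minimal sketch, assuming the set_gain(gain_dB, channel) callback exposed in the GRC bindings, with rx and tx being the source and sink block instances:

```python
# Drop both gains for a wired test through an attenuator
# (values are just an example; set_gain signature assumed as noted above).
rx.set_gain(20, 0)   # RX gain in dB on channel A
tx.set_gain(10, 0)   # TX gain in dB on channel A
```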

If I understand correctly, when you say “within each run” you mean just starting and stopping the flowgraph, and by stopping and starting each run you mean powering off the device completely? If so, the device might be showing sensitivity to different temperatures.

Yes, I will try to connect a cable/attenuator but it might be a few days. I’ll post the results of that test once I have them.

When I stop and start, I am not powering off the radio; I’m just clicking the stop/start buttons in GRC.

Perhaps this is related: I frequently get glitches in the RX signal (you can see an example in the center of the first graph I posted, but here is a zoomed-in version). Is this expected behavior?

Another discovery:
With the .ini configured for 20 MHz sampling at both RX and TX and GRC set to 20 MHz, the output sounds choppy, but if I set GRC to 2 MHz (with the .ini still at 20 MHz), the output sounds much clearer (both are the correct pitch). I assume the choppiness is just my computer not being able to keep up at 20 MHz. Since adjusting the .ini sample rate and the GRC sample rate both affect the processing rate, doesn’t that indicate that GRC is resampling the hardware input to the GRC-defined rate by a ratio (as opposed to an absolute value)? If that is indeed the case, it could be a good note to add to the documentation.
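Either way, pinning the conversion down with an explicit resampler would remove the ambiguity. A minimal sketch, assuming the hardware really is delivering 20 MS/s:

```python
from gnuradio import filter
# Explicitly decimate the 20 MS/s hardware stream to the 2 MS/s the rest
# of the flowgraph expects (10:1, matching the rates in the test above).
resamp = filter.rational_resampler_ccc(interpolation=1, decimation=10)
```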

The variations between runs have gone away; as far as I can tell, I’m not having any more unexpected issues. Since no one seems to have any other specific leads as to why this happened, I’ll just mention that the only things I can remember doing between then and now are reloading the firmware and reverting to the stock self_test.ini file a few times.

To answer your previous question about the RX glitches: they are most likely dropped packets, since you are sampling at quite a high rate.