TX signal sample rate issues

Hi,

I’m using SoapySDR to transmit a QPSK signal. The symbol rate of the generated signal is 3.375 kHz (6.75 kbps data rate), RRC filtered (roll-off r = 0.6) at 8 samples per symbol for a 27 kHz rate, then resampled ×186 to 5.022 MHz before being sent to the SDR in CF32 format through SoapySDR. The RX and TX sample rates are both set to 5.022 MHz, and for my tests I’m using the ISM band at 914 MHz with rather well-matched antennas (verified with a Nano-VNA to have ~1.4 VSWR – the best I could get with those antennas).
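For concreteness, a chain like the one described can be sketched roughly as follows. This is a minimal numpy/scipy illustration under my own assumptions (textbook RRC tap formula, arbitrary QPSK mapping), not the actual proprietary code:

```python
import numpy as np
from scipy import signal

def rrc_taps(beta, sps, span):
    """Root-raised-cosine taps (textbook time-domain formula), unit energy."""
    n = np.arange(-span * sps, span * sps + 1, dtype=float)
    t = n / sps
    taps = np.zeros_like(t)
    for i, ti in enumerate(t):
        if abs(ti) < 1e-9:
            taps[i] = 1.0 + beta * (4 / np.pi - 1)
        elif abs(abs(4 * beta * ti) - 1.0) < 1e-9:
            taps[i] = (beta / np.sqrt(2)) * (
                (1 + 2 / np.pi) * np.sin(np.pi / (4 * beta))
                + (1 - 2 / np.pi) * np.cos(np.pi / (4 * beta)))
        else:
            taps[i] = (np.sin(np.pi * ti * (1 - beta))
                       + 4 * beta * ti * np.cos(np.pi * ti * (1 + beta))) / (
                       np.pi * ti * (1 - (4 * beta * ti) ** 2))
    return taps / np.sqrt(np.sum(taps ** 2))

sps = 8                                        # samples per symbol -> 27 kHz
beta = 0.6                                     # RRC roll-off
symbols = np.random.randint(0, 4, 405)         # 120 ms burst at 3375 sym/s
const = np.exp(1j * (np.pi / 4 + np.pi / 2 * symbols))   # QPSK points
up = np.zeros(len(const) * sps, dtype=np.complex64)
up[::sps] = const                              # impulse train at 27 kHz
shaped = signal.convolve(up, rrc_taps(beta, sps, 6), mode="same")
tx = signal.resample_poly(shaped, 186, 1).astype(np.complex64)  # 27 kHz -> 5.022 MHz
```

With 405 symbols this produces exactly 602640 CF32 samples, matching the burst size given below.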

When using the USRP B200, I get a nice signal with a signal bandwidth of 5.4 kHz (Fsym*(1+r)), measured with a spectrum analyzer and an RTL-SDR. I’m also able to decode the signal I’m sending.

When I use the LimeSDR-USB, the same signal has a bandwidth of 10.8 kHz, measured with the same RTL-SDR (I do not have access to the SA anymore, but I trust this measurement as other known signals look just fine). In order to get the right signal, I need to send twice as many samples as for the B200 for the same sample rate (or half the sample rate but send as many samples), which just makes no sense.

A data burst of 120 ms at a symbol rate of 3.375 kHz is 405 symbols, which then resampled to 5.022 MHz should be 602640 samples, which is exactly the number of samples I get after resampling, and the number of samples I send to the SDR.
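The sample-count arithmetic checks out (integer math for exactness):

```python
burst_ms = 120
sym_rate = 3375          # symbols/s
fs = 5_022_000           # samples/s after resampling
symbols = burst_ms * sym_rate // 1000    # 405 symbols in the burst
samples = burst_ms * fs // 1000          # 602640 samples at 5.022 MSps
# cross-check: 405 symbols * 8 sps * 186 (resampling factor) = 602640
```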

I tried exporting the signal to a ‘cfile’ (which is just raw interleaved float32 I/Q samples written to a binary file) and playing it back in GNU Radio Companion. The signal that is sent to the LimeSDR, which gives me 10.8 kHz over the air, is measured at 5.4 kHz of bandwidth in GRC. Even retransmitting the recorded signal through GRC (which uses UHD) into the LimeSDR requires me to set the sample rate to half of what it really is (i.e., send double the number of samples for the burst duration).

I also tried to play around with the driver source code, and SoapySDR source code, but to no avail, I could not find anything regarding this issue. I did try the latest tagged versions and the master branches of every library I use.

I cannot share code as it is proprietary (code is for one of my clients), but I can answer any question. I might be able to share the generated signal, as this is what’s being sent over the air anyways.

I am on Ubuntu 18.04 LTS with all of the latest updates. Tried USB 2.0 and 3.0 with no change. I don’t have access to any other device that can transmit (social distancing / work from home), but it was confirmed working on a USRP B200 (confirmed by myself).

Thanks,
JD

Assuming you’re using GNU Radio with gr-osmosdr blocks, would it be possible to swap these out for gr-limesdr and therefore take a shorter path to driving the SDR?

I was using gr-uhd (which uses SoapySDR). I tried with gr-limesdr and I do not have the issue anymore. Thanks! Edit: Disregard this, it was just not updating my model.

This is a strange issue with SoapySDR. I guess I’ll just directly use the LMS driver instead of going through Soapy. My goal was to build an SDR-agnostic application for my client, but I guess I’ll have to implement individual drivers per device instead.

I have more insight into this problem. I am using an NCO (software DSP) to move my signal away from the radio center frequency (which has a massive DC spike for some reason – other manufacturers don’t have this issue, but their issues are different and still troublesome). I am performing a simple rotation of the signal in frequency to the desired offset after upsampling. When I use my NCO, the issue appears. When I don’t, the issue is gone (but because of the DC spike, I have other issues). However, the same signal without the NCO (therefore correct in its output), when passed through a “Frequency Xlating FIR Filter” from GNU Radio, exhibits the same problem. This rules out a bug in my own NCO implementation, as I know GNU Radio’s implementation is solid (it’s a simple multiplication by a time-varying complex exponential as a single-rate block).
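For reference, an NCO of this kind reduces to a pointwise multiply by a complex exponential (a sketch, not the author’s implementation). Note that it is a pure rotation: it preserves every sample’s magnitude, so by itself it can only translate the spectrum, never widen it:

```python
import numpy as np

def nco_shift(x, f_shift, fs):
    """Frequency-translate complex baseband x by f_shift Hz:
    single-rate multiply with a time-varying complex exponential."""
    n = np.arange(len(x))
    return x * np.exp(2j * np.pi * f_shift * n / fs)
```

Because |nco_shift(x)| == |x| sample-for-sample, a bandwidth change after the NCO points to something downstream of the rotation itself.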

However, now that I know the issue is with SoapySDR, I’ll just use LMS directly and that should fix the issue.

Any idea on how to get rid of the DC spike at center frequency? It’s a big no-no for my client as this is piped directly to a satellite and would waste a lot of power.

Thanks,
JD

Update: After fixing issues with gnuradio 3.8 using the 3.9 blocks, using gr-limesdr did not fix the problem. It still transmits the signal too wide when translated.

Here is my signal file: https://www.dropbox.com/s/y09h1tjsh5quc5q/sig.cfile

The signal is not filtered before output (other than the RRC filter) because I deactivated almost everything in order to narrow down the problem, but the final signal will have a windowed filter applied to it to concentrate the energy – this is not part of the issue.

The signal is a QPSK signal at 3375 sym/s with an RRC roll-off of 0.6 and thus should have 5.4 kHz bandwidth (minus the aliasing). The sample rate is 5.022 MSps.
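As a quick check of that figure, the occupied bandwidth of an RRC-shaped signal is Rs·(1 + β):

```python
sym_rate = 3375          # symbols/s
beta = 0.6               # RRC roll-off
bw_hz = sym_rate * (1 + beta)   # occupied bandwidth -> 5400 Hz
```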

When transmitted at center frequency, the signal is 5.4 kHz wide. When passed through a translating filter, or just multiplied by an NCO, the signal is 10.8 kHz wide. My GNU Radio Companion setup is as follows:
File source (repeating) -> Xlating FIR Filter -> LimeSDR Sink

I vary the center frequency of the LimeSDR and change the Xlating filter frequency so the output is always at 915 MHz (my test frequency). When the Xlating filter frequency is 0, the signal is 5.4 kHz. When it is non-zero, it is 10.8 kHz. The same can be observed if I generate my signal at an offset frequency using my own translating filter (unrelated to GNU Radio) and bypass the Xlating filter.

Thanks,
JD

Not a good idea, as this involves a lot of levels of indirection: gr-uhd > UHD > SoapySDR > LMS API. I mean, it should work in theory, but it’s about the longest path you could take, and I suspect the most difficult to debug.

A slightly less tortuous route might be to use gr-osmosdr, which can also use SoapySDR for device access. However, last I heard gr-osmosdr wasn’t being maintained and so this, on top of the fact that SoapySDR will only ever give you access to a subset of the SDR capabilities, led us to create gr-limesdr.

@garmus, could you advise on the other issues?

I did try gr-limesdr and it produced the same output, so I doubt it’s related to any of those libraries. I can try and build any version of the drivers if you need me to.

My device passes all self-tests, if that helps rule a few things out. I wonder if anyone else has tried transmitting the provided signal file and got different results.

Hey @TehWan,
Thanks for posting such a descriptive issue.

To check the problem out, I first did some basic tests using gr-limesdr and the NCO. My basic test flowgraph was Noise source -> LPF (500 kHz) -> LimeSDR (TX) on the transmit side and LimeSDR (RX) (different board) -> QT GUI sink. As expected, the whole signal bandwidth is 1 MHz, and when I set the TX NCO to 1 MHz the signal moves to the desired location without any change of bandwidth.

After downloading and trying it with your provided signal, I noticed the problem you were describing, meaning that the bandwidth definitely increases when using the NCO. Looking at the time graph of your signal, though, it sometimes goes outside the [-1; 1] range, and multiplying the signal by 0.8 fixes the bandwidth problem (see screenshots below).

With the full range of your signal, NCO on RX side and Xlating on TX side:
[screenshot: xlating(nco)_limenco_fullrange]
When your signal is multiplied by 0.8, NCO on RX side and Xlating on TX side:
[screenshot: xlating(nco)_limenco]

Regards,
Richard

Hi Richard,

Great find! Thank you so much. I’ll add conditioning to the signal to remain between [-1; 1].
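A minimal sketch of such a conditioning step (illustrative names, assuming numpy complex64 buffers; the 0.8 headroom matches the factor from the test above):

```python
import numpy as np

def condition(iq, headroom=0.8):
    """Scale complex samples so I and Q stay inside [-1, 1],
    leaving some headroom below full scale."""
    peak = max(np.abs(iq.real).max(), np.abs(iq.imag).max())
    if peak > headroom:
        iq = iq * (headroom / peak)   # uniform gain preserves the waveform shape
    return iq.astype(np.complex64)
```

A uniform gain (rather than per-sample clipping) keeps the signal linear, so no new spectral content is created.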

As to why it doubles the bandwidth when the signal exceeds those values, no idea. Other SDRs seem pretty tolerant of this (not sure how they manage it), but if the values are clipped, that creates harmonics and intermodulation products, which might explain the wider bandwidth. However, the values should never have gone beyond that range in the first place, so that’s all my fault.
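A small synthetic numpy experiment (nothing Lime-specific, just an assumption about what a fixed-point TX path might do with out-of-range samples) supports this explanation: hard-clipping the I and Q of a frequency-shifted, over-full-scale tone grows extra spectral lines, while the unclipped rotation stays a single line:

```python
import numpy as np

N = 4096
n = np.arange(N)
# narrowband "signal": a complex tone on an exact FFT bin, amplitude 1.4 (over full scale)
x = 1.4 * np.exp(2j * np.pi * 10 * n / N)
x_shift = x * np.exp(2j * np.pi * 400 * n / N)   # NCO translation to bin 410
# hypothetical hardware behavior: hard-clip I and Q to [-1, 1]
clipped = np.clip(x_shift.real, -1, 1) + 1j * np.clip(x_shift.imag, -1, 1)

def occupied_bins(sig, thresh_db=-40):
    """Count FFT bins within thresh_db of the spectral peak."""
    spec = np.abs(np.fft.fft(sig))
    spec /= spec.max()
    return int(np.count_nonzero(20 * np.log10(spec + 1e-12) > thresh_db))
```

Here `occupied_bins(x_shift)` is 1 (rotation alone adds nothing), while `occupied_bins(clipped)` is larger: clipping spills energy into odd-order products away from the tone.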

Thanks,
JD