LimeSuiteNG version for LimeSDR_GW v3.5 with XTRX on LimeFEA mPCIe

What version of LimeSuiteNG should I be using with LimeSDR_GW v3.5?

The documentation isn’t exactly clear about which versions of the various bits should be combined to make a working system. I’m assuming LimeSDR_GW is now the preferred gateware?

I’m seeing some unexpected behaviour between gateware v1.22 and v3.5.

I’m running a LimeSDR-XTRX v1.2 in a LimeFEA mPCIe v1.0, on Ubuntu 24.04 on x86. My board shipped with gateware 1.17. Before jumping to the new LiteX-based gateware, I updated to v1.22 (this bitstream). I used the latest develop branch of LimeSuiteNG at bd3c132, since that seems to be the recommended procedure.

Running this with the Amarisoft 5G stack (release 2025-12-12) in TDD, in test mode to pack the channel, I get a sensible spectrum:

I then updated the gateware to v3.5. With the new gateware, my spectrum becomes nonsense. It appears that I get one tiny burst of RF once every 5 ms and silence the rest of the time.


You can see from the time plot that I’m only getting a tiny sliver of RF output, hence the terrible spectrum. It’s a little suspicious that it’s at what would be the DL/UL transition.

I noticed there were some changes to the Amarisoft plugin in LimeSuiteNG bd3c132, so I tried rolling back a bit. I tried to pick a commit that seemed likely to have been used for testing v3.5 (released 2025-12-22), while avoiding any very recent commits that mentioned limepcie, as that seemed like a potential suspect. I went for 05199ea. That combo seems to get me a carrier and nothing else:

I would really rather be on a “known good” combination of the various gateware/drivers/software before I start going down a rabbit hole debugging.

On a more general point, as a new user to the Lime ecosystem, it is very hard to find all the right documentation. There are lots of things with very similar names, and everything is very spread out and not well cross-referenced. A “Getting Started” guide for a particular device would be really helpful. I’ve often found myself needing to jump between repos or dive into schematics to answer what feel like very basic and common questions. Tagged versions of all the various gateware/drivers/software (with a compatibility matrix) would also be very helpful. I would love to see the tagged-releases approach that LimeSDR_GW is taking used across all the repos, with releases being tested together to provide some battle-hardened stable combinations.

Tagging @ricardas and @VytautasB as this issue looks to be at the interface between your two projects.

Cheers,

Iain


I have the same experience. I recently acquired another XTRX, so I now have two, both LimeSDR-XTRX v1.2 in a LimeFEA mPCIe v1.0. The working original runs GW 1.22; the new board came with GW 1.17. I foolishly upgraded the new board to GW 3.5 without in-depth testing, but kept the original as it was. The new board displays similar symptoms to what you describe, which is a relief in that perhaps my new board isn’t DOA (a fear). I am using LimeSuiteNG 0.3.0-gbd3c13222 and testing with a LimeSDR-USB as a signal source. The original XTRX runs as always with limeGUI and the GNU Radio plugin, in both SISO and MIMO modes. The new one streams, but nothing sensible comes out. I’ll watch this post with great interest.

You can revert the gateware to v1.22 without issue if you want to make sure it’s not a hardware problem. I jumped back and forth a few times while debugging. Just make sure to do a full power cycle to get the new version to boot (i.e. I had to shut down my PC then start it again; a reboot didn’t cycle power to the PCIe slot).

That was my backup plan and thanks for the reassurance. I will undertake a bit more investigation before reverting to 1.22 as a troubleshooting/debugging exercise. I am curious to know what could cause this.

Hi, it looks like the 3.5 GW has some bugs in the TDD controls. I will investigate further and provide a fix if it is GW related. In the meantime you can safely revert to the 1.22 GW.

@IainCC would you be able to test the fix:

Regards,
Vytautas

Currently on holiday. Will be back in the lab on Monday and will check it first thing. Thanks for the quick response.

I think I did check an n1 FDD band and saw the same problem, but I don’t have access to my lab book so don’t quote me on that!

By the way, are you using the LimeFEA mPCIe v1.0 Full with the LimeSDR-XTRX v1.2 connected to the LimeFEA RF front end, or are you using the Lite version with antennas connected directly to the XTRX?

I am using the Lite version of the LimeFEA and make RF connections directly to the PCIe mini card. What I find odd is that when saving the configuration of cards on both GW versions using limeGUI, I only see a few of what I perceive to be harmless differences between the chip configuration files.

Full version for me, with connection via the TRX amps. As a side note, my power seems unexpectedly low (as if there were no amp), but I haven’t looked into that yet.

I thought I might be seeing Tx power leaking through the switch when it was configured for Rx but supposed to be Tx (i.e. powers were about 20 dB lower than I thought they should be, which would be suspiciously close to a lot of switch isolation values). I did try poking the 0x000A register mentioned in the original gateware doc, but couldn’t quite get it working as I was expecting. I ran out of time on Friday to delve any deeper, and that’s when I posted. I was going to leave the power issue for another post, but if it’s a problem with the TDD switching it might be related.

That seems to have sorted the main problem. I do see a strong carrier during Rx slots though, which I wasn’t getting in the v1.22 GW.

Note that the spectrum in the 2nd plot is gated to only capture during UL slots.

That’s a common issue with SDR systems: you will get feed-through at the actual center frequency, both on RX and TX. You can try to notch it on RX (the Lime has circuitry for that), but it will create a notch at center frequency instead.
I’ve always designed my SDR apps to offset the actual hardware center frequency from the desired passband, and then shift the signal in the digital domain to compensate - that way the center spike is not in the signal of interest.
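The LO-offset trick described above can be sketched in a few lines of Python with NumPy. The sample rate and offset here are illustrative, not values from this thread: the hardware is tuned below the wanted carrier, and the baseband samples are mixed back by the offset in software so the DC/LO feed-through lands outside the passband.

```python
import numpy as np

fs = 1e6        # sample rate in Hz (illustrative)
offset = 125e3  # LO offset in Hz (illustrative)

def shift_by_offset(samples: np.ndarray, fs: float, offset: float) -> np.ndarray:
    """Mix complex baseband samples by +offset Hz (multiply by a complex exponential)."""
    n = np.arange(len(samples))
    return samples * np.exp(2j * np.pi * offset * n / fs)

# The LO feed-through appears as a DC spike at baseband. After the digital
# shift it sits at +offset, away from the recentred signal of interest.
dc_spike = np.ones(1024, dtype=np.complex64)
shifted = shift_by_offset(dc_spike, fs, offset)
```

The direction of the shift (and whether the spike ends up above or below the passband) just depends on whether you tune the hardware above or below the wanted carrier.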

@N0YKG I would always expect a bit of feed through, but the level I’m seeing is excessive. More importantly, it’s much higher in v3.6 of the gateware than it is in v1.22.

For comparison, this is the gated spectrum with v1.22:

@IainCC thanks for testing and reporting results. I will investigate it further.

Could you check and post the value of the 0x000A FPGA register while the Amarisoft 5G stack is running? You can check it with LimeGUI → Modules → SPI → FPGA, or using the limeSPI CLI:
limeSPI read --chip=FPGA --stream=000A

There are two stages where RF switches are controlled, and both have to be set up for TDD mode to work:

  1. LimeSDR XTRX onboard RX and TX RF switches. In TDD mode, rf_sw_auto_en 0xA[11] has to be set to 1. This makes the RX switch toggle between the path selected in rx_rf_sw 0xA[3:2] when receiving and no connection when transmitting, while the TX switch toggles between the path selected in tx_rf_sw 0xA[4] when transmitting and the inverted/non-active path when receiving.
  2. LimeFEA mPCIe v1.0 RF switches. The LimeSDR XTRX provides an external TDD signal to the LimeFEA board, which is disabled by default, so tdd_auto_en 0xA[6] has to be set to 1 to enable it. By default this external TDD signal goes high when transmitting and low when only receiving. This logic does not match the LimeFEA mPCIe v1.0 control logic, so tdd_invert 0xA[7] also has to be set to 1 to invert the external TDD signal.
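The bit positions above can be collected into a small helper. This is only a sketch based on the field names and bit numbers quoted in this thread; the starting value 0x0819 is used as an example of a register with rf_sw_auto_en set but both external TDD bits clear:

```python
# Bit positions for the 0x000A FPGA register, as described in this thread.
RF_SW_AUTO_EN = 1 << 11   # onboard RX/TX switches follow the TDD state
TDD_INVERT    = 1 << 7    # invert external TDD signal (needed for LimeFEA mPCIe v1.0)
TDD_AUTO_EN   = 1 << 6    # drive the external TDD signal to the LimeFEA board
TX_RF_SW      = 1 << 4    # tx_rf_sw path select
RX_RF_SW_MASK = 0b11 << 2 # rx_rf_sw[3:2] path select

def enable_fea_tdd(reg: int) -> int:
    """Set the two bits the LimeFEA front end needs, keeping the rest of the register."""
    return reg | TDD_AUTO_EN | TDD_INVERT

# Example: a register reading 0x0819 has rf_sw_auto_en set, but the external
# TDD bits clear, so the LimeFEA switches would never toggle.
print(hex(enable_fea_tdd(0x0819)))
```

With both bits set on top of 0x0819 the result is 0x08D9 (setting tdd_auto_en alone would give 0x0859).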

Attaching register description for reference:

Also can you post how RF ports are wired between XTRX and LimeFEA board on your setup in TDD mode?

thanks,
Vytautas

I have a simple test bed (drawing attached) and an accompanying GNU Radio flow graph that may facilitate checking GW changes, by readily configuring both the RF connections and the data flows between two identical XTRX boards (v1.2) in LimeFEA Lite mPCIe v1.0 carriers and a LimeSDR-USB. The combinations XTRX-USB, XTRX(0)-XTRX(1), XTRX(0)-XTRX(0) and XTRX(1)-XTRX(1) are readily supported, although a single cable connection is needed to bypass the USB SDR for XTRX-XTRX tests. I am currently testing the XTRX boards with GW v1.22. I’m willing to be a guinea pig for new GW versions; however, I don’t have ready access to equipment to recover from a botched test with JTAG, so this offer only extends to a stable GW upgrade.

While running, 0x000A reads as 0x0819 on both GW versions. I did poke at the register a bit previously, and remember trying 0x0859, but I don’t remember the outcome or which version of the GW I was running at the time (although it wasn’t likely to be a big success, otherwise I would remember!). I don’t think I tried inverting (0x08D9).

The board is currently out of my PC while I make an external enclosure. I should have it back together tomorrow afternoon and can do any extra tests you want.

I did a quick rewire today with direct connections to the XTRX Tx/Rx to sanity-check the amplifier gains. I’m getting a little over 16 dB of gain from the amps, which seems a little on the low side for 2 GHz. Previously I had TX_A (XTRX) → TXin_A (FEA), LNAout_A (FEA) → RX_A (XTRX), and TRX_A (FEA) to an external SMA. I also have LNAin_A (FEA) connected to an external SMA. The image below shows the original wiring as described, which all the screenshots and testing so far were done with.

I really hate U.FL - I’m always afraid I’ll snap something or rip a connector off!

As a side note, the amount of low-level information needed to get TDD working on different SDRs/dev kits seems very high for such a common task. It would be great if this were abstracted out at a config level. If I could tell LimeSuiteNG “I’m using an XTRX in an FEA Full” and let it figure out all the low-level register values, that would make everything a lot easier to use. Failing that, some extra documentation and examples for each board would be really helpful.

Very much appreciate all your help so far!