I am using F5OEO’s Digital ATV software to generate a DVB-S TV signal from a Raspberry Pi 3B (Raspbian Stretch Lite) and a LimeSDR Mini (media=USB 2.0, module=FT601, addr=24607:1027, serial=1D3AC72FDA3062).
I have a build that I created a few months ago that uses LimeSuite 18.04.1, and I get a good waveform and good spectrum from it.
I have since tried a number of builds using LimeSuite 18.06.1 and the waveform is poor (bad MER) and the spectrum is awful.
I have updated the firmware and gateware on my Lime Mini and the problem remains.
Can anybody give me a clue as to where I should look next?
The F5OEO code that I am using is limetx.c on GitHub.
When I ran firmware version 18.06 on a LimeSDR-USB from a Raspberry Pi 3B running Lubuntu 16.04, I only saw the local oscillator (LO) output and not the signal. The LO-only behaviour persisted until I connected the LimeSDR-USB to a Windows USB 3.0 port.
I think it might have something to do with libusb-1.0-0-dev. I was getting better results with firmware version 18.04 and libusb-1.0-0-dev 1.0.22, but still not great.
[Update] But your question was about the LimeSDR-Mini. I checked the history on LimeSuite, and in gitrev 7908eea you can see that the firmware version of the LimeSDR-Mini changed from 24 to 26. As I stated earlier, I have had no success with LimeSDR-USB firmware revision 18.06, but I did have success with a brand new (delivered this morning) LimeSDR-Mini. Assuming that my LimeSDR-Mini has firmware version 26 (LimeSuiteGUI didn’t flag a version mismatch, even when I tried LimeSuiteGUI 18.04 and 18.06), I suggest running LimeUtil --update and then trying basicTX from the LimeSuite build folder. I tried to run limetx.c, but I only got errors (probably a bad input file).
Thanks for your suggestions, but no success.
Both builds are using libusb-1.0-0-dev:armhf Version 2:1.0.21-1, so no difference there. I then checked and swapped (and flashed) firmware. I get identical results with Version 24 and Version 26 (the old build works fine with either, the new build doesn’t work well with either).
limetx.c needs a transport stream input file to work properly, so it’s difficult to run on its own.
I note that the RF output is 6 dB higher on LimeSuite 18.06 than on 18.04 (at the same gain setting), so I am now investigating whether that is a clue.
I confirm that the change from 18.04 to 18.06 introduces this strange behaviour.
I suspect the inverse sinc digital filter, or something else in the digital filtering, has been changed between the two versions (judging mainly by the spectrum).
I have looked through the commits to see if anything is explicitly responsible, but nothing is obvious yet.
Have pinged @IgnasJ to see if we can pin this down and fix.
Do you observe the problem with release 18.06.0 or just with recent GitHub version (18.06.1)?
I do not observe the problem in release 18.06.0 (commit 1cb1723). So it is introduced somewhere between 13 June and 13 July. I am now working my way forward a commit at a time to find when the problem starts. Unless you have a better idea?
Edit: The above statement is incorrect as I was not overwriting all the old files with the new commit. Apologies!
See correction below
My testing is with latest commit (commit 373e26aba248ca38d4d49761eabf9be605c288a8).
It’s strange that Dave mentions he has a working system with it.
Let’s wait for feedback from Dave, who has surely modified something.
I was mistaken in my earlier post. Having now written a script that lets me move back and forth between commits (slowly, about 15 minutes each), I can confirm that the problem was not present in release 18.04.0 (commit d6bc28f, 6 April), but is present in commit 582e828 (22 May) and all later commits.
I’m working on finding the actual commit that caused the problem, but it will take time.
I have now confirmed that the problem was introduced in LimeSuite Commit ca7d657 on 22 May 2018. Commit 17c3e05 on 17 May works perfectly.
All tests conducted with firmware /18.04/LimeSDR-Mini_HW_1.1_r1.24.rpd as appropriate for these builds.
These are the changes in the commit: https://github.com/myriadrf/LimeSuite/commit/ca7d657e85e2766971a359fa77021b0295fed9e3
@IgnasJ - there are 7 changed files in this commit, but a number of the changes appear to be linked, and I’m way out of my comfort zone trying to partially implement the commit to troubleshoot any further. Any ideas?
Edit: Further information:
Of the 7 files changed in the troublesome commit, 2 have independent changes: LimeSDR_mini.cpp and lms7_api.cpp. I have implemented the changes to these files and I do not get the problem.
The other 5 files deal with changes to SetTBBIAMP_dB and opt_gain_tbb. It appears that these introduce the problem.
I haven’t done any testing, but I have one idea. Does the DVB application set the Tx gain at any point? I suspect that it doesn’t, and that the get-gain/set-gain combination in SetLPF() then sets the gain too high.
The DVB application runs this command:
LMS_SetNormalizedGain(device, LMS_CH_TX, 0, gain);
where gain is a float between 0.0 and 1.0. Is this the one?
What are the constraints on when it should be run? Could it be running in the wrong sequence relative to the other configuration calls?
Edit: Changing ‘gain’ does change the output level as I would expect, but does not change the (bad) waveform.
2nd Edit: I have changed the code to set the gain (using LMS_SetNormalizedGain) before the sample rate and filter settings, but I still have the same problem.
A fix is to remove the call "LMS_SetLPFBW(device, LMS_CH_TX, 0, m_Bandwidth);"
But @IgnasJ, please explain why calling this is not correct.