I am trying to modify the dualRXTX example that comes with LimeSuite; basically I am using the original code,
but rather than echoing back the received samples I am transmitting my own samples instead. Trouble is I can't
get it to work.
The Lime tunes to the right frequency, and when I call LMS_SendStream it returns the number of samples
I gave it. But nothing is visible on my spectrum analyser. Well, that is not quite true: I can see the
carrier leakage at the centre, but no modulation.
I was expecting LMS_SendStream to block when the FIFO is full if the timeout is set to zero, but all it
does is return immediately, reporting 0 samples sent.
It seems like everything is working except the DAC is not being updated. Anyone got any suggestions?
I am afraid that didn’t make a difference.
I did notice that setting timeout to 0 really does set the timeout to zero, and does not do what the header
file says it should, which is to set the timeout to infinity.
I have checked the values I am sending to the device, and they are valid. GetStreamStatus is reporting
sensible information and no LMS calls are returning error codes.
It is as if the first sample in a block goes to the DAC but nothing further. I am sending a complex
signal that should output a tone 1 MHz away from the carrier frequency.
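For what it's worth, the tone buffer is generated roughly like this (a minimal sketch; the function name, sample rate, and buffer length are placeholders for my actual settings):

```cpp
#include <cmath>
#include <complex>
#include <vector>

// Generate a complex exponential at toneHz offset from the carrier.
// After upconversion this appears as a single tone toneHz away from
// the tuned centre frequency.
std::vector<std::complex<float>> makeTone(double toneHz, double sampleRate, size_t count)
{
    const double kTwoPi = 6.283185307179586;
    const double step = kTwoPi * toneHz / sampleRate; // phase increment per sample
    std::vector<std::complex<float>> buf(count);
    for (size_t n = 0; n < count; ++n)
        buf[n] = std::complex<float>(static_cast<float>(std::cos(n * step)),
                                     static_cast<float>(std::sin(n * step)));
    return buf;
}
```

Every sample has unit magnitude, so if only the first sample reaches the DAC I would expect exactly what I am seeing: a DC-like spike at the carrier and no tone.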
The example appears to work so it is obviously something I am doing that is wrong.
I have the added complexity that I am compiling my code with NVIDIA's nvcc,
as I am working on a program that uses CUDA and GPUs to do the signal processing.
I recompiled the LimeSuite example using nvcc and it behaves the same way as the gcc
version does. I will have a further look at what I am doing before I post the code.
I also currently have the Lime calls in a thread, so I need to check that is not the problem.
Seems like the problem is down to flow control of LMS_SendStream.
The example relies on LMS_RecvStream and timestamps to flow-control
the sending; if the send queue overflows, it breaks.
The sampleRate field in the status message always returns 0 on both Tx and Rx.
So I am going to have to re-design my code to flow-control the transmitter.
Shame there is no transmit-only example; I suspect that is for the reason
I have just encountered.
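The idea I am going to try is to pace the send calls from the host side so the FIFO can never overflow. A rough sketch of the pacing logic (self-contained; the class name is mine, sampleRate would be whatever the stream is configured for, and the actual LMS_SendStream call would go after each pace()):

```cpp
#include <chrono>
#include <cstddef>
#include <cstdint>
#include <thread>

// Pace a producer to a fixed sample rate: before handing each buffer to
// the device, sleep until wall-clock time has caught up with the number
// of samples already submitted.
class SamplePacer {
public:
    explicit SamplePacer(double sampleRate)
        : rate(sampleRate), start(std::chrono::steady_clock::now()) {}

    // Block until it is time to submit the next 'count' samples.
    void pace(size_t count)
    {
        using namespace std::chrono;
        sent += count;
        // The wall-clock instant by which 'sent' samples should have
        // been consumed at 'rate' samples per second.
        auto due = start +
            duration_cast<steady_clock::duration>(duration<double>(sent / rate));
        std::this_thread::sleep_until(due);
    }

private:
    double rate;
    uint64_t sent = 0;
    std::chrono::steady_clock::time_point start;
};
```

This keeps the host slightly ahead of the device on average without ever racing far enough ahead to fill the FIFO.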
Hi Andrew,
Yes, I am getting somewhere. My application will require the USB3 interface to
operate at its maximum.
I have found that setting timeout to 0 in LMS_SendStream is not a good idea.
The header file seems to indicate that this should set the timeout to infinity. It seems
better to set it to a large value instead; setting it to zero increases signal dropouts.
I have also found that to get maximum throughput it is essential to set the
throughputVsLatency field in the stream configuration to 1.0.
I have tried increasing the value of the fifoSize field in the stream configuration, but
whatever value I set, GetStreamStatus always reports a fifoSize of 8192.
I think to get the maximum possible throughput I need a much larger fifo (I am not worried about
latency).
Basically I would like to run the board flat out in full duplex on a single channel.
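For the record, this is roughly how I am setting up the TX stream (a sketch using the lms_stream_t fields from LimeSuite; the fifoSize shown is just the value I am requesting, even though GetStreamStatus still reports 8192, and 'device' comes from LMS_Open):

```cpp
#include <lime/LimeSuite.h>

// Configure a single-channel TX stream for maximum throughput (sketch).
int setupTxStream(lms_device_t *device, lms_stream_t &tx_stream)
{
    tx_stream.channel = 0;                     // single channel
    tx_stream.isTx = true;
    tx_stream.fifoSize = 1024 * 1024;          // requested host-side FIFO, in samples
    tx_stream.throughputVsLatency = 1.0f;      // favour throughput over latency
    tx_stream.dataFmt = lms_stream_t::LMS_FMT_F32;
    return LMS_SetupStream(device, &tx_stream); // 0 on success
}
```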
Hi,
Looking at the FIFO code, I can see that there might be a bug that ignores the timeout and immediately overwrites the oldest data when the FIFO is full and synchronization to timestamps is enabled.
You can try disabling synchronization to timestamps by setting:
tx_metadata.waitForTimestamp = false;
Also, you are right about the incorrect description of timeout=0 in the header file. When the FIFO is full it doesn't block at all and returns with 0 samples written. Use a reasonably high timeout value for blocking instead.
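For example, something along these lines (a sketch; the function name is mine, and the buffer and sample count are whatever your application uses):

```cpp
#include <lime/LimeSuite.h>

// Send one buffer without timestamp synchronization, blocking when
// the FIFO is full instead of returning immediately.
int sendBuffer(lms_stream_t &tx_stream, const float *iq, size_t sampleCount)
{
    lms_stream_meta_t tx_metadata;
    tx_metadata.timestamp = 0;
    tx_metadata.waitForTimestamp = false;   // do not synchronize to timestamps
    tx_metadata.flushPartialPacket = false;

    // timeout_ms = 0 returns immediately when the FIFO is full;
    // a generous nonzero timeout gives blocking behaviour instead.
    return LMS_SendStream(&tx_stream, iq, sampleCount, &tx_metadata, 1000);
}
```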