Transmit Samples Flow Control

Will the LimeSDR Mini v2 take transmit samples at a faster rate than needed? Or does the application itself need to make sure it is only writing samples at the rate that it has configured the Mini v2 to transmit at?

M-

The device itself has two options for transmitting samples: as soon as possible, or synchronized to timestamps.
If I remember correctly, the hardware can buffer up to 16KB of data, which, depending on the data transfer format (16-bit or 12-bit values), gives 4080 or 5440 complex samples of buffering (one sample is an I and Q value pair; the counts come out slightly below the raw 16384/4 or 16384/3, presumably because each transfer packet also carries a small header).
Without timestamps: once the samples are received by the hardware, they are buffered and transmitted sequentially as soon as possible. If the samples are sent to the hardware slower than the sample rate, there will be gaps between packet transmissions. If the samples are sent to the hardware faster than the sample rate, the data is buffered and the device will transmit all of it sequentially. Once the hardware buffers are full, the USB data transfer will wait until there is space available in the FIFO buffer.
With timestamps: each packet has a designated time at which it has to be transmitted. The hardware will buffer up to 16KB of data and wait until the designated packet time to actually transmit it. If the hardware receives a packet that is already late compared to its designated timestamp, it will drop the entire packet to be able to catch up to real time (when using timestamps, the hardware can indicate whether there were any late packets).
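
For illustration, here is a minimal sketch of driving both modes through the LimeSuite LMS API. LMS_SendStream() and the lms_stream_meta_t fields are from LimeSuite.h; the helper function, its parameters, and the timeout value are hypothetical, and the stream is assumed to be already set up and started:

    #include <lime/LimeSuite.h>
    #include <cstddef>
    #include <cstdint>

    // Sketch: push one buffer of interleaved I/Q samples, either as soon as
    // possible or held back until a hardware timestamp. Assumes `stream` was
    // configured with LMS_SetupStream() and started with LMS_StartStream().
    void send_buffer(lms_stream_t* stream, const int16_t* iq, size_t sampleCount,
                     bool useTimestamp, uint64_t txTime)
    {
        lms_stream_meta_t meta = {};
        meta.waitForTimestamp = useTimestamp; // false: transmit ASAP; true: wait for `timestamp`
        meta.timestamp = txTime;              // ignored when waitForTimestamp is false
        meta.flushPartialPacket = false;      // keep packing samples into full packets

        // Blocks for up to the timeout when the host/hardware FIFOs are full.
        LMS_SendStream(stream, iq, sampleCount, &meta, 1000 /* ms */);
    }

In the timestamped case, packets dropped for being late show up in the droppedPackets counter reported by LMS_GetStreamStatus().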

On the software side, LimeSuite buffers a lot more data, so it will accept samples at a much faster rate than the set sampling rate until the software FIFOs are full, and will transfer the data to the device when it is ready to accept it.
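
From the application's point of view, that means the send call itself provides the flow control. A minimal sketch, assuming an already-configured stream (LMS_SendStream() and LMS_GetStreamStatus() are LMS API calls; the loop and the status printout are illustrative):

    #include <lime/LimeSuite.h>
    #include <cstddef>
    #include <cstdint>
    #include <cstdio>

    // Sketch: no explicit rate pacing on the producer side. LMS_SendStream()
    // blocks (up to its timeout) once the software FIFO is full, so the call
    // itself throttles the loop. The status polling is only for monitoring.
    void tx_loop(lms_stream_t* stream, const int16_t* iq, size_t samplesPerCall)
    {
        for (;;)
        {
            int sent = LMS_SendStream(stream, iq, samplesPerCall, nullptr, 1000);
            if (sent < 0)
                break; // stream error

            lms_stream_status_t status;
            if (LMS_GetStreamStatus(stream, &status) == 0)
                std::printf("FIFO %u/%u, dropped packets: %u\n",
                            status.fifoFilledCount, status.fifoSize,
                            status.droppedPackets);
        }
    }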

So I am generating 16-bit I and 16-bit Q samples. The LimeSDR is set to 12-bit sampling. I assume that is OK.

So the transmit side will ‘push back’ when too many samples come down, effectively throttling the transmit side of my software.

M-

Yes, you are correct.
The conversion between 16-bit samples on the host side and the 12-bit transfer format is done by LimeSuite internally. Since you’re concerned about host-side CPU performance, it might be more efficient to use the 16-bit transfer format; that way, on the host side it is just a simple memcpy operation. When you use the 12-bit transfer format, the CPU has to do the bit-shifting conversion/compression, so the CPU does more work to process the data, but the transfer data rate is lower.
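
As a sketch of where that choice is made, assuming the LMS API: the dataFmt field of lms_stream_t selects the format when the stream is set up. The fifoSize and throughputVsLatency values below are arbitrary placeholders, and my understanding is that LMS_FMT_I16 maps the host’s 16-bit samples straight onto the wire while LMS_FMT_I12 packs them to 12 bits:

    #include <lime/LimeSuite.h>

    // Sketch: choose the transfer format at stream setup. `device` is assumed
    // to come from LMS_Open()/LMS_Init(). LMS_FMT_I16: host samples go out
    // unmodified (essentially a memcpy, ~33% higher USB data rate).
    // LMS_FMT_I12: LimeSuite bit-shifts each sample down to 12 bits, saving
    // USB bandwidth at the cost of extra CPU work.
    lms_stream_t make_tx_stream(lms_device_t* device, bool preferLowCpu)
    {
        lms_stream_t stream = {};
        stream.isTx = true;
        stream.channel = 0;
        stream.fifoSize = 1024 * 1024;     // host-side FIFO size, in samples
        stream.throughputVsLatency = 0.5f; // placeholder mid-point
        stream.dataFmt = preferLowCpu ? lms_stream_t::LMS_FMT_I16
                                      : lms_stream_t::LMS_FMT_I12;
        LMS_SetupStream(device, &stream);  // error handling omitted
        return stream;
    }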