minLatency() in osmo-trx


Harald Welte laforge at gnumonks.org
Mon Apr 8 06:51:22 UTC 2019


Dear all,

the following question popped up on IRC:

22:08 < roh> is there some documentation how the minLatency() call in osmo-trx works?
22:10 < roh> it looks to me like the values we use for usrp1 and lms are not correct/cargocult

I agree, the values are likely wrong for the non-UHD devices.  

Interestingly, they are auto-tuned at runtime; see the following piece of code in Transceiver.cpp:

          // if underrun hasn't occurred in the last sec (216 frames) drop
          //    transmit latency by a timeslot
          if (mTransmitLatency > mRadioInterface->minLatency()) {
            if (radioClock->get() > mLatencyUpdateTime + GSM::Time(216,0)) {
              mTransmitLatency.decTN();
              LOG(INFO) << "reduced latency: " << mTransmitLatency;
              mLatencyUpdateTime = radioClock->get();
            }
          }

However, that block only applies to devices with TX_WINDOW_USRP1 set, that is,
the USRP1, B100 and B2xx devices.
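
For reference, the surrounding logic looks roughly like the sketch below.  This
is paraphrased from memory and is not a verbatim quote; in particular the names
device->getWindowType(), device->isUnderrun(), pushRadioVector() and
mTransmitDeadlineClock should be double-checked against the actual
Transceiver.cpp.  The point is merely to show where the quoted block sits and
that there is a complementary branch which *increases* the latency after an
underrun:

          // sketch only, not verbatim osmo-trx code
          while (radioClock->get() + mTransmitLatency > mTransmitDeadlineClock) {
            if (device->getWindowType() == RadioDevice::TX_WINDOW_USRP1) {
              if (device->isUnderrun()) {
                // underrun: bursts are not reaching the device fast enough,
                // so grow the transmit latency by one GSM frame
                mTransmitLatency = mTransmitLatency + GSM::Time(1,0);
                mLatencyUpdateTime = radioClock->get();
              } else if (mTransmitLatency > mRadioInterface->minLatency()) {
                // no underrun for ~1s (216 frames): shrink the latency again,
                // but never below the device-provided floor minLatency()
                if (radioClock->get() > mLatencyUpdateTime + GSM::Time(216,0)) {
                  mTransmitLatency.decTN();
                  mLatencyUpdateTime = radioClock->get();
                }
              }
            }
            pushRadioVector(mTransmitDeadlineClock);  // emit the next burst
            mTransmitDeadlineClock.incTN();
          }

So minLatency() only ever acts as the lower bound for that shrinking step.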

In fact, I cannot find any caller of the minLatency() method outside the context
of TX_WINDOW_USRP1, and hence I wonder whether it matters at all what kind of
magic value the LMS driver supplies.
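
Just to illustrate what I mean by "magic value" (this is *not* the actual
LMSDevice code, the number below is made up): the per-device minLatency()
implementations are essentially one-line constants along these lines, and on
LMS that constant is never consulted by the code path above:

          // hypothetical example only, not the real LMSDevice.cpp
          GSM::Time LMSDevice::minLatency()
          {
            // made-up constant: 6 frames + 7 timeslots of minimum TX latency
            return GSM::Time(6,7);
          }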

So at least I conclude:
* minLatency() is only ever used on USRP1, B100 and B2xx; on those platforms
  the transmit latency is adjusted dynamically at runtime, with minLatency()
  only serving as the lower bound

Regards,
	Harald
-- 
- Harald Welte <laforge at gnumonks.org>           http://laforge.gnumonks.org/
============================================================================
"Privacy in residential applications is a desirable marketing option."
                                                  (ETSI EN 300 175-7 Ch. A6)


