Dear Osmocom community,
Over the past several months I've been working almost exclusively on
improving FR1 and EFR speech handling in the Osmocom GSM network
implementation. All of my Gerrit patches since March have been in
this area, and my two Themyscira-branded public domain libraries for
GSM codecs are also primarily intended for use together with Osmocom,
specifically for the implementation of transcoding media gateways that
interconnect an Osmocom GSM network with a non-GSM outside world such
as the G.711 PSTN.
Given the knowledge I've gained over months of working in this area,
and seeing that many other Osmocom developers aren't particularly
familiar with these aspects of the specs (understandable: GSM is huge,
one can't keep everything in one's head), I would like to do an OsmoDevCall
presentation on the topic of GSM speech handling with traditional
non-AMR codecs. I would like to cover the following subtopics:
* What metadata bits (BFI, UFI, SID, TAF) are defined in the specs for
transport of encoded speech between network elements, beyond the
familiar speech codec bits themselves.
* What exactly are regular speech frames, SID frames, silence frames
(a "silence frame" for the FR1 codec is NOT the same thing as a SID frame!)
and bad frame gaps, and which of these categories are allowed or not
allowed to exist at each of the interfaces in the spec-defined GSM
architecture.
* Which transformations are supposed to happen where: which network
elements are responsible for bad frame handling, error concealment,
comfort noise insertion or SID propagation.
* How these architectural principles, originally defined for the T1/E1
environment with TRAUs, can be carried over to an RTP environment.
* Relevant Osmocom components: OsmoBTS and the aspect of OsmoMGW that
interfaces from RTP to T1/E1 Abis.
* What behavior changes have been effected by my patches to OsmoBTS and
supporting libraries that have already been merged, and which behavior
changes are still on my wish list or to-do list to implement and
hopefully get merged.
Looking at the OsmoDevCall wiki page, I see absolutely nothing
scheduled past May, and there was no OsmoDevCall in April - are we out
of presenters? There is just one problem: it seems that some
people in the senior leadership of the Osmocom organization don't want me
presenting on OsmoDevCall, and recently even asked specifically for
presentation ideas from "anyone other than Mychaela". I see two
possible solutions to this problem:
Option 1: If the leaders in question could set aside their personal
dislike of me and allow me to present on highly Osmocom-relevant
topics (such as the FR/HR/EFR codec presentation proposal above) no
different from other Osmocom developers, that would be the best
solution.
Option 2: If those who control the scheduling of presentations on the
official OsmoDevCall platform (the official BigBlueButton instance for
ODC) are not willing to budge, the alternative will be for me and Das
Signal (my dear friend and FreeCalypso sysadmin) to set up our own BBB
instance on our own server, configure it to look and feel exactly like
the official one used for ODC, hold presentations there during those
months when no official ODC presentation takes place due to lack of
non-excluded willing presenters, and invite everyone from Osmocom to
join those unofficial ODC-like presentations.
So - which of the two is it going to be?
Sincerely,
Mother Mychaela,
operator of a non-profit GSM network based on Osmocom,
contributing to Osmocom CNI development in conjunction with that
network operation.
Hello GSM community,
I just put out a new release of the Themyscira GSM codec libraries and
utilities package:
ftp://ftp.freecalypso.org/pub/GSM/codecs/gsm-codec-lib-r2.tar.bz2
ftp://ftp.freecalypso.org/pub/GSM/codecs/gsm-codec-lib-latest.tar.bz2
(symlink)
The two libraries in this package (libgsmefr and libgsmfrp) are
intended for people who develop gateway software interconnecting
Osmocom-based GSM networks to PSTN or other networks, gateways which
include a speech transcoding function that terminates the GSM codec
leg.
If anyone is currently interconnecting an Osmocom GSM voice network to
the outside world using software you did not write yourself (Asterisk,
FreeSWITCH, Kamailio, whatever), and you care about the plain old FR
codec and/or about EFR, beyond just AMR, I encourage you to investigate
that non-Osmocom gateway software to see exactly
how it implements FR and EFR. Because there were NO pre-existing FOSS
libraries that correctly implement FR and EFR decoding prior to my
Themyscira gsm-codec-lib development, most pre-existing gateway
software probably implements these codecs in a flawed manner:
FR codec: Everyone to my knowledge implements this codec using classic
libgsm, a library that dates back to the 1990s. It's a good library and
a fully correct implementation of the GSM 06.10 spec, and I use it too.
However, it implements _only_ a bare 06.10 encoder and a bare 06.10
decoder, without any DTX functions of GSM 06.31 and related specs. In
the encoder direction, having no DTX isn't really a problem (you won't
be able to do DTXd anyway unless you have lots of spectrum and are
running multi-ARFCN cells), but the lack of an Rx DTX handler per GSM
06.31 *is* a real problem: if you feed the uplink from a GSM call (RTP
stream from a BTS) to a bare GSM 06.10 decoder such as the gsm_decode()
function in libgsm, you won't get correct handling of SID frames,
which every standard GSM MS will transmit, and you won't get correct
handling of BFI frame gaps, which will always occur. The correct
solution is to insert a call to a GSM 06.31 Rx DTX handler (it is more
than an ECU) just before the call to gsm_decode(), and my libgsmfrp
offering is that GSM 06.31 Rx DTX handler.
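To make the intended call flow concrete, here is a rough sketch of the
per-frame processing on the receiving side. The libgsm calls are the
real classic-libgsm API; the Rx DTX preprocessor type and function
below are only placeholders standing in for libgsmfrp, not its actual
API, so please consult the library's own header for the real names:

#include <gsm.h>    /* classic libgsm; on some systems <gsm/gsm.h> */

/* Placeholder declarations standing in for the GSM 06.31 Rx DTX handler
 * provided by libgsmfrp - the real type and function names differ. */
struct fr_rx_dtx_state;
void fr_rx_dtx_preproc(struct fr_rx_dtx_state *st, int bfi,
                       const gsm_byte *frame_in, gsm_byte *frame_out);

/* Decode one 20 ms FR uplink frame as received in RTP from the BTS:
 * the Rx DTX handler turns SID frames and BFI gaps into something the
 * bare 06.10 decoder can consume (comfort noise parameters, repeated
 * or muted speech), and only then do we run the actual decoder. */
static void decode_fr_uplink(struct fr_rx_dtx_state *dtx, gsm decoder,
                             const gsm_byte *rtp_payload, int payload_ok,
                             gsm_signal pcm_out[160])
{
    gsm_byte frame[33];

    fr_rx_dtx_preproc(dtx, !payload_ok, rtp_payload, frame);
    gsm_decode(decoder, frame, pcm_out);
}

The decoder handle comes from gsm_create(), and payload_ok would be
derived from RTP reception: packet present, 33 bytes long, with the 0xD
signature nibble in the first 4 bits.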
EFR codec: Everyone other than me implements EFR (if they support it
at all) using an AMR library such as libopencore-amrnb. I have seen
totally broken implementations that schlep 244-bit payloads directly
between supposed-to-be-EFR RTP and the AMR library, without reordering
those bits per gsm690_12_2_bitorder[] - those implementations have
exactly zero chance of ever actually working with a real GSM-EFR MS on
the other end - and I've also seen implementations that do perform this
bit reordering and are thus closer to correct. But even the latter
implementations are still wrong when it comes to SID handling: EFR is
equivalent to the highest MR122 mode of AMR only for regular speech
frames, but not for SID. There does exist a special encoding format for
representing GSM-EFR SID in AMR frame interfaces, but libopencore-amrnb
does not support GSM-EFR SID in any way at all. If you take the uplink
from a GSM-EFR call and feed it to the libopencore-amrnb decoder, any
time the GSM MS emits a SID frame, strange noises will appear at the
output of that decoder, instead of the correct comfort noise.
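For those who want to audit their gateway, the speech-frame path of a
correct EFR-over-AMR-decoder implementation looks roughly like the
sketch below. The opencore-amrnb calls are the library's real
interface; the reorder table is only declared, not filled in (take the
244 entries from the actual gsm690_12_2_bitorder[] definition), and the
direction of the mapping is my assumption that you must verify against
that table. EFR SID frames have to be caught before this point, for the
reasons just explained.

#include <string.h>
#include <opencore-amrnb/interf_dec.h>

/* Mapping between EFR payload bit order and AMR 12.2 d-bit order.
 * The 244 entries are elided here; fill them in from the actual
 * gsm690_12_2_bitorder[] table.  Assumed direction: AMR bit i is
 * taken from EFR bit efr_to_amr_bitorder[i] - verify this! */
extern const unsigned short efr_to_amr_bitorder[244];

static int get_bit(const unsigned char *buf, int idx)
{
    return (buf[idx >> 3] >> (7 - (idx & 7))) & 1;
}

static void set_bit(unsigned char *buf, int idx, int val)
{
    if (val)
        buf[idx >> 3] |= 0x80 >> (idx & 7);
}

/* Convert one 31-byte GSM-EFR RTP payload (0xC signature nibble
 * followed by 244 speech bits) into an AMR 12.2 MIME-format frame
 * and decode it.  Valid for regular speech frames only, NOT for SID. */
static void decode_efr_speech_frame(void *amr_dec_state,
                                    const unsigned char efr_payload[31],
                                    short pcm_out[160])
{
    unsigned char amr_frame[32];
    int i;

    memset(amr_frame, 0, sizeof(amr_frame));
    amr_frame[0] = 0x3C;    /* FT=7 (12.2 kbit/s), quality bit set */
    for (i = 0; i < 244; i++)
        set_bit(amr_frame + 1, i,
                get_bit(efr_payload, 4 + efr_to_amr_bitorder[i]));
    Decoder_Interface_Decode(amr_dec_state, amr_frame, pcm_out, 0);
}

Here amr_dec_state comes from Decoder_Interface_init(); whether your
gateway does anything equivalent to the reordering loop above is
exactly the thing worth checking.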
Themyscira libgsmefr is a proper encoder and decoder library for EFR,
based on the EFR reference code from ETSI, in exactly the same way that
libopencore-amrnb is based on the AMR reference code from ETSI/3GPP.
It still has some performance problems which I will be working on later
(the goal of getting it to perform no worse than libopencore-amrnb has
not been achieved yet), but at least it is correct.
Hasta la Victoria, Siempre,
Mychaela aka The Mother
Dear Harald,
A long time has passed since I last worked with the Nokia Site family
and OpenBSC. I managed to save an UltraSite cabinet from scrap, so I am
trying to revive it for a museum.
On the old NITB versions I managed to make this work once; now I am
trying with the new (at least to me) Osmo-BSC implementation.
To keep it simple, only one TRX is configured:
OML <--> E1 TS 1 (64kbit)
RSL <--> E1 TS 2 (64kbit)
TRXSIG <--> E1 TS 3 and 4
DAHDI is used with a Digium Wildcard TE110P T1/E1 Board.
Osmo-BSC is able to do the OML bootstrap, but the RSL waits for LAPD endlessly.
My first question is: should Osmo-BSC be able to bootstrap the BTS
fully (all the way to "on air" mode) if it is not (yet) connected to
any other core element (MGW, MSC, STP)?
This is the Osmo-BSC log (after the NOKIA_BTS_RESET command, once the
reset_wait_time has passed):
DLLAPD input/lapd.c:245 (0:1-T1-S62): LAPD Allocating SAP for SAPI=62
/ TEI=1 (dl=0x56284cfbd220, sap=0x56284cfbd200)
DLLAPD input/lapd.c:255 (0:1-T1-S62): k=1 N200=3 N201=260 T200=1.0 T203=10.0
DLLAPD input/lapd.c:519 (0:1-T1-S62): LAPD DL-ESTABLISH request TEI=1 SAPI=62
DLLAPD input/lapd.c:654 (0:1-T1-S62) LAPD DL-ESTABLISH confirm TEI=1 SAPI=62
DNM bts_nokia_site.c:63 (bts=0) bootstrapping OML
DNM bts_nokia_site.c:1729 (bts=0) Rx ABIS_OM_MDISC_FOM
DNM bts_nokia_site.c:1573 (bts=0) Rx (0x82) NOKIA_BTS_OMU_STARTED
DNM bts_nokia_site.c:1583 (bts=0) Rx BTS type = 17 (UltraSite GSM 900)
DNM bts_nokia_site.c:1098 (bts=0) Sending NOKIA_BTS_START_DOWNLOAD_REQ
DNM bts_nokia_site.c:1729 (bts=0) Rx ABIS_OM_MDISC_FOM
DNM bts_nokia_site.c:1573 (bts=0) Rx (0x84) NOKIA_BTS_MF_REQ
DNM bts_nokia_site.c:1729 (bts=0) Rx ABIS_OM_MDISC_FOM
DNM bts_nokia_site.c:1573 (bts=0) Rx (0x88) NOKIA_BTS_CONF_REQ
DNM bts_nokia_site.c:1098 (bts=0) Sending NOKIA_BTS_ACK
DNM bts_nokia_site.c:1260 (bts=0) Sending multi-segment 0
DNM bts_nokia_site.c:1260 (bts=0) Sending multi-segment 1
DNM bts_nokia_site.c:1729 (bts=0) Rx ABIS_OM_MDISC_FOM
DNM bts_nokia_site.c:1573 (bts=0) Rx (0x81) NOKIA_BTS_ACK
DNM bts_nokia_site.c:1604 (bts=0) Rx ACK = 1
DLLAPD input/lapd.c:245 (0:2-T1-S0): LAPD Allocating SAP for SAPI=0 /
TEI=1 (dl=0x56284d252a20, sap=0x56284d252a00)
DLLAPD input/lapd.c:255 (0:2-T1-S0): k=2 N200=3 N201=260 T200=1.0 T203=10.0
DLLAPD input/lapd.c:519 (0:2-T1-S0): LAPD DL-ESTABLISH request TEI=1 SAPI=0
DLLAPD lapd_core.c:421 (0:2-T1-S0) sending MDL-ERROR-IND cause 1 from
state LAPD_STATE_IDLE
DLLAPD input/lapd.c:658 (0:2-T1-S0) LAPD DL-RELEASE indication TEI=1 SAPI=0
DLLAPD input/lapd.c:282 (0:2-T1-S0): LAPD Freeing SAP for SAPI=0 /
TEI=1 (dl=0x56284d252a20, sap=0x56284d252a00)
DCHAN lchan_fsm.c:1779
lchan(0-0-0-CCCH_SDCCH4-0)[0x56284d251770]{UNUSED}: (type=NONE) lchan
allocation failed in state UNUSED: LCHAN_EV_TS_ERROR
DCHAN lchan_fsm.c:197
lchan(0-0-0-CCCH_SDCCH4-0)[0x56284d251770]{UNUSED}: (type=NONE) lchan
activation failed (lchan allocation failed in state UNUSED:
LCHAN_EV_TS_ERROR)
DCHAN lchan_fsm.c:1779
lchan(0-0-0-CCCH_SDCCH4-1)[0x56284d2519b0]{UNUSED}: (type=NONE) lchan
allocation failed in state UNUSED: LCHAN_EV_TS_ERROR
DCHAN lchan_fsm.c:197
lchan(0-0-0-CCCH_SDCCH4-1)[0x56284d2519b0]{UNUSED}: (type=NONE) lchan
activation failed (lchan allocation failed in state UNUSED:
LCHAN_EV_TS_ERROR)
DCHAN lchan_fsm.c:1779
lchan(0-0-0-CCCH_SDCCH4-2)[0x56284d251bf0]{UNUSED}: (type=NONE) lchan
allocation failed in state UNUSED: LCHAN_EV_TS_ERROR
DCHAN lchan_fsm.c:197
lchan(0-0-0-CCCH_SDCCH4-2)[0x56284d251bf0]{UNUSED}: (type=NONE) lchan
activation failed (lchan allocation failed in state UNUSED:
LCHAN_EV_TS_ERROR)
DCHAN lchan_fsm.c:1779
lchan(0-0-0-CCCH_SDCCH4-3)[0x56284d251e30]{UNUSED}: (type=NONE) lchan
allocation failed in state UNUSED: LCHAN_EV_TS_ERROR
DCHAN lchan_fsm.c:197
lchan(0-0-0-CCCH_SDCCH4-3)[0x56284d251e30]{UNUSED}: (type=NONE) lchan
activation failed (lchan allocation failed in state UNUSED:
LCHAN_EV_TS_ERROR)
It would be nice to get this old beast running again.
I would much appreciate any and all help.
Regards,
Csaba
Hello Osmocom,
I know a lot of people here have salvaged T1/E1 BTS equipment from
Nokia, Ericsson etc. But what about the next level up - has anyone
been able to salvage a classic T1/E1 BSC that goes with those BTSes?
And given the hardware, does anyone in our community know how to get
one of those beasts working?
I am interested in the TRAU component of the classic GSM BSS
architecture, and I would really love to lay my hands (remotely, via
OCTOI, would be just fine) on one of those beauties. Specifically, I
seek to feed custom-crafted bits to the TRAU's Abis input and capture
what it puts out on the A interface G.711 side, and vice-versa.
What can be learned from such experiments? Several things:
* I would love to play with TFO: see the TFO_REQ in-band signaling
messages the TRAU should put out on its own during the first 5 s or
so, then send our own TFO_REQ and TFO_ACK to the TRAU, do the whole
protocol, and get the TRAU to actually enter TFO mode. Reading the
spec is one thing, but seeing it in action would be so much more fun!
I've also been wanting to write my own FOSS implementation of in-band
TFO within G.711 RTP, but it would be an impractical task without
having some other existing implementation to test against; a rough
sketch of the bit-capture side of such a tool follows this list.
* If we can get TFO to work, we'll be able to see exactly how real
TRAUs handled the onerous requirements of TS 28.062 section C.3.2.1.1.
Implementing those rules for FR1 would be quite easy, but try doing
the same for EFR or HR1 - *very* daunting! It would be lovely to see
exactly what actual historical implementations did here.
* Outside of TFO, we should be able to get the TRAU into a known state
by feeding it spec-defined encoder and decoder homing frames, and then
craft our own test sequences (beyond the standard ones it was surely
tested with by its designers) to exercise those parts of the codec
implementation where the specs allow implementors to innovate,
particularly everything to do with error concealment.
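Here is the rough bit-capture sketch mentioned in the first bullet
above. It rests on my reading of TS 28.062 that TFO messages ride in
the least significant bit of every 16th PCM sample; the phase of that
16-sample grid is not known a priori, so a real tool would try all 16
offsets and look for the TFO message sync pattern, and turning the
recovered bits into actual TFO_REQ / TFO_ACK messages is a further step
not shown here.

#include <stddef.h>
#include <stdint.h>

/* Recover the embedded TFO signalling bit stream from a block of
 * captured A-interface G.711 octets.  Assumption (per my reading of
 * TS 28.062): one signalling bit in the LSB of every 16th sample.
 * 'phase' selects which of the 16 possible positions to sample. */
static size_t extract_tfo_bits(const uint8_t *pcm, size_t num_samples,
                               unsigned phase, uint8_t *bits_out,
                               size_t max_bits)
{
    size_t i, n = 0;

    for (i = phase; i < num_samples && n < max_bits; i += 16)
        bits_out[n++] = pcm[i] & 1;
    return n;
}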
But doing all of the above requires access to some old-style T1/E1 BSC
that contains such a TRAU. Does anyone in our community have access
to such hardware?
M~
Should I change the way I do private branches in Osmocom?
I push a lot of private branches everywhere. I was asked in PM if I could cut
down on branches a bit because it clutters other developers' view of the git
history. My immediate response was: the other developer should simply not fetch
my branches, or invoke tig or gitk in a way that shows only selected branches.
But I reflected a bit and would like to ask generally how we want to do it.
For Osmocom, it apparently is mostly me pushing a lot of private
branches. What if we all did that...
In Linux kernel development it seems to be more like each developer has
her own public repository to make a mess in.
So, I could make git clones of our main repositories in Gitea and keep
my private branches there. It seems like maybe I should do that out of
common courtesy.
But it also adds a bunch of overhead for me, keeping separate repositories
synced. Having multiple remotes affects git command-line behavior. I used to
have separate fetch/push URLs for a while, but it was annoying in some ways.
I can change my ways, but only if I really have to.
Any opinions? Are my branches annoying?
Aspects:
- backup of my ongoing work. (daily)
- offering preliminary work to customers for manual build. (weekly)
- seeing what others are up to. (rare but happens)
- limiting branch clutter. (all the time for everyone)
thanks!
~N