Thanks Harald,
On 2019-01-13 10:24, Harald Welte wrote:
> Hi Gullik,
>
> On Sat, Jan 12, 2019 at 08:22:05PM +0100, Gullik Webjorn wrote:
>> Below is a measurement on my "most distant" BTS, BTS 1. From what I
>> see, downlink is 35 dB
> What do you mean by "downlink is 35dB"?
This is what osmocom thinks, if I understand correctly:
110 - 74 = 36 (OK, I was careless in subtracting), i.e. osmocom thinks
the level it sees on the uplink is -74 dBm, and the mobile reports a
downlink signal of -110 dBm. Right?
RXL-FULL-dl: -110 dBm, RXL-SUB-dl: -110 dBm RXQ-FULL-dl: 5, RXQ-SUB-dl: 5
RXL-FULL-ul: -74 dBm, RXL-SUB-ul: -73 dBm RXQ-FULL-ul: 3, RXQ-SUB-ul: 1
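
For my own understanding: the dBm values above come straight from the
raw RXLEV fields, which per the 3GPP TS 45.008 mapping cover roughly
-110 to -47 dBm in 1 dB steps. A small sketch of how I read them (the
function name is my own):

# Sketch: convert a raw GSM RXLEV (0..63) to an approximate level in
# dBm, per the 3GPP TS 45.008 mapping; the endpoints really mean
# "<= -110 dBm" (RXLEV 0) and ">= -48 dBm" (RXLEV 63).
def rxlev_to_dbm(rxlev: int) -> int:
    assert 0 <= rxlev <= 63
    return rxlev - 110

print(rxlev_to_dbm(36))  # -> -74, the uplink level the BTS sees
print(rxlev_to_dbm(0))   # -> -110, the downlink level the MS reports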
>> below uplink, which of course is natural when the MS can produce
>> 30 dBm, and the BTS, which in this case uses a bareback LimeSDR Mini,
>> can only produce 10 dBm.
Given that the mobile *can* produce up to 30 dBm, and the LimeSDR *can*
produce 10 dBm, the asymmetry makes sense. But 36 dB = a factor of 4000!
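
Just to spell out that arithmetic, the 36 dB difference as a linear
power ratio:

# A 36 dB level difference expressed as a linear power ratio:
ratio = 10 ** (36 / 10)
print(round(ratio))  # -> 3981, i.e. roughly a factor of 4000 in power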
> If you are using general-purpose SDR hardware, you cannot expect that
> any of the signal levels written anywhere actually mean anything at
> all. There are no absolute levels reported by SDR hardware anywhere,
> and there is no calibration of either the transmitter or the receiver.
> This means:
> * there's no general calibration curve for the chip/board
> * there's no per-unit individual calibration curve for the unit you have
> This is *very* different from a real GSM base station. Without
> designing a calibration procedure for the above, as well as some
> mechanism to apply it in production, I don't think one can expect any
> of the readings to state anything realistic, nor expect any power
> control loops to operate.
>
> Please don't get me wrong: general-purpose SDRs such as USRPs or the
> LimeSDR are great tools for experiments in the lab. But that's what
> they are, at least for the time being. There is a *big* difference
> between a real-world base station and a GP-SDR board.
Yes, I do understand this, but I thought that there *was* at least a
power loop, where osmo-bsc tells the mobile to increase or decrease its
power based on the received uplink level.
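
Something along these lines is what I imagined; a minimal sketch of
measurement-based power control, with a made-up target level and step
logic, not the actual osmo-bts/osmo-bsc algorithm:

# Minimal sketch of measurement-based MS power control: compare the
# averaged uplink level against a target and step the commanded MS
# power in 2 dB increments, clamped to the configured maximum.
# The target value here is invented for illustration.
TARGET_UL_DBM = -75
STEP_DB = 2             # GSM MS power is commanded in 2 dB steps
MS_PWR_MIN_DBM = 5
MS_PWR_MAX_DBM = 15     # "ms max power 15" from my config

def next_ms_power(avg_ul_dbm: float, cur_pwr_dbm: int) -> int:
    error = TARGET_UL_DBM - avg_ul_dbm
    if error > STEP_DB / 2:       # uplink too weak: raise MS power
        cur_pwr_dbm += STEP_DB
    elif error < -STEP_DB / 2:    # uplink too strong: lower MS power
        cur_pwr_dbm -= STEP_DB
    return max(MS_PWR_MIN_DBM, min(MS_PWR_MAX_DBM, cur_pwr_dbm))

print(next_ms_power(-74.0, 16))  # -> 15: near target, clamped to the max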
For a particular SDR model, it should be possible to have a rough
calibration based on the similarity of devices. Also, manual attenuation
to bring levels within a reasonable range can be done with SDR gain
adjustment or with attenuator pads.
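
What I have in mind is nothing fancier than this kind of per-unit
correction; a sketch with made-up offsets that one would measure once
against a signal generator or power meter:

# Sketch of a crude per-unit RX calibration: offsets (in dB) between
# the SDR's raw reading and a reference level measured with a signal
# generator, interpolated over frequency. All numbers are invented.
CAL_POINTS = [          # (frequency in MHz, offset in dB)
    (800.0, 41.0),
    (900.0, 43.5),
    (1800.0, 47.0),
]

def calibrated_dbm(raw_reading_db: float, freq_mhz: float) -> float:
    pts = sorted(CAL_POINTS)
    if freq_mhz <= pts[0][0]:
        off = pts[0][1]
    elif freq_mhz >= pts[-1][0]:
        off = pts[-1][1]
    else:
        for (f0, o0), (f1, o1) in zip(pts, pts[1:]):
            if f0 <= freq_mhz <= f1:
                off = o0 + (o1 - o0) * (freq_mhz - f0) / (f1 - f0)
                break
    return raw_reading_db - off

print(calibrated_dbm(-30.5, 900.0))  # raw -30.5 dB -> -74.0 dBm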
Some years ago, when I was playing with OpenBTS, I spent a lot of time
trying to get the correct TX level and RX gain so that uplink and
downlink were within reasonable levels and the BER was OK. Being a
novice with osmo-bts, I am aiming for the same thing, eliminating small
issues one at a time.
>> The BSC parameter "ms max power" is set to 15 for both bts 0 and
>> bts 1. However, it seems MS output power is *much* stronger.
> The 16dBm is probably the closest one can get to the 15 dBm you
> requested:
>> L1 MS Power: 16 dBm, Timing Advance: 0

This is fine by me, but should I interpret this as meaning the power
loop in osmocom *works*, and that it has actually regulated the mobile's
output power down to a "correct" value? Or has the BSC just set the
power to 15 dBm (as in the config file) without regard to the measured
uplink level (-74 dBm, with uncertain accuracy)? The 16 dBm is what the
mobile reports, right?
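For reference, the relevant part of my osmo-bsc configuration is roughly
just this (trimmed down to the one parameter in question, everything
else omitted):

network
 bts 0
  ms max power 15
 bts 1
  ms max power 15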
> How do you establish this fact? Did you attach an RF power meter to
> the MS output and measure the output power?
I have not got to that point yet, but I have a suitable power meter and
a spectrum analyzer, so there are tests to be done once the setup is
stable.
I am trying to get this to work the way you intended, so that enabling
handover has a chance of succeeding. Some figures *you* have been aiming
for would be nice...
Again, thanks for your reply; this is really fun... and educational...
Best Regards,
Gullik