Below is a measurement on my "most distant" BTS, BTS 1. From what I see, downlink is 35 dB
below uplink, which of course is natural when the MS can produce 30 dBm, while the BTS,
which in this case uses a bare LimeSDR Mini, can only produce 10 dBm.
The BSC parameter 'ms max power' is set to 15 for both BTS 0 and BTS 1.
However, it seems MS output power is *much* stronger.
Further, my understanding is that it is the BTS-side measurements that decide when to hand over,
i.e. measurements on the uplink, and not the MS receive levels...
Since there is such a mismatch between the micro BTS and the MS, maybe the RX should be attenuated,
to bring upstream levels more in line with downstream.
OsmoBSC# show conns
Active subscriber connections:
conn ID=6, MSC=0, hodec2_fail=0, mgw_ep=rtpbridge/1@mgw
 BTS 0, TRX 0, Timeslot 1, Lchan 0: Type TCH_F
  Connection: 1, State: ESTABLISHED
  BS Power: 20 dBm, MS Power: 16 dBm
  Channel Mode / Codec: SPEECH_V1
  No Subscriber
  Bound IP: 127.0.0.1 Port 16390 RTP_TYPE2=0 CONN_ID=0
  Conn. IP: 192.168.1.70 Port 4186 RTP_TYPE=3 SPEECH_MODE=0x00
  Measurement Report:
   Flags:
   MS Timing Offset: 0
   L1 MS Power: 16 dBm, Timing Advance: 0
   RXL-FULL-dl: -110 dBm, RXL-SUB-dl: -110 dBm
   RXQ-FULL-dl: 5, RXQ-SUB-dl: 5
   RXL-FULL-ul: -74 dBm, RXL-SUB-ul: -73 dBm
   RXQ-FULL-ul: 3, RXQ-SUB-ul: 1
What are *your* experiences with "power control" and with "handover"?
Best Regards,
Gullik
Hi Gullik,
On Sat, Jan 12, 2019 at 08:22:05PM +0100, Gullik Webjorn wrote:
Below is a measurement on my "most distant" BTS, BTS 1. From what I see, downlink is 35 dB
What do you mean by "downlink is 35dB"?
below uplink, which of course is natural when the MS can produce 30 dBm, while the BTS,
which in this case uses a bare LimeSDR Mini, can only produce 10 dBm.
If you are using general-purpose SDR hardware, you cannot expect any of the signal levels reported anywhere to actually mean anything at all. There are no absolute levels reported by SDR hardware anywhere, and there is no calibration of either the transmitter or the receiver. This means:
* there's no general calibration curve for the chip/board
* there's no per-unit individual calibration curve for the unit you have
This is *very* different from a real GSM base station. Without designing a calibration procedure for the above, as well as some mechanism to apply it in production, I don't think one can expect any of the readings to state anything realistic, nor expect any power control loops to operate.
Please don't get me wrong, general-purpose SDRs such as USRPs or LimeSDR are great tools for experiments in the lab. But that's what they are - at least for the time being. There is a *big* difference between a real-world base station and a GP-SDR board.
The BSC parameter 'ms max power' is set to 15 for both BTS 0 and BTS 1. However, it seems MS output power is *much* stronger.
The 16 dBm is probably the closest one can get to the 15 dBm you requested:
L1 MS Power: 16 dBm, Timing Advance: 0
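A worked check, assuming the MS is on DCS 1800 (which the 30 dBm maximum mentioned earlier suggests): 3GPP TS 45.005 defines the MS power control levels there in 2 dB steps,

    level N  ->  (30 - 2*N) dBm,  N = 0..15

so an exact 15 dBm is not an achievable step; the neighbouring steps are level 8 = 14 dBm and level 7 = 30 - 14 = 16 dBm, and the BTS appears to have picked the upper one.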
How do you establish this fact? Did you attach an RF power meter to the MS output and measure the output power?
Hi Harald,
Just a fun fact: afaik current commercial multi-purpose (2G/3G/4G) base stations use generic SDRs to accomplish support for all technologies in a compact package. Of course those SDRs are in a completely different league in terms of accuracy, calibration and price :).
Cheers, Domi
Hi Domi,
On Sun, Jan 13, 2019 at 11:17:33AM +0100, Tomcsányi, Domonkos wrote:
Just a fun fact: afaik current commercial multi-purpose (2G/3G/4G) base stations use generic SDRs to accomplish support for all technologies in a compact package.
Sure. I think it's not so much the size of the package, but mostly the software reconfiguration part for so-called "spectrum re-farming". This allows you to scale down the number of GSM carriers and scale up the number/width of LTE carriers without any on-site visits.
Of course those SDRs are in a completely different league in terms of accuracy, calibration and price :).
When I said "general-pursose SDR" I was referring to a USRP-style or LimeSDR-style device. This means, basically 1) wide-band radio front-ends without band filters 2) IF or baseband filters that are muhc wider than GSM channels 3) all processing on general-purpose CPUs (as opposed to ASICs, DSPs or FPGAs) 4) no calibration of the RF output power over frequency and temperauture 5) no calibration of the Rx signal level over frequency and temperature 6) no clock that conforms even remotely to 3GPP accuracy requirements
Of course not all of those topics are related to the original discussion, and of course you can still use a general-purpose SDR as one element of a real-world base station. But then you have to add at least some of the missing elements I described above.
For the calibration part: This could all be done in open source software, if you have a signal generator with a known/defined signal level. It would be a great project to define a calibration setup to determine calibration tables for Rx and Tx levels over (at least) frequency.
OsmoBTS already has support for applying calibration tables on the transmit side, originally intended for the case where you add an external, non-linear-gain PA to a previously calibrated BTS. That could possibly be extended? For the receive side, no support exists.
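To make that concrete, a minimal sketch of such a calibration sweep (over frequency only, temperature left out), assuming a SCPI-controllable signal generator driven via pyvisa; read_reported_rssi_dbm() is a hypothetical helper returning whatever uncalibrated level the BTS reports, not an existing Osmocom API:

import pyvisa

SIGGEN_LEVEL_DBM = -60.0  # known level injected into the BTS Rx port

def arfcn_to_ul_hz(arfcn):
    # DCS 1800 uplink frequency for ARFCN 512..885 (3GPP TS 45.005)
    return (1710.2 + 0.2 * (arfcn - 512)) * 1e6

def calibrate_rx(siggen, read_reported_rssi_dbm, arfcns):
    # Build {arfcn: offset_db} such that reported + offset == true dBm.
    table = {}
    for arfcn in arfcns:
        siggen.write("FREQ %.0f" % arfcn_to_ul_hz(arfcn))   # generic SCPI
        siggen.write("POW %.1f dBm" % SIGGEN_LEVEL_DBM)
        table[arfcn] = SIGGEN_LEVEL_DBM - read_reported_rssi_dbm(arfcn)
    return table

# usage sketch:
# rm = pyvisa.ResourceManager()
# siggen = rm.open_resource("TCPIP::192.168.1.50::INSTR")  # hypothetical address
# rx_table = calibrate_rx(siggen, read_reported_rssi_dbm, range(512, 886, 25))

The same sweep with the roles reversed (BTS transmitting, power meter reading) would yield the Tx table.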
Regards, Harald
Thanks Harald,
On 2019-01-13 10:24, Harald Welte wrote:
Hi Gullik,
On Sat, Jan 12, 2019 at 08:22:05PM +0100, Gullik Webjorn wrote:
Below is a measurement on my "most distant" BTS, BTS 1. From what I see, downlink is 35 dB
What do you mean by "downlink is 35dB"?
This is what osmocom thinks, if I understand correctly.
110 - 74 = 36 (ok, I was careless in subtracting)
i.e. osmocom thinks that the level it sees is -74 dBm, and the mobile reports a signal of -110 dBm, right??
RXL-FULL-dl: -110 dBm, RXL-SUB-dl: -110 dBm
RXQ-FULL-dl: 5, RXQ-SUB-dl: 5
RXL-FULL-ul: -74 dBm, RXL-SUB-ul: -73 dBm
RXQ-FULL-ul: 3, RXQ-SUB-ul: 1
below uplink, which of course is natural when the MS can produce 30 dBm, and the BTS which
in this case uses a bareback Limesdr min, can only produce 10 dBm.
Given that the mobile *can* produce up to 30 dBm, and the LimeSDR *can* produce 10 dBm,
the asymmetry makes sense. But 36 dB is a factor of ~4000!!
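(10^(36/10) ≈ 3981, so just shy of a factor of 4000 in linear power terms.)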
If you are using a general-purpose SDR hardware you cannot expect that any of the signal levels written anywhere actually meany anything at all. There is no absolute levels reported by SDR hardware anywhere, and there is no calibration of either transmit nor receiver. This means:
- there's no general calibration curve for the chip/board
- there's no per-unit individual calibration curve for the unit you have
This is *very* different from a real GSM base station. Without designing a calibration procedure for the above, as well as some mechanism to apply it in production, I don't think one can expect any of the readings to state anything realistic, nor expect any power control loops to operate.
Please don't get me wrong, general-purpose SDRs such as USRPs or LimeSDR are great tools for experiments in the lab. But that's what they are - at least for the time being. There is a *big* difference between a real-world base station and a GP-SDR board.
Yes, I do understand this, but I thought that there *was* at least a power loop,
where osmobsc tells the mobile to increase or decrease power based on rx level.
For a particular SDR, it should be possible to have a rough calibration based on
the similarity of devices. Also, manual attenuation to bring levels within a reasonable
range can be done with SDR gain adjustment or attenuation pads.
Some years ago, when I was playing with OpenBTS, I spent a lot of time trying to get
the correct TX level and RX gain, so that uplink and downlink were within some reasonable
levels and the BER was OK.
Being a novice with osmobts, I am aiming for the same thing, eliminating small issues one at a time.
The BSC parameter 'ms max power' is set to 15 for both BTS 0 and BTS 1. However, it seems MS output power is *much* stronger.
The 16dBm is probably the closest one can get to the 15 dBm you requested:
This is fine by me, but should I interpret this as meaning the power loop in osmocom *works*, and
that it has actually down-regulated the mobile's power output to a "correct" value?
Or has the BSC just set the power to 15 dBm (as in the config file) without regard to the
uplink level measured (-74 dBm, with uncertain accuracy)?
L1 MS Power: 16 dBm, Timing Advance: 0
The 16 dBm is what the mobile reports, right?
How do you establish this fact? Did you attach an RF power meter to the MS output and measure the output power?
I have not got to this point yet, but I have a suitable power meter and a spectrum analyzer,
so there are tests to be done once I have the setup stable.
I am trying to get this to work the way you intended, so that enabling handover has a chance
of succeeding. Some figures *you* have been aiming for would be nice....
Again, thanks for your reply, this is really fun.....and educational...
Best Regards,
Gullik
Hi Gullik,
On Sun, Jan 13, 2019 at 01:32:25PM +0100, Gullik Webjorn wrote:
On Sat, Jan 12, 2019 at 08:22:05PM +0100, Gullik Webjorn wrote:
Below is a measurement on my "most distant" BTS, BTS 1. From what I see, downlink is 35 dB
What do you mean by "downlink is 35dB"?
This is what osmocom thinks, if I understand correctly.
110 - 74 = 36 (ok, I was careless in subtracting)
i.e. osmocom thinks that the level it sees is -74 dBm, and the mobile reports a signal of -110 dBm, right??
Yes. However, the big difference is that the mobile has been calibrated to (if I remember correctly) 2 dB accuracy, while the GP-SDR based BTS has not been calibrated at all.
Given that the mobile *can* produce up to 30 dBm, and the limesdr *can* produce 10 dBm,
FYI: I've never seen a LimeSDR generate anywhere near 10 dBm of GSM carrier power.
the asymmetry makes sense. But 36 dB is a factor of ~4000!!
Yes, if such readings were correct, you'd have an unbalanced link. I'm still not really clear what you're trying to say here. I somehow have the feeling that between the lines you're saying the weak downlink should have an influence on the uplink? That's not how GSM power control is specified.
Yes, I do understand this, but I thought that there *was* at least a power loop, where osmobsc tells the mobile to increase or decrease power based on rx level.
yes, there is. There's an uplink target receive level to which the power control loop will try to adjust. This is -75 dBm by default, and can be configured using 'uplink-power-target' at the BTS node of the configuration.
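For example, a sketch of the relevant osmo-bsc.cfg fragment (the -75 dBm default applies if you don't set it):

network
 bts 1
  uplink-power-target -70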
For a particular SDR, it should be possible to have a rough calibration based on the similarity of devices.
yes, it should be possible. Yet, to the best of my knowledge, none of the many users or proponents of such devices has been working on anything like that. I would love to see work in that area, and I'm honestly surprised that nobody with an interest in this has been doing so during the past almost 10 years of OpenBTS and OsmoBTS being out there. At least not in public.
This is fine by me, but should I interpret this as meaning the power loop in osmocom *works*, and that it has actually down-regulated the mobile's power output to a "correct" value?
If the "L1 MS Power" as seen in "show lchan" changes depending on your path loss, then the power control loop is working. You can also enable "logging level loop debug" to see it in action.
Or has the BSC just set the power to 15 dBm (as in the config file) without regard to the uplink level measured (-74 dBm, with uncertain accuracy)?
Simply adjust your path loss by increasing attenuation between BTS and MS and see if the L1 MS Power increments.
Please note that "-74 dBm" is by no means -74 dBm at all, due to the lack of any calibration. However, as long as OsmoBTS thinks it's -74 dBm, it thinks it's almost exactly at the target receive level of the uplink power control loop.
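(Cross-checking with the measurement report above: RXL-SUB-ul of -73 dBm against the default uplink-power-target of -75 dBm puts the loop, to the extent the uncalibrated readings mean anything, within about 2 dB of its target.)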
L1 MS Power: 16 dBm, Timing Advance: 0
The 16 dBm is what the mobile reports, right?
How do you establish this fact? Did you attach an RF power meter to the MS output and measure the output power?
I have not got to this point yet, but I have a suitable power meter and a spectrum analyzer,
so there are tests to be done once I have the setup stable.
I am trying to get this to work the way you intended, so that enabling handover has a chance of succeeding. Some figures *you* have been aiming for would be nice....
Seriously: we are aiming for "figures" that reflect absolute power levels; everything else feels like stumbling in the dark.
I think that the best way to go ahead is to ensure that OsmoBTS is able to deal with loosely calibrated (let's say compensated? absolute?) values. All the algorithms take those as input data, and trying to get them working without getting that first stepping stone right feels like trying to build the first floor without laying the foundation.
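As a trivial illustration of that input-data step, a sketch (reusing the hypothetical per-ARFCN Rx table from the calibration mail earlier in this thread, not existing OsmoBTS code):

def calibrated_rssi_dbm(raw_dbm, arfcn, rx_table):
    # Turn an uncalibrated reported level into an (approximately) absolute
    # one, interpolating linearly between the calibrated ARFCNs.
    pts = sorted(rx_table)
    lo = max((a for a in pts if a <= arfcn), default=pts[0])
    hi = min((a for a in pts if a >= arfcn), default=pts[-1])
    if lo == hi:
        return raw_dbm + rx_table[lo]
    w = (arfcn - lo) / (hi - lo)
    return raw_dbm + (1.0 - w) * rx_table[lo] + w * rx_table[hi]

Only once such a correction is in place do targets like 'uplink-power-target -75' refer to actual dBm.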
Regards, Harald