R820T "spur prevention" code

Per Baekgaard baekgaard at b4net.dk
Mon Apr 15 10:45:45 UTC 2013


List,

As a new member here, only recently having acquired a few SDR units, I
am not sure if this is the right place to post the following. If not,
feel free to advise me of a better way (for reference, I've already
tried mailing a few of the key people here directly, but I guess that
was not the correct way to approach it).

R820T tuning problems
=====================
As far as I can tell, the "correct" way to tune a RTL SDR device is to
call rtlsdr_set_center_freq, which in turn invokes the set_freq function
of the driver (mapped to r820t_set_freq which further calls
r820t_SetRfFreqHz in case of the R820T).
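
For reference, a minimal sketch of what an application does today,
using only the public librtlsdr API (error handling trimmed):

/* Minimal sketch: tune, then read back the center frequency. */
#include <stdio.h>
#include <rtl-sdr.h>

int main(void)
{
    rtlsdr_dev_t *dev = NULL;

    if (rtlsdr_open(&dev, 0) < 0)
        return 1;

    rtlsdr_set_center_freq(dev, 144002000); /* ask for 144.002 MHz */

    /* As described below, this currently just echoes the request, so
     * any detune applied inside the tuner driver is invisible here. */
    printf("tuned to %u Hz\n", rtlsdr_get_center_freq(dev));

    rtlsdr_close(dev);
    return 0;
}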

This may not always result in the desired frequency actually being
tuned. In some cases, such as with the current tuner_r820t.c driver, it
can lead to rather large "errors". For instance, trying to tune to
144.000 MHz works as intended -- but tuning to anything in the range
144.002 to 144.058 MHz will lead to a center frequency of 144.030 MHz,
in other words an error as large as 28 kHz. [If you apply a PPM value
other than 0, the exact range will be a little different.] I've also
seen cases where use of the "kalibrate" program to locate GSM channels
has resulted in one actual GSM channel being detected on 3 adjacent
channels, 2 of which are incorrect -- and the calculated frequency
deviation is also often wrong on some channels. For GSM channels and
other frequencies above 885 MHz, the tuning "error" can be as much as
+/- ~225 kHz.
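
To see why the bound is ~28 kHz at 144 MHz but ~225 kHz around 900 MHz,
here is a small self-contained calculation (my own illustration, not
driver code; it ignores the small IF offset and just picks the MixDiv
that puts the VCO into its 1.77-3.54 GHz range):

/* Worst-case detune at RF is (450/N) kHz, where N is the MixDiv. */
#include <stdio.h>

static int mix_div_for(double rf_hz)
{
    /* Pick the power-of-two divisor that puts N*rf into the VCO range. */
    for (int n = 2; n <= 64; n *= 2)
        if (n * rf_hz >= 1.77e9 && n * rf_hz <= 3.54e9)
            return n;
    return 0;
}

int main(void)
{
    double freqs[] = { 144.0e6, 900.0e6 };

    for (int i = 0; i < 2; i++) {
        int n = mix_div_for(freqs[i]);
        printf("%.0f MHz: MixDiv=%d, max detune ~%.1f kHz\n",
               freqs[i] / 1e6, n, 450.0 / n);
    }
    /* -> 144 MHz: MixDiv=16, ~28.1 kHz; 900 MHz: MixDiv=2, ~225.0 kHz */
    return 0;
}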

Now, the way for an application to verify the resulting tuning would be
to call rtlsdr_get_center_freq. However, in the current implementation,
this returns the requested tuning (adjusted by any calculated offsets
set in the librtlsdr.c driver -- which are not applicable for the R820T)
and not the actual tuning.
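
As far as I can tell from librtlsdr.c, the read-back path is essentially
just a cached copy of the request; paraphrased (not the verbatim source,
and offs_freq is only used for the offset-tuning mode of other tuners):

/* Paraphrase of librtlsdr.c: the requested frequency is cached on a
 * successful set, and get simply returns that cache, so any detune
 * applied inside the tuner driver never reaches the caller. */
int rtlsdr_set_center_freq(rtlsdr_dev_t *dev, uint32_t freq)
{
    int r = dev->tuner->set_freq(dev, freq - dev->offs_freq);

    if (!r)
        dev->freq = freq;   /* caches the *requested* frequency */

    return r;
}

uint32_t rtlsdr_get_center_freq(rtlsdr_dev_t *dev)
{
    return dev->freq;       /* cached request, not the actual tuning */
}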

Further, I've checked a couple of GUI programs on how they use the rtl
library. One (linrad) only calls the rtlsdr_set_center_freq function and
never checks the resulting tuning. Another (gqrx, via the osmo library)
does a similar thing, but has a "FIXME: Read back frequency?" comment.
The kalibrate suite does not check the actual tuned frequency either. So
all of these applications will report an incorrect frequency for the
spectrum, with the error depending on what center frequency was
requested.

So -- unless I have overlooked something, which is entirely possible --
the current R820T driver sometimes detunes the requested center
frequency, and the applications using it do not check for this. Even if
they wanted to, they have no means of doing so with the current
codebase.


Proposed "fix"
==============
I understand the R820T driver is messy anyway and that its origin has
caused it to be what it is.

The problem above is caused by a few lines of code in tuner_r820t.c
around line 1460 that do "spur prevention". The idea is that if the
resulting VCO frequency is too close to a harmonic of the xtal/2
frequency, the VCO is detuned to run exactly on top of that harmonic, to
reduce any spurs within a (450/N) kHz bandwidth, where N is the MixDiv,
i.e. the divisor used to derive the center frequency from the VCO, which
runs between 1.77 and 3.54 GHz.
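
In simplified form (my paraphrase, with invented names; the R820T mixes
high-side, so the VCO runs at (RF + IF) * MixDiv, and the 450 kHz VCO
window is inferred from the (450/N) kHz RF bandwidth mentioned above):

/* Paraphrase of the "spur prevention" step; structure is mine, not
 * the literal driver source. */
#include <stdio.h>
#include <stdint.h>
#include <stdlib.h>

#define XTAL_HZ 28800000    /* 28.8 MHz crystal */
#define IF_HZ    3570000    /* nominal 3.57 MHz IF */

int main(void)
{
    uint32_t rf = 144002000;    /* requested RF in Hz */
    int mix_div = 16;           /* divisor for the 2 m band */
    int64_t vco = (int64_t)(rf + IF_HZ) * mix_div;

    /* Find the nearest harmonic of xtal/2 (14.4 MHz). */
    int64_t half_xtal = XTAL_HZ / 2;
    int64_t nearest = ((vco + half_xtal / 2) / half_xtal) * half_xtal;

    /* If the VCO lands within ~450 kHz of it, park the VCO exactly on
     * the harmonic; at RF this shifts the center by up to 450/N kHz. */
    if (llabs(vco - nearest) < 450000)
        vco = nearest;

    printf("requested %u Hz, actually tuned %lld Hz\n",
           rf, (long long)(vco / mix_div - IF_HZ));
    /* -> requested 144002000 Hz, actually tuned 144030000 Hz */
    return 0;
}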

This probably makes a lot of sense for applications where the exact
tuning is not too important. When using the stick for DVB-T as
originally intended, the later stages likely compensate for the detuning
and simply benefit from the reduced number of spurs.

But for our use it might make less sense, and I guess most of the other
drivers have not implemented something like this anyway?

I can think of the following ways to fix this issue:

1) Disable the code that does the detuning (either temporarily (#if
0/#endif) or via some setup parameter so that an application can
specifically ask for the "detune" behaviour). This will possibly
introduce additional spurs on the R820T in the presence of strong
"near-center frequency" signals, but will fix the problem for existing
SDR applications.

In the case of an optional "setup" parameter, proper handling of the
detuning and a way to report it back to the calling application have to
be made available, and those applications then eventually need to
implement some way of handling it.

2) Merge with the possibly somewhat similar "offset" tuning method
implemented for some other devices -- although I'm not sure exactly how
that is intended to work and if it truly is compatible with the R820T
method.

3) Keep the code as is, but "tweak" the sampling in the 2832 device to
compensate for the offset tuning. In other words, instead of assuming
the nominal IF of 3.57 MHz, tune to an offset IF. This will obviously
skew the bandwidth and may not work for smaller sampling rates, where
the offset could actually be larger than what the sampling rate
allows...


I would propose to simply disable the code for now (and I'm happy to
submit a small patch), as the detuning currently appears to introduce
more problems than it solves. A second step would then be to add a
proper report-back abstraction to all drivers, make the librtlsdr.c code
fully aware of the detuning so it can be reported back to the caller,
and then ask application developers to start using the get_center_freq
calls -- at least if they have asked for "enable_spur_prevention" or
something similar.


Thanks for any comments/advice on the above,


-- Per.




