rtlsdr in async mode is returning buffers full of data from previous calls.
I noticed that sometimes the first async callbacks return TOO QUICKLY,
so quickly that it would have been impossible to sample that data in the
meantime. Could this be because libusb still has previous transfers
submitted and not cleared?
I compiled rtl_test with a small modification that will print the
sampling time if it is much lower than expected.
BUF_LENGTH is (16 * 16384) = 256K bytes, and SAMPLE_RATE is 2048000, so a full
buffer should take about 64 ms.
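As a quick sanity check of that figure, a minimal sketch of the arithmetic
(using the rtl_test defaults, with 2 bytes per complex sample, one for I and
one for Q):

/* expected time to fill one async buffer with the rtl_test defaults */
#include <stdio.h>

int main(void)
{
	const double buf_bytes = 16 * 16384;    /* BUF_LENGTH: 2 bytes (I + Q) per sample */
	const double sample_rate = 2048000;     /* samples per second */
	double ms = 1000.0 * (buf_bytes / 2.0) / sample_rate;
	printf("expected buffer time: %.1f ms\n", ms);   /* prints 64.0 ms */
	return 0;
}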
You can see that in this example the first async callback returns after only
7.79 ms (and the following one 1.47 ms later), so the data must be old.
Here is the output with the added "ERROR ns" printf:
nuno@ubuntu:~/Desktop/rtl-sdr/build/src$ ./rtl_test -p
Found 1 device(s):
0: ezcap USB 2.0 DVB-T/DAB/FM dongle
Using device 0: ezcap USB 2.0 DVB-T/DAB/FM dongle
Found Rafael Micro R820T tuner
Supported gain values (29): 0.0 0.9 1.4 2.7 3.7 7.7 8.7 12.5 14.4 15.7
16.6 19.7 20.7 22.9 25.4 28.0 29.7 32.8 33.8 36.4 37.2 38.6 40.2 42.1
43.4 43.9 44.5 48.0 49.6
Reporting PPM error measurement every 10 seconds...
Press ^C after a few minutes.
Reading samples in async mode...
ERROR ns: 7791886
ERROR ns: 9259301
lost at least 185 bytes
real sample rate: 2073369
real sample rate: 2048190
^CSignal caught, exiting!
User cancel, exiting...
Cumulative PPM error: -289
The modifications to demonstrate the issue:
nuno@ubuntu:~/Desktop/rtl-sdr/src$ git diff
diff --git a/src/rtl_test.c b/src/rtl_test.c
index f5a56b8..1935bfb 100644
--- a/src/rtl_test.c
+++ b/src/rtl_test.c
@@ -138,9 +138,11 @@ static void rtlsdr_callback(unsigned char *buf, uint32_t le
 	ppm_now.tv_sec = tv.tv_sec;
 	ppm_now.tv_nsec = tv.tv_usec*1000;
 #endif
-	if (ppm_now.tv_sec - ppm_recent.tv_sec > PPM_DURATION) {
-		ns = 1000000000L * (int64_t)(ppm_now.tv_sec - ppm_recent.tv_sec);
-		ns += (int64_t)(ppm_now.tv_nsec - ppm_recent.tv_nsec);
+	ns = 1000000000L * (int64_t)(ppm_now.tv_sec - ppm_recent.tv_sec);
+	ns += (int64_t)(ppm_now.tv_nsec - ppm_recent.tv_nsec);
+	if (ns < 60000000)
+		printf("ERROR ns: %9lld\n", ns);
+	if (ppm_now.tv_sec - ppm_recent.tv_sec > PPM_DURATION) {
 		printf("real sample rate: %i\n",
 			(int)((1000000000L * ppm_count / 2L) / ns));
 #ifndef __APPLE__
Nuno
Hi
I have an Elonics E4000 USB receiver, and I'm trying to do some
measurements of the received digital television power in my area.
Everything works fine, except that the received power is never lower than
-46 dB, even on channels which are not occupied. I was expecting values
like -120 dB or something similar; instead, in my measurements the
lowest point in any portion of the spectrum is about -46 dB.
Am I missing something?
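For context, a minimal sketch of how average power is typically computed from
the dongle's interleaved unsigned 8-bit I/Q samples; note the result is in dB
relative to the ADC full scale (dBFS), not in dBm:

/* sketch: average power of a block of 8-bit unsigned I/Q samples, in dBFS */
#include <math.h>
#include <stdint.h>

double block_power_dbfs(const uint8_t *buf, int len)
{
	/* buf holds interleaved I,Q bytes centred on 127.5; len is in bytes */
	double acc = 0.0;
	int pairs = len / 2;
	for (int n = 0; n < pairs; n++) {
		double i = (buf[2 * n]     - 127.5) / 127.5;
		double q = (buf[2 * n + 1] - 127.5) / 127.5;
		acc += i * i + q * q;
	}
	return 10.0 * log10(acc / pairs);   /* 0 dBFS = full-scale complex tone */
}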
Thank you
Best
Hello,
I use rtl_sdr to collect some data into a bin file (file.bin).
From the wiki, I think it is parsed like this:
fd = open("file.bin", "r");
char I[];
char Q[];
loop:
    read(fd, I, 1);
    read(fd, Q, 1);
endloop
char *p = Q;
loop:
    printf("%f\n", (float*)p);
    p += 4;
endloop
But this code only prints 0.00000, and I don't know where the error is.
Please help me.
Thank you.
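On the format itself: rtl_sdr captures are interleaved unsigned 8-bit I and Q
bytes, not 32-bit floats, so casting the buffer to float* will not give
meaningful values. A minimal sketch of a byte-wise read along those lines
(my own illustration, not from the wiki verbatim):

/* sketch: read an rtl_sdr capture as interleaved unsigned 8-bit I/Q bytes */
#include <stdio.h>

int main(void)
{
	FILE *f = fopen("file.bin", "rb");
	if (!f)
		return 1;

	int ci, cq;
	while ((ci = fgetc(f)) != EOF && (cq = fgetc(f)) != EOF) {
		/* scale each byte to a float centred on zero, full scale +/- 1.0 */
		float i = (ci - 127.5f) / 127.5f;
		float q = (cq - 127.5f) / 127.5f;
		printf("%f %f\n", i, q);
	}
	fclose(f);
	return 0;
}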
Hi,
I believe through this mail address I can reach the people who wrote the excellent
wiki on installing the Realtek receiver to receive ADS-B signals.
I am an avid Linux 'customer' but not a programmer. Yet, using the
instructions, I have been able to get a signal on my PC.
So thanks for a well written wiki!
Best regards,
Marco
PS: the most difficult issue to tackle was finding an address for this 'thank you' note ;-)
List,
As a new member here, having only recently acquired a few SDR units, I
am not sure if this is the right place to post the following. If not,
feel free to advise me of a better way (for reference, I've already tried to
mail a few of the key people here directly, but I guess that was not the
correct way to approach it).
R820T tuning problems
==============
As far as I can tell, the "correct" way to tune an RTL SDR device is to
call rtlsdr_set_center_freq, which in turn invokes the set_freq function
of the driver (mapped to r820t_set_freq which further calls
r820t_SetRfFreqHz in case of the R820T).
This may not always result in the desired frequency being tuned to. In
some cases, such as with the current tuner_r820t.c driver, it can lead
to rather large "errors" *). For instance, trying to tune to 144.000 MHz
works as intended -- but tuning to anything in the range 144.002 to
144.058 MHz will lead to a center frequency of 144.030 MHz, in other
words an error as large as 28 kHz. [If you apply a PPM value other than
0, the exact range will be slightly different.] I've also seen cases
where using the "kalibrate" program to locate GSM channels resulted in
one actual GSM channel being detected on 3 adjacent channels, 2 of which
are incorrect -- and the calculated frequency deviation is also often
wrong on some channels. For GSM channels and other frequencies above
885 MHz, the tuning "error" can be as much as +/- ~225 kHz.
Now, the way for an application to verify the resulting tuning would be
to call rtlsdr_get_center_freq. However, in the current implementation
this returns the requested tuning (adjusted by any calculated offsets
set in the librtlsdr.c driver -- which are not applicable to the R820T),
not the actual tuning.
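For illustration, this is roughly the read-back check an application would
want to do, using the existing public librtlsdr calls -- with the caveat
that, as described above, the second call currently just echoes the request
back instead of the actual R820T tuning:

/* sketch: tune and read back the center frequency via the librtlsdr API */
#include <stdio.h>
#include "rtl-sdr.h"

int main(void)
{
	rtlsdr_dev_t *dev = NULL;
	uint32_t wanted = 144002000;   /* 144.002 MHz, inside the problematic range */

	if (rtlsdr_open(&dev, 0) < 0)
		return 1;

	rtlsdr_set_center_freq(dev, wanted);
	uint32_t actual = rtlsdr_get_center_freq(dev);

	/* with the current driver this never triggers, so the ~28 kHz
	   detune described above goes unnoticed by the application */
	if (actual != wanted)
		printf("tuner settled %ld Hz away from the request\n",
		       (long)actual - (long)wanted);

	rtlsdr_close(dev);
	return 0;
}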
Further, I've checked how a couple of GUI programs use the rtl
library. One (linrad) only calls the rtlsdr_set_center_freq function and
never checks the resulting tuning. Another (gqrx via the osmo library)
does a similar thing, but has a "FIXME: Read back frequency?" comment.
The kalibrate suite does not check the actual tuned frequency either.
So all of these applications will report an incorrect frequency for the
spectrum, and the error depends on what center frequency you requested.
So -- unless I have overlooked something, which is entirely
possible -- the current R820T driver sometimes detunes the requested
center frequency and the applications using it do not check for
this. And even if they wanted to, they have no means to do so with the
current codebase.
Proposed "fix"
========
I understand the R820T driver is messy anyway and that its origin has
caused it to be what it is.
The problem above is caused by a few lines of code in tuner_r820t.c,
around line 1460, that do "spur prevention". The idea is that if the
resulting VCO frequency is too close to a harmonic of the xtal/2
frequency, the VCO is detuned to sit exactly on top of that harmonic, to
reduce any spurs within a (450/N) kHz bandwidth, where N is the MixDiv,
i.e. the divisor used to derive the center frequency from the VCO, which
runs between 1.77 and 3.54 GHz.
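To make the mechanism concrete, a rough sketch of the arithmetic as I read it
(illustrative values only: 28.8 MHz crystal, the nominal 3.57 MHz IF, and
MixDiv = 16 for the 2 m band; the actual register handling in tuner_r820t.c
differs):

/* sketch of the spur-prevention idea: snap the VCO onto the nearest
   harmonic of xtal/2 when it lands inside the guard band */
#include <stdio.h>
#include <math.h>

int main(void)
{
	const double half_xtal = 28.8e6 / 2.0;   /* 14.4 MHz harmonic grid */
	const double if_hz     = 3.57e6;         /* nominal R820T IF */
	const int    mix_div   = 16;             /* VCO divisor; keeps the VCO in 1.77-3.54 GHz */
	const double margin_hz = 450e3;          /* i.e. (450/N) kHz seen at RF */

	double rf_hz   = 144.010e6;              /* requested center frequency */
	double vco_hz  = (rf_hz + if_hz) * mix_div;
	double nearest = half_xtal * round(vco_hz / half_xtal);

	if (fabs(vco_hz - nearest) < margin_hz) {
		/* detune the VCO onto the harmonic; the RF error is the offset / mix_div */
		double detuned_rf = nearest / mix_div - if_hz;
		printf("detuned to %.0f Hz (error %+.0f Hz)\n",
		       detuned_rf, detuned_rf - rf_hz);   /* 144030000 Hz, +20000 Hz */
	}
	return 0;
}

With these numbers, anything requested between roughly 144.002 and 144.058 MHz
lands on 144.030 MHz, matching the behaviour described above.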
This probably makes a lot of sense for applications where the exact
tuning is not too important. When using the stick for DVB-T as
originally intended, the later stages will likely compensate for the
detuning and simply benefit from the reduced number of spurs.
But for our use it makes less sense, and I guess most of the other
drivers have not implemented anything like this anyway?
I can think of the following ways to fix this issue:
1) Disable the code that does the detuning (either temporarily (#if
0/#endif) or via some setup parameter so that an application can
specifically ask for the "detune" behaviour; a hypothetical sketch of
such a parameter follows after this list). This may introduce additional
spurs on the R820T in the presence of strong near-center-frequency
signals, but will fix the problem for existing SDR applications.
In case of an optional "setup" parameter, proper handling of the
detuning and reporting back to the calling application has to be made
available, and those applications then eventually need to implement some
way of handling it.
2) Merge with the possibly somewhat similar "offset" tuning method
implemented for some other devices -- although I'm not sure exactly how
that is intended to work and if it truly is compatible with the R820T
method.
3) Keep the code as is, but "tweak" the sampling in the 2832 device to
compensate for the offset tuning. In other words, instead of assuming
the nominal IF of 3.57 MHz, tune to an offset IF. This will obviously
shift the usable bandwidth and may not work for smaller sampling rates,
where the offset could actually be larger than what the sampling rate allows...
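Regarding option 1's setup parameter, a purely hypothetical sketch of what the
opt-in could look like from an application's point of view --
rtlsdr_set_spur_prevention() does not exist in librtlsdr today, it is only
named here to illustrate the idea:

#include "rtl-sdr.h"

/* hypothetical declaration, not part of librtlsdr:
   1 = allow the detune/spur-prevention logic, 0 = tune exactly */
int rtlsdr_set_spur_prevention(rtlsdr_dev_t *dev, int enabled);

static int tune_exact(rtlsdr_dev_t *dev, uint32_t freq_hz)
{
	rtlsdr_set_spur_prevention(dev, 0);      /* hypothetical call */
	return rtlsdr_set_center_freq(dev, freq_hz);
}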
I would propose to simply disable the code for now (and I'm happy to
submit a small patch), as the detuning currently appears to introduce
more problems than it solves. A second step would then be to add a proper
report-back abstraction to all drivers, make the librtlsdr.c code fully
aware of the detuning so it can be reported back to the caller, and then
ask the application developers to start using the get_center_freq
calls -- at least if they have asked for "enable_spur_prevention" or
something similar.
Thanks for any comments/advice on the above,
-- Per.