Hi everyone,
although there are already some comparisons between the R820T and the
E4000 [1, 2], I did some tests of my own with another use case in mind.
I'm working on a thing similar to RTLSDR-Scanner [3]. I want to
monitor a large part of the spectrum continuously.
So I compared the R820T with the E4000 using RTLSDR-Scanner w/ and w/o
an antenna.
My results are here:
https://docs.google.com/folder/d/0ByDAKwyEiyx_XzZ5ZnpRV1VZWDQ/edit?usp=shar…
There are many more spurs with the E4000 than with the R820T. According
to [1, 2], one would also expect the R820T to have better overall
sensitivity than the E4000.
However, the GSM900 signals, for example, seem to be way better with the
E4000 according to RTLSDR-Scanner. Tuning to a certain channel with the
OsmoSDR Source in GNU Radio gives about the same SNR, contrary to the
RTLSDR-Scanner output. Can anyone explain this?
Also, the DVB-T channel at 502 MHz is quite weak in the R820T
RTLSDR-Scanner output when compared to the E4000. I had a closer look
at the lower edge of the channel in GNU Radio; this can be seen in the
502MHz_*.png pictures. The E4000 produces a nice +20 dB step, while one
can hardly see the channel in the R820T spectrum. I don't understand
this either. Is this AGC-related? Manually setting a fixed gain
didn't really help though...
Any explanations?
Thank you!
Best regards,
Hunz
[1] http://steve-m.de/projects/rtl-sdr/tuner_comparison/
[2] http://www.hamradioscience.com/rtl2832u-r820t-vs-rtl2832u-e4000/#more-1852
[3] https://github.com/EarToEarOak/RTLSDR-Scanner
This is with a version of rtl-sdr I pulled from git last night, on OpenBSD 5.2 (the current release). 5.2 has some pthreads fixes, so I waited until I bought another computer and installed it. Are the crashes related to threads? I don't know, but possibly. It didn't work with OpenBSD 5.0 either.
rtl_fm crashes and uses threads
rtl_adsb crashes and uses threads
rtl_tcp doesn't crash, uses threads, actually stops on ctrl-c
rtl_test doesn't crash, doesn't use threads, won't stop
rtl_eeprom doesn't crash, doesn't use threads, ends normally
I'm not very practiced with gdb, but I tried looking at a couple of core files. Here's a run of rtl_fm:
rtl_fm -f 162550000 -N - | play -t raw -r 24k -e signed-integer -b 16 -c 1
-V1 -
-: (raw)
Encoding: Signed PCM
Channels: 1 @ 16-bit
Samplerate: 24000Hz
Replaygain: off
Duration: unknown
In:0.00% 00:00:00.00 [00:00:00.00] Out:0 [ | ] Clip:0
Found 1 device(s):
0: Realtek, RTL2838UHIDIR, SN: 00000013
Using device 0: ezcap USB 2.0 DVB-T/DAB/FM dongle
Found Rafael Micro R820T tuner
Oversampling input by: 42x.
Oversampling output by: 1x.
Buffer size: 8.13ms
Tuned to 162802000 Hz.
Sampling at 1008000 Hz.
Output at 24000 Hz.
Exact sample rate is: 1008000.009613 Hz
Tuner gain set to automatic.
In:0.00% 00:00:00.00 [00:00:00.00] Out:0 [ | ] Clip:0
Done.
Abort (core dumped)
d530# gdb -c rtl_fm.core /usr/local/bin/rtl_fm
GNU gdb 6.3
Copyright 2004 Free Software Foundation, Inc.
GDB is free software, covered by the GNU General Public License, and you are
welcome to change it and/or distribute copies of it under certain
conditions.
Type "show copying" to see the conditions.
There is absolutely no warranty for GDB. Type "show warranty" for details.
This GDB was configured as "i386-unknown-openbsd5.2"...
Core was generated by `rtl_fm'.
Program terminated with signal 6, Aborted.
Reading symbols from /usr/lib/libpthread.so.16.0...done.
Loaded symbols for /usr/lib/libpthread.so.16.0
Reading symbols from /usr/lib/libm.so.7.0...done.
Loaded symbols for /usr/lib/libm.so.7.0
Reading symbols from /usr/local/lib/librtlsdr.so.0.0...done.
Loaded symbols for /usr/local/lib/librtlsdr.so.0.0
Reading symbols from /usr/local/lib/libusb-1.0.so.1.0...done.
Loaded symbols for /usr/local/lib/libusb-1.0.so.1.0
Symbols already loaded for /usr/lib/libpthread.so.16.0
Reading symbols from /usr/lib/libc.so.65.0...done.
Loaded symbols for /usr/lib/libc.so.65.0
Loaded symbols for /usr/libexec/ld.so
#0 0x0abbd98d in kill () from /usr/lib/libc.so.65.0
(gdb) bt
#0 0x0abbd98d in kill () from /usr/lib/libc.so.65.0
#1 0x0ac29545 in abort () at /usr/src/lib/libc/stdlib/abort.c:68
#2 0x005e9298 in pthread_mutex_unlock (mutexp=0x3c003d8c)
at /usr/src/lib/librthread/rthread_sync.c:218
#3 0x1c00266e in full_demod (fm=0xcfa2de5c)
at /usr/src/misc/osmocom/2013-04-15/rtl-sdr/src/rtl_fm.c:583
#4 0x1c0028ff in demod_thread_fn (arg=0xcfa2de5c)
at /usr/src/misc/osmocom/2013-04-15/rtl-sdr/src/rtl_fm.c:641
#5 0x005ebc2e in _rthread_start (v=0x84da4c00)
at /usr/src/lib/librthread/rthread.c:111
#6 0x0aba62e9 in __tfork_thread () from /usr/lib/libc.so.65.0
The backtrace (bt) shows that it dies trying to do a mutex_unlock (I think). rtl_tcp also does a mutex_unlock and it doesn't crash. I'm probably reading it wrong for all I know. I don't know what's causing the signal 6 either.
I'd also like to get the -lrt out of the cmake files. OpenBSD doesn't use or have librt; it works without it. I can edit it out and compile, but every time I run cmake again, I have to edit the files again.
Alan
-----
Radio Astronomy - the ultimate DX
Some of you are probably using multiple dongles with alternating
applications as well.
Regarding the frequency correction, I put a sticker on each dongle and
wrote the ppm value on it.
However, manually setting the value in each application is still annoying.
So I used rtl_eeprom to write the value into the product ID with the
following format: "RTL%+dppm", resulting in something like this: RTL+87ppm
I'm not sure if this is the best way to do it, but this approach works
without changing librtlsdr.
Another option might be storing the value in a non-used eeprom
location and adding get/set functions to librtlsdr.
Is it possible to write arbitrary values to unused EEPROM locations
without breaking the Realtek EEPROM handling? Do you think this
would be a better way?
I'd like to hear your comments on this.
It would be awesome if we could find a solution that many existing
applications could incorporate.
(Of course the ppm value still changes with temperature, but having a
base value is still closer to the truth than 0ppm I'd say.
I measured the offset directly after grabbing samples for 20 minutes
at room-temperature and jitter is <1ppm.)
Best regards,
Hunz
Hi all,
One of the designers of the R820T tuner has recently been posting on the
ultra-cheap-sdr group, and has kindly supplied a map of the R820T registers
for private use only within the SDR community. He has asked that it not be
posted on any website, but only be forwarded around privately.
To that end, if anyone would like a copy of this spreadsheet and would be
willing to likewise not publish it on the web, let me know off-list and I'll
send it to you.
There's not a huge amount in the way of explanations, but it lists all the
registers and a very brief description of what they do.
Cheers,
Adam.
Hello
I have been fighting to compile the gr-osmosdr package on my OS X machine.
These are the minor fixes I had to make:
1- Change in CMakeLists.txt: find_package(Boost COMPONENTS thread system).
If thread and system are not included, then there are linking problems.
Although I see that this was removed recently...
2- cmake -DCMAKE_INSTALL_PREFIX=/opt/local (so it is installed together
with my ports installation; this is really optional)
3- Before compiling, I manually edit CMakeCache.txt and change all the
references of /usr/lib/python to /opt/local/lib/python. Otherwise it
links against the system Python instead of the ports' Python!
4- Add this line in my .bashrc so python can find the package
export PYTHONPATH="/opt/local/lib/python2.7/site-packages/"
I am aware that this is not the most elegant way to solve the problem,
but it is the only one I was able to come up with. Hope it helps.
Regards
I am also having the rtl_fm scanning bug which was first reported in
http://lists.gnumonks.org/pipermail/osmocom-sdr/2013-February/000485.html
The scanning mode in rtl_fm hangs during the re-tuning process. This
makes it impossible to use rtl_fm in scanning mode. I am using Ubuntu
12.10 with libusb 2:1.0.12-2, the latest git master version of rtl-sdr,
and an Elonics E4000 (Realtek, RTL2838UHIDIR).
Here are the steps to reproduce:
1. Find some frequency FREQ_NOISE that is not in use, and some frequency
FREQ_SIGNAL which is in use constantly (e.g., a commercial FM
station). Find an appropriate squelch value SQUELCH which will stop
FREQ_NOISE but pass FREQ_SIGNAL.
2. rtl_fm -f ${FREQ_NOISE} -l ${SQUELCH}. The program should not pass
any audio, but it will exit cleanly with ^C.
3. rtl_fm -f ${FREQ_SIGNAL} -l ${SQUELCH}. The program should pass audio
and exit cleanly with ^C
4. rtl_fm -f ${FREQ_SIGNAL} -f ${FREQ_NOISE} -l 0. The scan will pause
on the first channel and pass audio.
5. rtl_fm -f ${FREQ_NOISE} -f ${FREQ_SIGNAL} -l ${SQUELCH}. The program
will not output any audio, even though the first frequency is squelched
out and the scanning function should skip immediately to the second
frequency. Additionally, it will not exit if sent a SIGINT with ^C, and
it must be killed with a SIGKILL.
I've made a stack trace of this behavior. The program hangs very
consistently at the same location each time:
http://pastebin.com/wzE09MCi
Thread 1 (the USB buffer reading thread) hangs while waiting for the
data_write mutex lock in rtl_fm.c:642. I presume the thread has
exhausted its supply of data and isn't getting any more.
Thread 2 is slightly more interesting, because it hangs during the
libusb system calls within the tuning function rtlsdr_set_center_freq()
in librtlsdr.c:796. My guess is that it's making a blocking call that
never returns.
Can anyone else reproduce this bug?
Hi All,
Can anyone point me in the direction of a place I can go to submit patches?
I fixed a bug in CMake for gnuradio-osmosdr that stopped the build on my system.
Best,
Mike
I'm using rtl_fm in a scanning (nbfm) situation, and even with the antenna input disconnected and terminated, I get a burst of noise that sounds like a squelch tail maybe once each scan cycle. They're a little over 1 second apart. I've tried setting the squelch up to 10000 where I don't get signals anymore, but I still get the noise.
I discovered that if you scan and have the output be a raw audio file, nothing gets recorded when nothing breaks squelch. This makes time-lapse recordings where all the pauses are taken out. Very nice, except for the noise bursts.
I'm using:
#!/bin/sh
rtl_fm -N -S -l 132 -p 102 -f 854040000 -f 854240000 -f 854315000 -f 854415000 -f \
854490000 -f 854540000 -f 855165000 -f 855240000 -f 858790000 -f 867350000 \
-f 867715000 /tmp/`date +"%Y-%m-%d_%H%M.raw"`
and recorded overnight, but I mostly got noise bursts.
Alan
-----
Radio Astronomy - the ultimate DX
Hello!
This is a minor issue, but it may lead to some misunderstanding (and
search failures) in the future.
The package to correct IQ imbalance is registered at http://cgit.osmocom.org
as gr-iqbal, but the project name set in line 24 of the top-level
CMakeLists.txt is gr-iqbalance.
Which name should be used while building packages, etc.?
BTW: Is there any homepage for gr-iqbal/gr-iqbalance?
Wojciech Kazubski
rtlsdr in async mode is returning buffers full of data from previous calls.
I noticed that sometimes the first async callbacks return TOO
QUICKLY, so quickly that it would be impossible to have sampled that
data. This might be because libusb has previous transfers submitted
and not cleared?
I compiled rtl_test with a small modification that will print the
sampling time if it is much lower than expected.
BUF_LENGTH is (16 * 16384)=256K, and SAMPLE_RATE is 2048000, so a full
buffer should take about 64ms.
You can see that in this example the async callback returns after
7.79 ms (and the following one 1.47 ms later), so the data must be old.
Here is the output with the added ns printf:
nuno@ubuntu:~/Desktop/rtl-sdr/build/src$ ./rtl_test -p
Found 1 device(s):
0: ezcap USB 2.0 DVB-T/DAB/FM dongle
Using device 0: ezcap USB 2.0 DVB-T/DAB/FM dongle
Found Rafael Micro R820T tuner
Supported gain values (29): 0.0 0.9 1.4 2.7 3.7 7.7 8.7 12.5 14.4 15.7
16.6 19.7 20.7 22.9 25.4 28.0 29.7 32.8 33.8 36.4 37.2 38.6 40.2 42.1
43.4 43.9 44.5 48.0 49.6
Reporting PPM error measurement every 10 seconds...
Press ^C after a few minutes.
Reading samples in async mode...
ERROR ns: 7791886
ERROR ns: 9259301
lost at least 185 bytes
real sample rate: 2073369
real sample rate: 2048190
^CSignal caught, exiting!
User cancel, exiting...
Cumulative PPM error: -289
The modifications to demonstrate the issue:
nuno@ubuntu:~/Desktop/rtl-sdr/src$ git diff
diff --git a/src/rtl_test.c b/src/rtl_test.c
index f5a56b8..1935bfb 100644
--- a/src/rtl_test.c
+++ b/src/rtl_test.c
@@ -138,9 +138,11 @@ static void rtlsdr_callback(unsigned char *buf, uint32_t le
 	ppm_now.tv_sec = tv.tv_sec;
 	ppm_now.tv_nsec = tv.tv_usec*1000;
 #endif
-	if (ppm_now.tv_sec - ppm_recent.tv_sec > PPM_DURATION) {
-		ns = 1000000000L * (int64_t)(ppm_now.tv_sec - ppm_recent.tv_sec);
-		ns += (int64_t)(ppm_now.tv_nsec - ppm_recent.tv_nsec);
+	ns = 1000000000L * (int64_t)(ppm_now.tv_sec - ppm_recent.tv_sec);
+	ns += (int64_t)(ppm_now.tv_nsec - ppm_recent.tv_nsec);
+	if (ns < 60000000)
+		printf("ERROR ns: %9lld\n", ns);
+	if (ppm_now.tv_sec - ppm_recent.tv_sec > PPM_DURATION) {
 		printf("real sample rate: %i\n",
 			(int)((1000000000L * ppm_count / 2L) / ns));
 #ifndef __APPLE__
Nuno
Hi
I have an Elonics E4000 USB receiver, and I'm trying to do some
measurements of the received digital television power in my area.
Everything works fine, except that the received power is never lower than
-46 dB, even on channels which are not occupied. I was expecting to get
values like -120 dB or something similar; instead, in my measurements the
lowest point in any portion of the spectrum is circa -46 dB.
Am I missing something?
Thank you
Best
Hello,
I use rtl_sdr to collect some data into a binary file (file.bin).
Going by the wiki, I think it is parsed like this:
fd = open("file.bin", "r");
char I[];
char Q[];
loop:
    read(fd, I, 1);
    read(fd, Q, 1);
endloop
char *p = Q;
loop:
    printf("%f\n", (float*)p);
    p += 4;
endloop
But this code only prints 0.00000.
I don't know where the error is.
Please help me.
Thank you.
Hi,
I believe through this mail address I can reach the people who wrote the excellent
wiki on installing the Realtek receiver to receive ADS-B signals.
I am an avid Linux 'customer' but not a programmer. Yet, using the
instructions, I have been able to get a signal on my PC.
So thanks for a well written wiki!
Best regards,
Marco
PS: the most difficult issue to tackle was finding an address for this 'thank you' note ;-)
List,
As a new member here, having only recently acquired a few SDR units, I
am not sure if this is the right place to post the following. If not,
feel free to advise me of a better way (for reference, I've already
tried to mail a few of the key people here directly, but I guess that
was not the correct way to approach it).
R820T tuning problems
==============
As far as I can tell, the "correct" way to tune a RTL SDR device is to
call rtlsdr_set_center_freq, which in turn invokes the set_freq function
of the driver (mapped to r820t_set_freq which further calls
r820t_SetRfFreqHz in case of the R820T).
This may not always result in the desired frequency being tuned. In
some cases, such as with the current tuner_r820t.c driver, it can lead
to rather large "errors". For instance, trying to tune to 144.000 MHz
works as intended, but tuning to anything in the range 144.002 to
144.058 MHz will lead to a center frequency of 144.030 MHz, in other
words an error as large as 28 kHz. [If you apply a PPM value other
than 0, the exact range will be a little different.] I've also seen
cases where use of the "kalibrate" program to locate GSM channels has
resulted in one actual GSM channel being detected on 3 adjacent
channels, 2 of which are incorrect, and the calculated frequency
deviation is also often wrong on some channels. For GSM channels and
other frequencies above 885 MHz, the tuning "error" can be as much as
+/- ~225 kHz.
Now, the way for an application to verify the resulting tuning would be
to call rtlsdr_get_center_freq. However, in the current implementation,
this returns the requested tuning (adjusted with any calculated offsets
set in the librtlsdr.c driver, which is not applicable to the R820T)
and not the actual tuning.
Further, I've checked a couple of GUI programs on how they use the rtl
library. One (linrad) only calls the rtlsdr_set_center_freq function and
never checks the resulting tuning. Another (gqrx, via the osmo library)
does a similar thing, but has a "FIXME: Read back frequency?" comment.
The kalibrate suite does not check the actual tuned frequency either. So
all of these applications will report an incorrect frequency for the
spectrum, and the error depends on what center frequency you requested.
So -- unless I have overlooked something, which could be entirely
possible -- the current R820T driver sometimes detunes the requested
center frequency and the applications using it are not checking for
this. And if they had wanted to do so, they have no means to with the
current codebase.
Proposed "fix"
========
I understand the R820T driver is messy anyway and that its origin has
caused it to be what it is.
The problem above is caused by a few lines of code in tuner_r820t.c
around line 1460 that do "spur prevention". The idea is that if the
resulting VCO frequency is too close to a harmonic of the xtal/2
frequency, the VCO is detuned to run exactly on top of the harmonic, to
reduce any spurs within a (450/N) kHz bandwidth, where N is the MixDiv,
i.e. the divisor used to derive the center frequency from the VCO, which
runs between 1.77 and 3.54 GHz.
This probably makes a lot of sense for applications where the exact
tuning is not too important. Likely when using the stick for DVB-T as
originally intended, the later stages will compensate for the detuning
and just benefit from the reduced amount of spurs.
But for our use, it might make less sense and I guess most of the other
drivers have not implemented something like this anyway?
I can think of the following ways to fix this issue:
1) Disable the code that does the detuning, either temporarily (#if
0/#endif) or via some setup parameter so that an application can
specifically ask for the "detune" behaviour. This will possibly
introduce additional spurs on the R820T in the presence of strong
near-center-frequency signals, but will fix the problem for existing
SDR applications.
In case of an optional "setup" parameter, proper handling of the
detuning and reporting back to the calling application has to be made
available, and those applications then eventually need to implement some
way of handling it.
2) Merge with the possibly somewhat similar "offset" tuning method
implemented for some other devices -- although I'm not sure exactly how
that is intended to work and if it truly is compatible with the R820T
method.
3) Keep the code as is, but "tweak" the sampling in the 2832 device to
compensate for the offset tuning. So in other words, instead of assuming
the nominal IF of 3.57 MHz, tune to an offset IF. This will obviously
skew the BW and may not work for smaller sampling rates, where the
offset could actually be larger than what the sampling rate allows...
I would propose to simply disable the code for now (and I'm happy to
submit a small patch), as the detuning currently appears to introduce
more problems than it solves. A 2nd step would then be to add a proper
report-back abstraction/functionality to all drivers and make the
librtlsdr.c code fully aware of the detuning and allow it to be reported
back to the caller, and then ask the application developers to start
using the get_center_freq calls -- at least if they have asked for
"enable_spur_prevention" or something similar.
Thanks for any comments/advice on the above,
-- Per.
Hello!
For some time I have been unable to build gr-osmosdr with documentation enabled.
...
cmake -DCMAKE_INSTALL_PREFIX=/usr -DENABLE_DOXYGEN=1 ../
make -j2
...
The build fails with the following error:
...
[ 151s] [ 26%] Built target doxygen_target
[ 151s] Scanning dependencies of target osmosdr_swig_swig_doc
[ 151s] [ 30%] Generating doxygen xml for osmosdr_swig_doc docs
[ 151s] /bin/sh: osmosdr_swig_doc_swig_docs/Doxyfile: Permission denied
[ 151s] make[2]: *** [swig/osmosdr_swig_doc_swig_docs/xml/index.xml] Error
126
[ 151s] make[1]: *** [swig/CMakeFiles/osmosdr_swig_swig_doc.dir/all] Error 2
[ 151s] make[1]: *** Waiting for unfinished jobs....
...
This happens after this commit:
http://cgit.osmocom.org/gr-osmosdr/commit/?id=dc28f6c4c874e182557c8cb9fe8a0…
(the last one from Feb 16 2013)
To me, it looks like an attempt to run the Doxyfile directly instead of
processing it with doxygen.
Wojciech Kazubski
I have a stick with the R820T tuner.
Using SDR# and SDR-Radio V2, and tuning to FRS and airband
transmitters, I determined that my offset is consistently 54-63 PPM.
On the other hand, kalibrate-rtl reports my offset to be about -30 PPM,
also consistently across several tests.
Is there possibly any difference between 100 (airband), 400 (FRS) and
900ish (GSM) MHz?
I am in Portugal, so the network is probably GSM900, and I also don't
have something like the NOAA narrow channels to figure this out.
GSM-900:
chan: 10 (937.0MHz + 27.986kHz) power: 29372.03
chan: 30 (941.0MHz + 26.850kHz) power: 197782.82
chan: 50 (945.0MHz + 26.645kHz) power: 86521.18
chan: 62 (947.4MHz + 27.109kHz) power: 29501.15
chan: 74 (949.8MHz + 26.605kHz) power: 56078.03
chan: 107 (956.4MHz + 26.442kHz) power: 57536.25
chan: 112 (957.4MHz + 25.319kHz) power: 31011.95
chan: 121 (959.2MHz + 25.145kHz) power: 87002.34
E-GSM-900:
chan: 10 (937.0MHz + 27.514kHz) power: 27631.40
chan: 30 (941.0MHz + 26.395kHz) power: 211497.68
chan: 50 (945.0MHz + 26.209kHz) power: 88871.46
chan: 62 (947.4MHz + 26.629kHz) power: 27770.99
chan: 74 (949.8MHz + 26.181kHz) power: 56274.60
chan: 107 (956.4MHz + 25.984kHz) power: 47437.43
chan: 112 (957.4MHz + 24.878kHz) power: 25791.29
chan: 121 (959.2MHz + 24.765kHz) power: 98473.19
What can be going on here?
Also, what is the reason for the dropped samples when sampling above
2.8 MS/s? Is it an RTL2832 issue or something related to the USB driver?
Regards,
Nuno