I'm using the latest fosphor from git
(7b6b9961bc2d9b84daeb42a5c8f8aeba293d207c) and am seeing two weird (and I
believe related) issues. Firstly, I see the following error:
[+] Selected device: TITAN X (Pascal)
[!] CL Error (-30, /home/user/gr-fosphor/lib/fosphor/cl.c:409): Unable to
queue clear of spectrum buffer
This is CL_INVALID_VALUE returning from clEnqueueFillBuffer, so I added
some debug fprintf calls to cl.c to see what parameters were being passed
into clEnqueueFillBuffer:
Edits:
fprintf(stderr, "size = %zu\n", 2 * 2 * sizeof(cl_float) * FOSPHOR_FFT_LEN);
fprintf(stderr, "pattern_size = %zu\n", sizeof(float));
fprintf(stderr, "pattern = %p\n", (void *)&noise_floor);
fprintf(stderr, "offset = %d\n", 0);
Output:
size = 16384
pattern_size = 4
pattern = 0x7fb66b7fdd2c
offset = 0
These parameters look like they shouldn't cause CL_INVALID_VALUE:
https://www.khronos.org/registry/OpenCL/sdk/2.0/docs/man/xhtml/clEnqueueFil…
But there is one condition that might be met: that size (16384) is somehow
larger than the underlying buffer (cl->mem_spectrum). The possibility of
the underlying OpenGL buffer being too small brings me to my next (I
believe related) issue. The spectrum plot in fosphor is weirdly pixelated;
please see the attachment, which shows a screencap from "osmocom_fft -F
-f 100e6 -g 20 -s 10e6".
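One way I plan to test that theory: the size-related CL_INVALID_VALUE conditions for clEnqueueFillBuffer can be checked against the real buffer size, which clGetMemObjectInfo(cl->mem_spectrum, CL_MEM_SIZE, ...) will report. A host-side sketch of those conditions as I read the OpenCL 1.2 man page (fill_args_valid is a made-up helper, not fosphor code):

```c
#include <stdbool.h>
#include <stddef.h>

/* Size-related CL_INVALID_VALUE conditions for clEnqueueFillBuffer:
 * pattern_size must be a power of two (up to 128 bytes), offset and
 * size must be multiples of pattern_size, and the filled region must
 * fit inside the buffer. buf_size would come from clGetMemObjectInfo
 * with CL_MEM_SIZE. */
static bool fill_args_valid(size_t buf_size, size_t offset,
                            size_t size, size_t pattern_size)
{
    if (pattern_size == 0 || pattern_size > 128)
        return false;
    if (pattern_size & (pattern_size - 1))  /* not a power of two */
        return false;
    if (offset % pattern_size != 0 || size % pattern_size != 0)
        return false;
    return offset + size <= buf_size;
}
```

If the check with the queried size and (0, 16384, 4) comes back false, the spectrum buffer really is smaller than the 16384 bytes the fill expects, which would also fit the pixelated display.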
Where is the cl->mem_spectrum buffer ultimately declared / initialized? My
OpenCL / OpenGL sharing knowledge is nonexistent, any pointers for how I
can help debug this issue?
Output of clinfo is attached as well, and I'm on Ubuntu 16.04 on x86_64.
--
Raj Bhattacharjea, PhD
Georgia Tech Research Institute
Information and Communications Laboratory
http://www.prism.gatech.edu/~rb288/
404.407.6622
Hello,
It's my first mail on this list, so please forgive me if I do something
wrong.
I'm about to post a couple of patches for RTL drivers:
1. RTL-SDR: convert _lut to float[] to reduce its size by a factor of 256.
The _lut is currently indexed by the packed I/Q byte pair (16 bits = 65536
entries); however, both samples can be processed independently, which
allows an 8-bit (256-entry) LUT. This saves a bit of RAM and CPU cache.
lib/rtl/rtl_source_c.cc | 19 ++++++-------------
lib/rtl/rtl_source_c.h | 4 ++--
2 files changed, 8 insertions(+), 15 deletions(-)
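For readers following along, the shape of change 1 is roughly this (a simplified sketch, not the actual patch; the exact scaling used by rtl_source_c.cc is an assumption here):

```c
#include <stddef.h>

/* Before: one 65536-entry complex LUT indexed by the packed (I,Q) byte
 * pair. After: one 256-entry float LUT, indexed by I and Q separately.
 * The (x - 127.5) / 127.5 scaling is illustrative, not necessarily
 * what rtl_source_c.cc uses. */
static float lut[256];

static void lut_init(void)
{
    for (int i = 0; i < 256; i++)
        lut[i] = ((float)i - 127.5f) / 127.5f;
}

/* Convert n interleaved I/Q byte pairs into float pairs. */
static void u8_iq_to_float(const unsigned char *in, float *out, size_t n)
{
    for (size_t k = 0; k < n; k++) {
        out[2 * k]     = lut[in[2 * k]];      /* I */
        out[2 * k + 1] = lut[in[2 * k + 1]];  /* Q */
    }
}
```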
2. RTL-TCP: Convert to single class model
The existing RTL TCP driver is quite different from its brother RTL_SDR.
It's much more complicated, uses gr::blocks::deinterleave and
gr::blocks::float_to_complex, and generally doesn't work correctly
(e.g. gqrx issue #99, "Spectrum is mirrored when filter or demodulator
changes (rtl_tcp)": https://github.com/csete/gqrx/issues/99).
I've converted the RTL TCP driver to the model used by RTL_SDR,
simplifying it in the process, and fixing the GQRX issue.
lib/rtl_tcp/CMakeLists.txt | 1 -
lib/rtl_tcp/rtl_tcp_source_c.cc | 352 ++++++++++++++++++++++++++++++++--------
lib/rtl_tcp/rtl_tcp_source_c.h | 32 +++-
lib/rtl_tcp/rtl_tcp_source_f.cc | 327 -------------------------------------
lib/rtl_tcp/rtl_tcp_source_f.h | 125 --------------
5 files changed, 309 insertions(+), 528 deletions(-)
I'm also thinking about merging the code common to RTL-SDR and RTL-TCP,
but this isn't done yet.
Comments?
--
Krzysztof Halasa
Hello,
I use rtl_fm on a Raspberry Pi 3. I almost always notice a weak carrier
accompanying the received signal in AM mode, and interference in NFM
mode. When I use rtl_tcp with SDR# or SDR Console, the demodulation is
perfect. I assume that IQ amplitude/phase correction is involved. I saw
this in the rtlsdr source:
/* IQ estimation / compensation (en_iq_comp, en_iq_est) */
rtlsdr_demod_write_reg(dev, 1, 0xb1, 0x1b, 1);
I suppose this uses an internal chipset function? Is this function good
enough? I did not find anything about it in rtl_fm, but I'm not a
specialist. Is an improvement possible?
Regards,
Arnaud
Hi,
I get a floating point exception when I run rtl_tcp. I traced the error
with GDB to line 515 of tuner_r82xx.c
if (vco_fra > (2 * pll_ref_khz / n_sdm))
I found that n_sdm was 0.
I don't understand how this code works and I'm very new to this software,
but it sure looks like a programming bug.
513         /* sdm calculator */
514         while (vco_fra > 1) {
515             if (vco_fra > (2 * pll_ref_khz / n_sdm)) {
516                 sdm = sdm + 32768 / (n_sdm / 2);
517                 vco_fra = vco_fra - 2 * pll_ref_khz / n_sdm;
518                 if (n_sdm >= 0x8000)
519                     break;
520             }
521             n_sdm <<= 1;
522         }
If the condition on line 515 ever evaluates to false, vco_fra doesn't get
updated. The loop then keeps repeating with the same value of vco_fra,
shifting n_sdm left until it becomes 0 and the division on line 515 faults.
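One possible fix: move the termination check out of the inner if, so n_sdm can never shift past 0x8000 and wrap to 0. A sketch (whether this preserves the intended PLL behaviour would need checking against the R820T docs; it only demonstrates the guard):

```c
#include <stdint.h>

/* Guarded version of the r82xx sdm loop: the overflow check runs on
 * every iteration, before the shift, so the loop always terminates
 * and n_sdm never reaches 0. */
static uint32_t sdm_calc(uint32_t vco_fra, uint32_t pll_ref_khz)
{
    uint32_t n_sdm = 2;
    uint32_t sdm = 0;

    while (vco_fra > 1) {
        if (vco_fra > (2 * pll_ref_khz / n_sdm)) {
            sdm += 32768 / (n_sdm / 2);
            vco_fra -= 2 * pll_ref_khz / n_sdm;
        }
        if (n_sdm >= 0x8000)   /* guard BEFORE shifting */
            break;
        n_sdm <<= 1;
    }
    return sdm;
}
```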
Steve
Hi All,
I would like to develop an algorithm to build the best possible DVB-T
SDR experience. The idea is that, no matter how many SDRs and what kind
of antenna you have, the software gets the best out of your hardware.
This is what I have thought:
The user community of a system depends on how useful it is, how easy it
is to use and build, and how much it costs. RTL-SDRs are quite cheap,
but for now a user gains no benefit from having multiple SDRs in their
system.
That is why I'm searching for a way to correlate the signals of the SDRs
without hardware modification. I think everyone of you has seen the noise
from the power source. Has anyone tried to build a filter that uses that
noise to calculate the delay between multiple dongles? I mean, they pick
up the same noise. I don't know if it is sufficient, but the power supply
is also a good spot to inject some artificial noise, as it can be easily
accessed and does not interfere with the RF circuitry when it is switched
off. Is that a useful approach?
Do you think this could work? Did I miss something?
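The delay estimation I have in mind would cross-correlate the two streams and take the lag with the maximum correlation. A brute-force sketch on real-valued samples (a real implementation would work on complex baseband and FFT-accelerate this):

```c
#include <math.h>
#include <stddef.h>

/* Estimate the sample delay between two dongles by finding the lag
 * that maximizes the cross-correlation of a signal both of them see
 * (e.g. the shared power-supply noise). O(n * max_lag), time domain. */
static int estimate_delay(const float *a, const float *b,
                          size_t n, int max_lag)
{
    int best_lag = 0;
    float best = -INFINITY;

    for (int lag = -max_lag; lag <= max_lag; lag++) {
        float acc = 0.0f;
        for (size_t i = 0; i < n; i++) {
            long j = (long)i + lag;
            if (j >= 0 && (size_t)j < n)
                acc += a[i] * b[j];  /* correlate a against shifted b */
        }
        if (acc > best) {
            best = acc;
            best_lag = lag;
        }
    }
    return best_lag;
}
```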
regards,
Steve
Hello
We are doing an ADS-B receiver senior project.
Can we have the building blocks in GNU Radio for that?
Please help us.
We are using the HackRF One hardware.
Which is the osmocom source in GNU Radio?
Thanks
Hello,
I recently purchased a NooElec nesdr mini dongle. I want to install it on
Linux Mint 17.1. I have rtl-sdr installed. I have blacklisted the
dvb_usb_rtl28xxu driver. When I run rtl_test it reports that the dongle is
present and sending. When I run rtl_sdr -t to check the tuning range it
reports no E4000 tuner found.
I called NooElec for tech support. I got a link for the Linux install,
but it returned a "page not found" error. I tried to get on their mailing
list from their page and got a 502 Bad Gateway error. It looks like
support may be difficult to get through NooElec.
Does this error indicate a defective RTL dongle? Where can I find the
install procedure for the NooElec R820T NESDR Mini? All responses are
appreciated.
Thank you,
Marty, N3MOW
N3mow(a)arrl.net, mmcg29440(a)frontier.net
So since there are now three radios (that I know of) that have hardware
Bias-T support (HackRF, RTL-SDR, Airspy), I added set_biast and get_biast
methods to the default Source API. I also added Airspy and HackRF bindings
to enable/disable the Bias-T. The RTL-SDR support isn't merged into
mainstream yet, so I didn't add anything for it.
There are also default implementations (doing nothing for set, and
returning false for get), so nothing breaks in current sources or
applications.
--
Lucas Teske
Teske Virtual System
GPG: 4A90 974B ACE0 A9A6 AF09 B3B1 6C39 C1C1 6A9D A7BE
I have a panadapter program, written in Python, that throws an error:
"undefined symbol Set_Tuner_Bandwith in librtlsdr0". Searching the net,
I've found that set_tuner_bandwidth may have been dropped from the
library. I'm using version 0.5.3. One web site said that it had been
added back in the latest revision, v0.7.0. I tried loading the new
library and now find that rtl-sdr requires dependencies found in
librtlsdr0 version 0.5.3. Is there a workaround to eliminate this error?
I did download the latest rtl-sdr, but it will not compile because of
the dependency problem.
I did not write the program, so I hesitate to take it apart, and I have
not received much help from the author. BTW, I'm not the greatest Linux
person, so I need clear answers. Having said that, all responses will be
appreciated.
73
Marty, N3MOW
I have been playing with the rtl_fm program and most of
it works amazingly well but I can not seem to get the squelch to
stay open when receiving signals.
I have not tried to analyze the source yet so I am asking
what principle drives this squelch?
In the analog world, the best squelches for FM receivers
tend to be noise-driven and use a high-pass filter to filter out
normal audio. When the noise drops below a preset threshold, the
squelch opens.
For AM receivers, a cheap and easy solution is to monitor
the AGC and open the squelch when there is AGC voltage above a
preset level.
What I am noticing is that if I set the -l value to a point just above
where the noise stops, signals do open the squelch, but even strong
signals will not keep it from flickering on and off constantly.
If I set the -l value any lower, the squelch is always open, so
that is not the issue.
I have tried signals that are absolutely full-quieting,
with and without CTCSS and the squelch opens briefly, closes for
a fraction of a second, opens for another fraction of a second
and randomly flickers on and off for the whole transmission.
On rare occasions, the squelch opens when the signal
starts, stays open and then closes properly after the carrier
leaves.
Some of these signals are even ever so slightly noisy and I have
heard this situation with and without PL tones or CTCSS so that
doesn't seem to matter.
Finally, I thought it might have something to do with too narrow a
bandwidth, so I increased the sampling rate to 24k, which made no
difference at all.
If the signal has voice on it, the flickering doesn't seem to be
affected by the words.
Basically, what is this squelch responding to that keeps it flapping?
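From what I can tell (I have not fully read the source), rtl_fm's squelch is a simple per-block power estimate compared against the -l threshold, not a noise squelch like the analog designs above; with no hysteresis or hang time, a level hovering around the threshold will flap exactly as I describe. A sketch of the kind of squelch I would expect to fix it (generic, not rtl_fm's actual code):

```c
/* Power squelch with hysteresis and hang time: the gate opens when the
 * block power exceeds open_thr, and only closes after the power stays
 * below close_thr for `hang` consecutive blocks. Levels between the
 * two thresholds leave the gate in its current state. */
typedef struct {
    int open;          /* current gate state (1 = audio passes) */
    int low_count;     /* consecutive blocks below close_thr */
    float open_thr;    /* level that opens the gate */
    float close_thr;   /* level below which closing is considered */
    int hang;          /* blocks of low power required to close */
} squelch_t;

static int squelch_process(squelch_t *s, float power)
{
    if (power > s->open_thr) {
        s->open = 1;
        s->low_count = 0;
    } else if (power < s->close_thr) {
        if (++s->low_count >= s->hang)
            s->open = 0;
    }
    return s->open;
}
```

With open_thr above the noise floor and close_thr below it, a full-quieting signal that dips momentarily can no longer close the gate, which is the flapping behavior I am seeing.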
Martin McCormick WB5AGZ