I've ported my app to use rtl_tcp.exe instead of the native rtlsdr libs,
and while it is partly working (samples reading correctly, no dropped
samples, etc.), I have run into two issues.
First--when I start reading samples too quickly after setting the sample
rate, I get this error:
C:\Users\Scott
Cutler\Projects\p4\0\projects\sdr\SeeDeR\SeeDeR\rtl-sdr-release\x32>rtl_tcp.exe
-f 100000000 -s 2048000
Found 1 device(s).
Found Elonics E4000 tuner
Using Generic RTL2832U (e.g. hama nano)
Tuned to 100000000 Hz.
listening...
Use the device argument 'rtl_tcp=127.0.0.1:1234' in OsmoSDR (gr-osmosdr) source
to receive samples in GRC and control rtl_tcp parameters (frequency, gain, ...).
client accepted!
set freq 105000000
set sample rate 1920000
worker cond timeout
Signal caught, exiting!
comm recv socket error
Signal caught, exiting!
all threads dead..
listening...
Use the device argument 'rtl_tcp=127.0.0.1:1234' in OsmoSDR (gr-osmosdr) source
to receive samples in GRC and control rtl_tcp parameters (frequency, gain, ...).
If I instead wait about two seconds after setting the sample rate, there
is no problem. There is also no problem if I set the sample rate on the
command line and start reading samples immediately after connecting.
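For reference, here's a minimal Python sketch of the client logic I'm describing (not my actual app; the command IDs and 5-byte big-endian packet format come from rtl_tcp.c, and the two-second sleep is just the empirical workaround):

```python
import socket
import struct
import time

# rtl_tcp control protocol: each command is 5 bytes, a one-byte command
# ID followed by a 32-bit big-endian parameter (see rtl_tcp.c).
CMD_SET_FREQ = 0x01
CMD_SET_SAMPLE_RATE = 0x02

def rtl_tcp_cmd(cmd, param):
    """Build one 5-byte rtl_tcp command packet."""
    return struct.pack(">BI", cmd, param)

def connect_and_read(host="127.0.0.1", port=1234):
    s = socket.create_connection((host, port))
    s.sendall(rtl_tcp_cmd(CMD_SET_SAMPLE_RATE, 1920000))
    time.sleep(2.0)  # empirical: avoids the "worker cond timeout" exit
    s.sendall(rtl_tcp_cmd(CMD_SET_FREQ, 105000000))
    data = s.recv(16384)  # raw interleaved unsigned 8-bit I/Q samples
    s.close()
    return data
```

Without the sleep, the sequence above reproduces the "worker cond timeout" / "all threads dead" output from the log.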
The other issue is that tuning is very slow: about one second per call
to rtlsdr_set_center_freq. I had a similar problem with the rtlsdr.dll
library, where I found that tuning was very slow when performed from the
main thread, but fairly quick (maybe 10-20 Hz) when done from the async
callback thread that rtlsdr_read_async launches.
That seemed a bit odd to me, but it worked. I suspect that rtl_tcp.exe
is doing the same thing: calling rtlsdr_set_center_freq from the main
thread.
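The workaround I used with rtlsdr.dll, sketched generically in Python (the class and method names here are illustrative, not real librtlsdr API): the main thread only queues retune requests, and the actual set-frequency call happens on the async read thread, where it completes quickly.

```python
import queue

class DeferredTuner:
    """Defer retunes to the async read thread instead of the main thread."""

    def __init__(self, set_center_freq):
        # set_center_freq: assumed to be a wrapper (e.g. via ctypes)
        # around rtlsdr_set_center_freq(dev, freq); not shown here.
        self._set_center_freq = set_center_freq
        self._pending = queue.Queue()

    def request_tune(self, freq_hz):
        """Called from the main/UI thread; returns immediately."""
        self._pending.put(freq_hz)

    def on_async_block(self, samples):
        """Called from the rtlsdr_read_async callback thread."""
        while True:
            try:
                freq = self._pending.get_nowait()
            except queue.Empty:
                break
            self._set_center_freq(freq)  # fast when done from this thread
        # ... hand samples off to the DSP chain ...
```

If rtl_tcp.exe is indeed tuning from its main thread, moving the call into its worker thread the same way might fix the one-second-per-retune behavior.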
Ideas, anyone? Thanks!
-Scott
PS: I'm running Win7-64, in case it matters. I'm also using the latest
build from here:
http://sdr.osmocom.org/trac/attachment/wiki/rtl-sdr/RelWithDebInfo.zip