Hi,
On 20.01.2013 11:48, Holger Hans Peter Freyther wrote:
> this is due to multiple web spiders (Bing, Baidu, ...)
> hitting the trac and cgit. Patches for the robots.txt
> are welcome.
I guess setting up Varnish would make more sense than blocking the
spiders; after all, we want the pages to be found by search engines...
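Something like this (an untested sketch, assuming Varnish 3.x sitting in
front of trac/cgit, which I'm assuming listen on localhost:8080 -- adjust
host and port to the real setup) could go into default.vcl:

    backend default {
        # assumed backend address; change to wherever trac/cgit listen
        .host = "127.0.0.1";
        .port = "8080";
    }

    sub vcl_fetch {
        # cache pages for a few minutes to absorb the spider load
        if (beresp.ttl <= 0s) {
            set beresp.ttl = 5m;
        }
    }

That should absorb most of the repeated spider hits without hiding the
pages from the search engines.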
Regards,
Steve