Hi!
Over the last few days I have noticed that the website is sometimes very slow. Requests take up to 30 seconds to be answered by the server.
http://bb.osmocom.org/ is fast, but http://bb.osmocom.org/trac/ is slow. Thus I suspect a Trac or database problem.
regards Klaus
On Sun, Jan 20, 2013 at 11:04:40AM +0100, Klaus Darilion wrote:
Hi,
Over the last few days I have noticed that the website is sometimes very slow. Requests take up to 30 seconds to be answered by the server.
http://bb.osmocom.org/ is fast, but http://bb.osmocom.org/trac/ is slow. Thus I suspect a Trac or database problem.
this is due to multiple web spiders (Bing, Baidu, ...) hitting Trac and cgit. Patches for the robots.txt are welcome.
holger
Hi,
this is due to multiple web spiders (Bing, Baidu, ...) hitting Trac and cgit. Patches for the robots.txt are welcome.
Put this in git.osmocom.org/robots.txt (currently it returns a 403):
---
User-agent: *
Disallow: /
---
IMHO, the git repositories should not be crawled at all.
Cheers,
Sylvain
Hi,
On 20.01.2013 11:48, Holger Hans Peter Freyther wrote:
this is due to multiple web spiders (Bing, Baidu, ...) hitting Trac and cgit. Patches for the robots.txt are welcome.
I guess setting up Varnish would make more sense than blocking spiders; after all, we want the pages to be found by search engines...
Regards, Steve
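A minimal Varnish configuration along these lines might look like the sketch below. This is an assumption, not the actual osmocom setup: the backend address, port, and TTL are placeholders, and the VCL 4.0 syntax shown here is for current Varnish releases (Varnish 3, current in 2013, spells some of this differently).

```vcl
vcl 4.0;

# Trac running behind Varnish on the same host (placeholder address/port).
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_recv {
    # Only cache anonymous GET requests for wiki pages; pass everything
    # else (search, diffs, logged-in users) straight through to Trac.
    if (req.method == "GET" && req.url ~ "^/trac/wiki" && !req.http.Cookie) {
        return (hash);
    }
    return (pass);
}

sub vcl_backend_response {
    # Keep rendered pages briefly, so crawler bursts hit Varnish, not Trac.
    set beresp.ttl = 5m;
}
```

With something like this, repeated spider hits on the same wiki page would be served from the cache instead of being re-rendered by Trac each time.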
I guess setting up Varnish would make more sense than blocking spiders; after all, we want the pages to be found by search engines...
Well, I think the problem is mostly when they try to index pages that are very expensive to generate (like search, or diffs between revisions ...), not when they browse standard wiki pages.
Cheers,
Sylvain
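A robots.txt along the lines Sylvain suggests, keeping the wiki indexable while excluding the expensive pages, might look like this. The paths below assume Trac's default URL layout; which endpoints actually cause load would need checking against the server logs.

```text
User-agent: *
Disallow: /trac/search
Disallow: /trac/changeset
Disallow: /trac/log
Disallow: /trac/browser
Disallow: /trac/timeline
```

Everything not listed (in particular /trac/wiki pages) stays crawlable by default.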
Holger Hans Peter Freyther wrote:
Patches for the robots.txt are welcome.
If not already done then I recommend moving all Trac databases off the toy SQLite database to MariaDB, Postgres or such.
Getting rid of SQLite has a significant impact on popular Trac instances, and makes running the database on a separate machine really easy, which further distributes the load.
//Peter
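For reference, Trac's database backend is selected via the `database` option in the `[trac]` section of trac.ini. The snippet below is a sketch: the user, password, host, and database name are placeholders, not osmocom's actual configuration.

```ini
[trac]
# Default file-based SQLite backend:
#database = sqlite:db/trac.db

# PostgreSQL backend, possibly on a separate machine
# (credentials and host are placeholders):
database = postgres://tracuser:secret@db.example.org:5432/trac
```

The existing data still has to be migrated from the SQLite file into the new database before switching this option.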
Hi Peter,
On Mon, Jan 21, 2013 at 05:01:27AM +0100, Peter Stuge wrote:
Holger Hans Peter Freyther wrote:
Patches for the robots.txt are welcome.
If not already done then I recommend moving all Trac databases off the toy SQLite database to MariaDB, Postgres or such.
Getting rid of SQLite has a significant impact on popular Trac instances, and makes running the database on a separate machine really easy, which further distributes the load.
We can do that, but I think the main issue is that the machine is simply overloaded with too many VMs, so high load on one of the VMs has a significant impact on the others. Doing an upgrade / re-organization is on my todo list, but it is a very low-priority item, as the slow web site is mostly a problem of convenience and not an urgent issue at this point.
Regards, Harald
On Tue, Jan 22, 2013 at 4:56 PM, Peter Stuge peter@stuge.se wrote:
Harald Welte wrote:
the slow web site is mostly a problem of convenience, and not an urgent issue at this point.
It *is* the public view of the osmocom projects. If the web site looks down, the projects look down. (Even though it's not the case.)
I also agree that the website is an important part of the project. Yeah, it's not critical, and one or two outages are fine, but it quickly becomes annoying if a website is slow or unavailable on a regular basis.
-- Regards, Alexander Chemeris. CEO, Fairwaves LLC / ООО УмРадио http://fairwaves.ru
Hi,
On 22.01.2013 13:56, Peter Stuge wrote:
It *is* the public view of the osmocom projects. If the web site looks down, the projects look down. (Even though it's not the case.)
I agree, especially since sometimes the site doesn't load at all. Browsing any of the osmocom-tracs at the moment feels like browsing via GPRS.
Regards, Steve
Hi, * Harald Welte laforge@gnumonks.org [2013-01-21 21:45]:
On Mon, Jan 21, 2013 at 05:01:27AM +0100, Peter Stuge wrote:
Holger Hans Peter Freyther wrote:
Patches for the robots.txt are welcome.
If not already done then I recommend moving all Trac databases off the toy SQLite database to MariaDB, Postgres or such.
Getting rid of SQLite has a significant impact on popular Trac instances, and makes running the database on a separate machine really easy, which further distributes the load.
We can do that, but I think the main issue is that the machine is simply overloaded with too many VMs, so high load on one of the VMs has a significant impact on the others. Doing an upgrade / re-organization is on my todo list, but it is a very low-priority item, as the slow web site is mostly a problem of convenience and not an urgent issue at this point.
I agree with Peter here and think this is not only about convenience. As I would assume most of the active developers rarely use the website, this is mostly about our public face to the community and to people who are interested in the project, and I think that is important. I would argue that this isn't a super-high-priority item, but given how long these problems have existed, I think we should start working on this.
I'm also willing to give a helping hand here if needed.
Cheers Nico
Hi,
this is due to multiple web spiders (Bing, Baidu, ...) hitting Trac and cgit. Patches for the robots.txt are welcome.
btw, could you send me the log file for such a case? I would guess they hit some high-resource pages that could easily be excluded by the robots.txt but were just missed in the 'deny' list ...
Cheers,
Sylvain
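The kind of log analysis Sylvain asks for can be done with standard tools. The sketch below uses a small inline sample in combined-log format in place of the real Apache access log (the bot names, IPs, and paths are made up for illustration); it counts which URLs the crawlers hit most.

```shell
# Build a tiny sample access log (stand-in for the real one).
cat > /tmp/sample_access.log <<'EOF'
1.2.3.4 - - [20/Jan/2013:10:00:00 +0100] "GET /trac/search?q=gsm HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; bingbot/2.0)"
5.6.7.8 - - [20/Jan/2013:10:00:05 +0100] "GET /trac/wiki/Start HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Baiduspider/2.0)"
1.2.3.4 - - [20/Jan/2013:10:00:09 +0100] "GET /trac/search?q=gsm HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; bingbot/2.0)"
EOF

# Keep only crawler requests, extract the URL (7th field in combined
# log format), and count requests per URL, most-hit first.
grep -Ei 'bingbot|baiduspider' /tmp/sample_access.log \
  | awk '{print $7}' \
  | sort | uniq -c | sort -rn
```

Run against the real log, a listing like this would show directly which Trac URLs to add to the robots.txt deny list.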
baseband-devel@lists.osmocom.org