pod.geraspora.de
Excerpt from a message I just posted in a #diaspora team internal forum category. The context here is that I recently got pinged about slowness/load spikes on the diaspora* project web infrastructure (Discourse, Wiki, the project website, ...), and looking at the traffic logs made me impressively angry.
In the last 60 days, the diaspora* web assets received 11.3 million requests. That works out to 2.19 req/s - which honestly isn't that much. I mean, it's more than your average personal blog, but nothing that my infrastructure shouldn't be able to handle.
However, here's what's grinding my fucking gears. Looking at the top user agent statistics, here are the leaders:
2.78 million requests - or 24.6% of all traffic - is coming from Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; GPTBot/1.2; +https://openai.com/gptbot).
1.69 million requests - 14.9% - Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/600.2.5 (KHTML, like Gecko) Version/8.0.2 Safari/600.2.5 (Amazonb...
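Numbers like those above come straight out of the access logs. As a sketch of how such a top-user-agent table can be produced from a combined-format nginx log (the `sample.log` file and its lines below are made-up illustrations, not the real diaspora* logs):

```shell
# Tally requests per user agent from a combined-format nginx access log.
# sample.log and its entries are fabricated examples for illustration.
cat > sample.log <<'EOF'
1.2.3.4 - - [19/Dec/2024:00:00:00 +0000] "GET /w/index.php HTTP/1.1" 200 1234 "-" "GPTBot/1.2"
1.2.3.4 - - [19/Dec/2024:00:00:01 +0000] "GET /w/index.php HTTP/1.1" 200 1234 "-" "GPTBot/1.2"
5.6.7.8 - - [19/Dec/2024:00:00:02 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 Chrome/131.0"
EOF
# In the combined log format, splitting on '"' puts the user agent in field 6.
awk -F'"' '{ua[$6]++} END {for (u in ua) printf "%d %s\n", ua[u], u}' sample.log | sort -rn
```

Run against a real log, the same one-liner (with the real file name) gives the request count per user agent, highest first.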
jwz gave the game away, so i’ll reveal:
the One Weird Trick for this week is that the bots pretend to be an old version of Chrome. So you can block on user agent
so I blocked old Chrome from hitting the expensive mediawiki call on rationalwiki and took our load average from 35 (unusable) to 0.8 (schweeet)
caution! this also blocks the archive sites, which pretend to be old chrome. I refined it to only block the expensive query on mediawiki, vary as appropriate.
nginx code:
# block some bot UAs for complex requests
# nginx doesn't do nested if, so we set a test variable
# if $BOT is both Complex and Old, block as bot
set $BOT "";
if ($uri ~* (/w/index.php)) { set $BOT "C"; }
if ($http_user_agent ~* (Chrome/[2-9])) { set $BOT "${BOT}O"; }
if ($http_user_agent ~* (Chrome/1[012])) { set $BOT "${BOT}O"; }
if ($http_user_agent ~* (Firefox/3)) { set $BOT "${BOT}O"; }
if ($http_user_agent ~* (MSIE)) { set $BOT "${BOT}O"; }
if ($BOT = "CO") { return 503; }
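If you want to sanity-check which user agents the "old browser" patterns catch before deploying, the same regex alternatives can be tried outside nginx with grep. This is a sketch; the UA strings below are illustrative examples, not taken from the logs, and it mirrors only the "O" (old) checks, not the URI check:

```shell
# Tag a user agent "old" if it matches the same patterns as the nginx
# $BOT "O" checks (case-insensitive, like nginx's ~* operator).
is_old_ua() {
  printf '%s' "$1" | grep -Eiq 'Chrome/[2-9]|Chrome/1[012]|Firefox/3|MSIE' \
    && echo old || echo new
}

is_old_ua "Mozilla/5.0 (X11; Linux) AppleWebKit/534.3 Chrome/6.0.472.63"     # old
is_old_ua "Mozilla/5.0 (Windows NT 10.0) AppleWebKit/537.36 Chrome/131.0.0.0" # new
```

Note the patterns are unanchored substring matches, so e.g. Chrome/120 is also caught by Chrome/1[012] - fine for this purpose, since no current browser ships those version numbers.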
you always return “503” not “403”, because 403 says “fuck off” but the scrapers are used to seeing 503 from servers they’ve flattened.
I give this trick at least another week.