I've had problems with Yandex (and Baidu) in the past; they both hit harder than Google. Given that I had 0% interest in the Russian/Chinese market where I was, I blocked them. I tried using robots.txt first, but they didn't play well with it.
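For reference, this is roughly what the robots.txt attempt looks like; `Yandex` and `Baiduspider` are the documented user-agent tokens for those crawlers, though as noted above they don't always honour it:

```
User-agent: Yandex
Disallow: /

User-agent: Baiduspider
Disallow: /
```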
You might find this[^] discussion helpful. I went down the URL Rewrite route; it worked pretty well, but it needed IIS7 plus the module installed.
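For anyone trying the same route, a sketch of the kind of rule involved: a web.config fragment (assuming the IIS URL Rewrite module is installed) that returns 403 to requests whose User-Agent matches either crawler. The rule name is arbitrary:

```xml
<system.webServer>
  <rewrite>
    <rules>
      <!-- Block Yandex and Baidu by User-Agent before the request hits the app -->
      <rule name="BlockCrawlers" stopProcessing="true">
        <match url=".*" />
        <conditions logicalGrouping="MatchAny">
          <add input="{HTTP_USER_AGENT}" pattern="Yandex" />
          <add input="{HTTP_USER_AGENT}" pattern="Baiduspider" />
        </conditions>
        <action type="CustomResponse" statusCode="403" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```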
Thanks for the article; I need to know that stuff anyway. Google is actually pretty courteous, crawling at a rate of about one request every two minutes.
I think my strategy is wrong here. Perhaps for crawlers and bots, I should make them wait for the whole process to complete on a single thread. But for human users, I can put data writes on background threads so their experience is faster.
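That split could be sketched like this (a minimal illustration, not tied to any particular framework; the user-agent check is a crude substring match and `write_log` stands in for whatever the data write is):

```python
import re
import threading

# Crude crawler detection by User-Agent keywords (hypothetical list).
BOT_PATTERN = re.compile(r"bot|crawl|spider|slurp|yandex|baidu", re.IGNORECASE)

def is_crawler(user_agent: str) -> bool:
    return bool(BOT_PATTERN.search(user_agent or ""))

def handle_request(user_agent: str, write_log) -> str:
    if is_crawler(user_agent):
        # Crawlers wait for the whole write on this thread -
        # no point spending background threads on them.
        write_log()
        return "done (synchronous)"
    # Humans get the response straight away; the write finishes
    # on a background thread.
    threading.Thread(target=write_log, daemon=True).start()
    return "done (deferred)"
```

The upside is that a crawler hammering the site can only go as fast as the writes themselves, while real visitors never wait on them.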