Yesterday our servers were being hammered by requests from a single IP address, which appears to belong to the Yahoo! indexer. Then today I read a tweet from Ben Nadel saying he was having the same thing happen, so I thought I would write a quick post in case others are running into the same issue.

On the Yahoo! Slurp page they suggest adding the following information to your robots.txt file.


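The snippet itself didn't make it into this post, but the directive Yahoo documents is Crawl-delay. A sketch of the entry (the delay value of 5 is just an example; pick whatever suits your server):

```
User-agent: Slurp
Crawl-delay: 5
```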
The number is how many seconds Slurp should wait before sending another request. This seemed to stop the problem, unless Slurp simply gave up. In doing more research I found there is also another entry you can make, although nothing seems to support it right now:

Visit-time: 1800-2330

I added it just in case.
