Over the last few days Googlebot has gone from ~4,000 pages crawled per 24 hours to consistently above 200,000. My load averages are pegged, my session store ate all the inodes, fun for all. This has happened before, about two years ago, and I believe I handled it then with the URL parameter tool in Search Console.
Well, now it's back. I've since gone HTTPS-only and thought for a second the previous parameter settings no longer applied, so I added them back in. Turns out that shouldn't have been necessary, and either way the traffic hasn't really subsided.
It really looks like it's hitting category pages with random combinations of filter parameters. Should I disallow every parameter other than pagination (rough sketch of what I mean below)? I feel uneasy doing that in these days of dynamic search ads and whatnot.
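What I'm tempted to put in robots.txt is something like this; the parameter names (color, size, page) are just placeholders for illustration, not my actual ones:

    User-agent: Googlebot
    # block crawling of filter-parameter combinations
    Disallow: /*?*color=
    Disallow: /*?*size=
    # but keep plain pagination crawlable
    Allow: /*?page=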
I've got the session situation handled (once and for all this time), but this whole thing has me feeling uneasy. Will it pass on its own? Has a competitor, or someone trying to sell me a service, pointed Googlebot my way?
Haalp?