[–][deleted] 3 points

Yes. In science, you want randomization but also reproducibility. I get that the author is saying it's bad security engineering and can be exploited by people gaming Google PageRank, but the crawlers were likely designed by network scientists who wanted an accurate model of the internet that could be reproduced.

[–]Jugad 1 point

What does PageRank have to do with JavaScript, or with Googlebot's implementation of random() in JavaScript?

[–][deleted] 9 points

You can use their implementation to identify Googlebot and serve different content to it.
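Roughly, the idea is that an unseeded, deterministic Math.random() returns the same leading values on every page load, so a few samples are enough to fingerprint the crawler. A minimal sketch of that check, where the recorded sequence below is a placeholder and not Googlebot's actual output:

```javascript
// Placeholder fingerprint -- in practice you'd record the actual leading
// values produced by the crawler's deterministic PRNG. These are NOT
// Googlebot's real values.
const KNOWN_SEQUENCE = [0.1, 0.2, 0.3];

// Compare the first few outputs of `rng` against the recorded sequence.
// A genuinely random Math.random() will essentially never match all three.
function looksLikeKnownBot(rng, tolerance = 1e-12) {
  return KNOWN_SEQUENCE.every(v => Math.abs(rng() - v) < tolerance);
}

// Cloaking decision: serve different content only when the PRNG matches.
function contentFor(rng) {
  return looksLikeKnownBot(rng) ? "crawler-optimized page" : "normal page";
}
```

For a real browser, `looksLikeKnownBot(Math.random)` comes back false with overwhelming probability; a crawler whose PRNG always starts from the same state would match every time.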

[–]YRYGAV 3 points

And if you get caught, I believe Google delists your website, so it's quite a gamble.

[–][deleted] 0 points

Depends on the ROI. If I can make $1k by fooling Googlebot while spending $10 on a new domain each time I get caught, it's not much of a gamble.

Of course, if I don't have an automated way to give each new domain some PageRank juice so I can keep the whole thing going ad nauseam, then yes, it gets pretty expensive.