Pushing Bad Data - Google's Latest Black Eye



Google stopped counting, or at least publicly displaying, the number of pages it indexed in September 2005, after a school-yard "measuring contest" with rival Yahoo. That count topped out around 8 billion pages before it was removed from the homepage. News broke recently through various SEO forums that Google had suddenly, over the past few weeks, added another few billion pages to the index. This might sound like a reason for celebration, but this "accomplishment" would not reflect well on the search engine that achieved it.


What had people buzzing was the nature of the fresh few billion pages. They were blatant spam: they contained Pay-Per-Click (PPC) ads and scraped content, and in many cases they were showing up well in the search results, pushing out far older, more established sites in the process. A Google representative responded to the issue via forums, calling it a "bad data push," an explanation that was met with groans throughout the SEO community.


How did someone manage to dupe Google into indexing so many pages of spam in such a short period of time? I'll provide a high-level overview of the process, but...
