Google Plans Penalty for Over-Optimized Sites

For years search engine optimization (SEO) experts and other Web professionals have scoured Google updates in an effort to craft content that will put Web pages at the top of search engine results pages (SERPs), but now Google is set to penalize SEO perfection.

The plan to penalize the ranking results for “over-optimized” websites is the search giant’s attempt to impart more fairness in search results by putting websites with good content and bad optimization on par with those that simply have good optimization. The move was announced in early March during a conference at South By Southwest in Austin, Texas, and the rollout is expected to take place within the year.

As usual, Google is remaining tight-lipped over the precise details of the penalty and the exact date of the rollout, presumably to catch over-optimizers unawares. The penalty would appear to be one of Google’s biggest Catch-22s yet, however, because if experts creating legitimate content knew the definition of over-optimization they could work to avoid it, but then that would take the teeth out of the penalty as it applies to unscrupulous SEO operators.

Although an explicit definition will not be forthcoming, Google reps say the goal is to penalize abusers who stuff keywords into the text, use too many links or do anything that would fall outside what a “normal” person would expect to see in the content. The claim is that the goal is to make GoogleBot smarter, but the question then is who is defining “normal.”

Of course, many experts are speculating on what will be penalized and how to avoid being impacted. Highly optimized anchor text, repeated linking from body copy to a handful of key pages and domain names with generic keywords are expected to be targeted for penalization, although there is dispute in even these seemingly obvious predictions.

Some SEO experts believe the change could significantly disrupt their business, or even kill it. For those who share the attitude that the change is unfair – that genuinely good content should not be able to compete with the hours of effort it takes to optimize a page – it may be time to consider investing some of those hours in developing relevant content rather than massaging fluff.

The change may force Web experts to reexamine their approach to content, but this time around it could result in the development of superior Web information instead of encouraging new ways to “beat the system,” and that might translate into a big victory for Google users.

Dean Saliba

Dean Saliba is a freelance writer, professional blogger, media enthusiast, dirty football player and huge professional wrestling fan who covers a wide range of subjects and niches including making money online, traffic generation, pro wrestling, blog reviews, football, how-to guides, music, internet marketing and more.


  • Google has definitely been cracking the whip lately on black hat SEO tactics, but it should be good for legitimate content providers in the long run, and certainly for the userbase at large, which includes all of us really.


    • My thoughts exactly. If you are trying to run a legitimate site, it will mean the spammy and dodgy sites get pushed down to the bottom or even removed entirely. 🙂

Comments are closed.