SEO - Penguin, negative SEO & Google’s disavow bad link removal tool

19/10/2012 4:12:12 PM - Posted by Paul S - SEO

Within the last two years Google has made a lot of drastic changes to its search engine algorithm, evolving how results are displayed, what is displayed and, more importantly, what isn't displayed.

These updates have gone under the banners of Panda and Penguin, a duo forming what some call the Google Zoo.

While Panda focuses on website content issues, Penguin takes on low-quality links pointing to your site that are categorised as spam.

The issue here is that many are not sure whether they have been impacted by Panda (onsite issues) or Penguin (offsite issues). This can be quite dangerous, as the wrong analysis and actions will inevitably cause you more grief.

Without further ado

The game changer here is that Google previously ignored spam-type links and simply didn't give you any benefit from them. However, the Penguin update, initially released on 24 April 2012, marked the introduction of negative SEO.

This was a mixture of humans hired by Google to identify bad link sources and the first automated version being rolled out. The outcome was an endless number of sites that either dropped or vanished from Google search results, followed by a great deal of controversy.

Many came to the realisation that not being aware the links they purchased were bad or spammy didn't make them exempt from being penalised. In some cases the penalty may have been caused by footer links or blog posts.

What now?

To compound the situation, Google continues to roll out Panda and Penguin updates independently. This has made it difficult for website owners and marketers experiencing issues to identify the exact cause and the actions required to correct it.

Many site owners will have received notifications from Google Webmaster Tools stating their site was using techniques outside of Google's webmaster guidelines, focused on manipulating search results. Unfortunately this kind of generic notification didn't assist in isolating the cause, leaving the door open to either or both of Google's Zoo duo.

For more information read our Google Panda and Penguin updates explained with tips article.

At criticone, we have seen cases where websites have been affected by Panda for duplicate content and by Penguin for bad links. This has been to varying degrees, further amplifying why it's important to understand the cause prior to acting.

Another thing to keep in mind is that it takes time for Google to update removed links and duplicate-content triggers, so don't be surprised if it takes you months to recover. In addition, recovery doesn't necessarily mean you will end up where you used to show up in Google.

Identifying the cause

So let's say you've been able to correct any onsite issues but still need to review the external links that point to your website.

Your first stop should be Google Webmaster Tools. Once you have created an account and verified that you own the site, go to your domain name > Traffic > Links to your site, where you will be able to view and export to Excel a decent extract of:
  • sites and pages linking to your website
  • a breakdown of keywords used in the links to your site
Although these numbers won't be exact, they will provide you with good insight. You can then visit the pages and quickly evaluate whether:
  • you have links from spammy-looking pages
  • you have too many links from within individual websites
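As an illustrative sketch only (the URLs below are invented, and the real Webmaster Tools export will need a little massaging first), the second check, too many links from within individual websites, can be done mechanically by tallying your exported link list per domain:

```python
from collections import Counter
from urllib.parse import urlparse

def links_per_domain(linking_urls):
    """Count how many inbound links come from each linking domain."""
    return Counter(urlparse(u).netloc for u in linking_urls)

# Sample export: three links from one blog, one from a directory
links = [
    "http://blog-a.example/post1",
    "http://blog-a.example/post2",
    "http://blog-a.example/post3",
    "http://directory.example/listing",
]
counts = links_per_domain(links)
# Domains supplying an outsized share of links warrant a closer look
suspect = [domain for domain, n in counts.items() if n >= 3]
print(suspect)  # ['blog-a.example']
```

The threshold of 3 here is arbitrary; in practice you would judge the count against the size of your overall link profile.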

In addition, reviewing the variety of keywords used to link to your site should provide clues to what looks natural or spammy.
If you had 1,000 independent people link to your site, would they all use the exact same 10 phrases and spelling? This is called exact match and should be avoided; aim instead for a good, natural-looking mixture of themed keyword phrases.
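To make the exact-match idea concrete, here is a minimal sketch (the sample anchor texts are invented) that measures how concentrated your anchor phrases are; a single phrase dominating the profile is the pattern to avoid:

```python
from collections import Counter

def top_anchor_share(anchors):
    """Return the share of links held by the single most common anchor phrase."""
    counts = Counter(a.lower() for a in anchors)
    return counts.most_common(1)[0][1] / len(anchors)

# 8 of 10 links use the identical phrase: unnaturally concentrated
anchors = ["cheap widgets"] * 8 + ["Acme Widgets homepage", "widget reviews"]
share = top_anchor_share(anchors)
print(f"{share:.0%}")  # 80%
```

There is no official cut-off; the point is simply that a natural profile spreads across many varied phrases rather than repeating a few exact ones.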

Beyond this, if you need to identify the quality of specific pages that link to you, do not rely on PageRank (PR). You should assess inbound links by:
  1. subject theme and relevancy to your topic, keywords and industry
  2. page quality
  3. overall website substance and quality
  4. the number of outbound links (more than 10 from a given page should be questioned)
If pages look fabricated with little human use or attraction, it's likely their purpose is to cheat the Google system. Given Google is constantly evolving, your best action would be to request that the link be removed.
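Point 4 above is easy to check mechanically. A small sketch using Python's built-in HTML parser (the sample page below is fabricated for illustration):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count <a href> tags on a page's HTML."""
    def __init__(self):
        super().__init__()
        self.outbound = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.outbound += 1

# A fabricated link-farm style page with 12 outbound links
page = "<body>" + "".join(
    f'<a href="http://site{i}.example">link</a>' for i in range(12)
) + "</body>"

counter = LinkCounter()
counter.feed(page)
print(counter.outbound)  # 12 - more than 10, so this page is worth questioning
```

In practice you would fetch the linking page's HTML first and, as the article notes, treat a high count as a prompt for human review rather than an automatic verdict.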

Make sure you review the page along with the linking keyword, as the issue might be too many exact-match keywords rather than the page the link is coming from.

I'm confused, what is a link?

A link is clickable text or an image on a third-party website, that is, a website you don't control. Search engines such as Google use links as indicators of how popular your website is for a given phrase.
You will find links in;

  • blog comments
  • legitimate or fake blog articles
  • website footer links
  • directories
  • ezine type sites

Why request removal of links you paid for?

SEO is evolving and becoming part of a bigger online marketing picture. Just as traditional marketing leaves a brand footprint, SEO is a long-term initiative that leaves a technical footprint.

No marketing is perfect; however, any big mistakes should be cleaned up, reducing current and future impact to your online and subsequent bricks-and-mortar business.

Your goal should be to remove as many bad links pointing to your site as possible.
Today it is Google indexing these links and causing you problems; tomorrow it could be another search engine or social media site. Ultimately we need to clean up our own backyard, which means having a clean online profile. Any SEO work performed should be invisible, meaning it should look natural.
Fifteen links from one site, or links using the exact same 5 keyword phrases, doesn't present a natural-looking profile.

We have successfully had links removed for ourselves and clients by tracking down website owners.
In addition, if you are seeking removal of articles you created and submitted, in our experience a DMCA takedown provides a more compelling removal request.

Great news: Google has now released the disavow links tool, right?

As Google itself states, the disavow tool should be a last resort, used only if you're having trouble removing a small number of remaining bad links.
It's vital that you be 200% sure you have identified bad links, and that they are impacting your site, before submitting them via the disavow tool in Webmaster Tools.
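For reference, the file the disavow tool accepts is plain text, one entry per line: lines starting with # are comments, a bare URL disavows a single page, and a domain: prefix disavows every link from that domain. The domains below are placeholders, not real examples:

```
# Contacted site owner on 19/10/2012 requesting removal, no response
domain:spamdomain.example
# Single spammy page we could not get removed
http://anotherdomain.example/bad-page.html
```

Keeping the comment lines as a record of your removal attempts is a sensible habit, since they document the effort you made before resorting to disavowal.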

As a warning: a bad link is a combination of the page and the keyword when looking at your whole link profile. If the page doesn't look spammy, changing the keyword could rectify the issue by creating a more diverse link/keyword profile.

Reporting a page via disavow could lose you a good link, and could impact other sites that have links on that page as part of a more natural-looking overall link and keyword profile.

Removing good links can impact where your site comes up in Google. Trying to reverse removal requests will take much longer to process, with unknown outcomes.

Google’s best spam reporting tool

The word on forums is that the disavow tool is another stroke of genius from Google.
Apart from letting you report what you think are spammy link sites, it allows all webmasters to do the same.

So why is this bad?

Given that links are still one of the strongest influencers of where a website sits in Google’s search results, the biggest challenge has been scoring the quality of link sources and knowing what to discard.

Using the disavow tool, Google in essence has recruited all of us to feed its search engine algorithm by telling it what sites are good or bad and which ones have paid links on them!

While many predict this will be a powerful signal used by Google to influence website positioning in search results, it’s not hard to see why this could be bad for SEO.
