Don’t Let Your Site Get Deranked by Google Algorithm Updates

By Hayden Hsu | September 10, 2012

Without going into details on how Google ranks our sites mathematically, let’s quickly skim over the surface and dive into how we can future-proof our sites against their new algorithm updates.

To understand how Google ranks your site visually, here’s a diagram drawn with our iD Workflow:

PageRank Explained with iD Workflow

Even though Site C has more links pointing to it than Site B, Site B ranks higher because the link to B comes from a highly ranked site, Site A. This increases the chances of Googlers landing on Site B, since it would appear higher on their search engine results page (SERP).
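The idea in the diagram can be sketched with a toy PageRank iteration. To be clear, this is a simplified illustration of the publicly known PageRank formula, not Google's actual (and secret) ranking system; the site names and link graph below are made up to mirror the diagram: one well-linked site A pointing at B, and three small sites pointing at C.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank for a dict {page: [outbound links]}."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page gets a small base rank, plus shares from its inbound links.
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outbound in links.items():
            if not outbound:
                # Dangling page (no outbound links): share its rank with everyone.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # A page's rank is split evenly among the pages it links to.
                for target in outbound:
                    new_rank[target] += damping * rank[page] / len(outbound)
        rank = new_rank
    return rank

graph = {
    "A": ["B"],          # the highly ranked site links to B
    "B": [], "C": [],
    "D": ["C"], "E": ["C"], "F": ["C"],   # three small sites link to C
    "G": ["A"], "H": ["A"], "I": ["A"],
    "J": ["A"], "K": ["A"],               # five sites make A highly ranked
}
ranks = pagerank(graph)
```

Running this, B ends up ranked above C even though C has three inbound links to B's one: the single link from the highly ranked A is worth more than three links from obscure sites.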

Tip: Your ranking can decrease as you add more outbound links, so if you create a link that you don't wish to count towards the target's ranking, use a 'nofollow' link.
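In markup terms, the difference is a single `rel` attribute on the anchor (the URL below is just a placeholder):

```html
<!-- A normal link passes ranking credit to the target page: -->
<a href="http://example.com/">A link that counts</a>

<!-- Adding rel="nofollow" tells search engines not to count it: -->
<a href="http://example.com/" rel="nofollow">A link that doesn't</a>
```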

If your site has a high ranking, good on ya, mate! However, do monitor your site frequently to ensure that it doesn’t get penalised by new algorithm updates, which get tweaked almost every week, sometimes multiple times a day… Crikey!

In order to serve better search results for Googlers, Google needs to filter out irrelevant and 'poor quality' sites, and, on many occasions, sites that abuse Google's algorithms to get higher rankings.

Google Panda

In February 2011, Google released an algorithm update designed by their engineer, Navneet Panda. They named this update Google Panda… not just because of his cuddly name, but also because, like a fighting machine, the Panda can destroy sites practising the 'black' arts and reward those practising the 'white' ones.

A Good and Bad Panda

An illustration that depicts a good and bad panda.

White Hat Search Engine Optimisation (White Hat SEO) = the totally legitimate way to create your site.
Black Hat SEO = the kind of tricks Google will penalise you for if you're busted.
There's actually a Grey Hat, too, which is somewhere in the middle, but still Panda-slappable.

Google's not going to release their behind-the-scenes algorithm secrets, but they have released a list of guidelines that will prevent your site from getting Panda-slapped. We have filtered the full list down to three important points for you below. The full list is beyond this nofollow-link, but do come back; else, we would miss you.

  • Does your site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
    People tend to miss this one because it also includes boilerplate content – e.g., that extra information you repeatedly insert in your right column.
  • Does your site have an excessive amount of ads that distract from or interfere with the main content?
Google later reinforced this crackdown in their Page Layout Algorithm update, which targeted sites with too many ads and too little content above the fold. Ironically, they have also raised the limit on the number of ads you can place above the fold in Google AdSense.

Above the Fold Ads on Google

  • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
Google can measure 'human' factors of your site, like user experience, design, and trust, by using their Google Toolbar data. They know the time people have spent on your site after arriving from a SERP, your site's bounce rate, and return visits. 'Thin' sites also became one of the main targets of a later algorithm update, called Google Penguin.

Tip: If you have 'thin' pages, improve their content. You can also use a robots meta tag to noindex them so search engine spiders won't index them.
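The noindex directive is a one-line meta tag:

```html
<!-- In the <head> of a thin page you don't want indexed: -->
<meta name="robots" content="noindex">
```

Unlike blocking the page in robots.txt, this still lets spiders crawl the page; they just won't add it to the index.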

Google Penguin

In April 2012, Google released another black-and-white friendly animal, a penguin this time, to prey on the rest of the 'black' sites. This time, there was no Mr Penguin (aww), just another deceptively cute-looking animal to carry on the black-and-white theme (ah). Whilst the Panda algorithm focuses on your onsite factors, Google Penguin focuses on your 'offsite' data. The penguin can tell if you are link spamming back to your site, and slap you hard (ooh). If you are still spamdexing, STOP! This includes the following:

  • Link Farms/Sybil Attack
    These are tightly-knit communities of pages (often on different domains) that link to each other.
  • Hidden Links
    Purposely hiding links on a page so they cannot be seen by humans but still show up for search engine spiders.
  • Spam Blogs and Blog/Forum/Comment/Wiki Spams
    Setting up blogs for the main purpose of spamming them with links to your site, or spamming other social sites with your links.
  • Mirror Sites
    Multiple sites with conceptually similar content but different URLs. Since February, Google has also integrated Latent Semantic Indexing (LSI), which is just a fancy name for 'understanding synonyms', so put down your thesaurus.

Don’t Get Slapped by the Panda and Penguin

Beware… be very aware that any site that falls foul of Panda and Penguin can get crudely wiped out by a site-wide, all-keyword handicap. Instead of playing foul and getting slapped by a future Google Zebra update (imaginary), focus on the good stuff. Unique, interesting, well-structured and quality content will make the animals happy. They will cuddle you back and give you brownie points.

To get some cuddles, make sure your design and user experience is clear and up to date. Our user experience consultant, Sarah, at The ADWEB Agency has written a handy article to help you get started.

Author: Hayden Hsu

Front End Developer and UI Designer at The ADWEB Agency.

