Recent Developments in SEO

For a long time, search engines found and ranked pages based largely on the prevalence of certain keywords. Those keywords appeared within the page’s own content and in links from other sites, where the keyword phrase served as the anchor text pointing back to the original page. This was the earliest function of backlinking, and it worked well for many years.

In fact, the system worked too well, making it easy to exploit. A site owner could choose a few keywords tied to high-paying AdSense ads, slap together some content around those words, and use software to generate hundreds or thousands of backlinks to that page. Each link acted as a vote, telling the search engines that the site was the best match for that keyword, and at the time the quantity of those links mattered far more than their quality. The sites hosting those backlinks were never designed to be read by humans; they existed solely to be found by search engine spiders.

The keyword-stuffed page would rise in the search rankings, gaining plenty of clicks and ad income for its developer. That was good news for some developers, but not for users: finding real, valuable information on a topic became increasingly difficult as the Web filled with these shallow sites.

To penalize these sites and allow truly valuable content to rank highly, Google set to work rebuilding the way its search worked. The algorithm has been refined through several major updates since 2011, including Panda, Penguin, and, most recently, Hummingbird. Each update brought something new to the table, but the underlying message was the same: Google would no longer tolerate sites with poor-quality content, and it would reward sites with authority.
