2012 has been an eventful year for SEO professionals. Perhaps it didn’t show quite as much upheaval as 2011’s string of Panda updates, which left SEOs shaken and fighting to recover, but it was eventful nonetheless. Google continued to shake up the industry, rolling out minor changes several times a month in a constant effort to suppress spammers and promote useful content.
As the year comes to a close and we look towards 2013 with hopeful eyes, it’s worth looking back at what has gone before. Google has some very definite trends in the works, and they will no doubt continue into the coming year, just as Panda and its ilk carried into 2012.
Panda, Penguin and Beyond
For the first three months of 2012, Google was relatively placid. They continued to update and expand the Panda algorithm, but for the most part the changes were mild. One update focused on local search results, which always matter for small businesses and their SEO. Beyond that, however, the next major update didn’t arrive until late April.
The Penguin update targeted many of the most prevalent black hat SEO techniques, in an effort to counteract “over-optimisation”: packing so many keywords and links into a single page that the legitimate content is eclipsed. Many of the sites using keyword and link spam were, of course, trying to abuse the system, and they were punished for it.
Through the year, both Penguin and Panda continued to be updated. Google also rolled out other changes here and there, including a widespread devaluation of low-quality exact match domain names.
Eliminating Scrapers and Spam
Google’s goal with all of these algorithm changes has always been to promote good quality content and devalue those who spam sites, copy content and game the results. The entire SEO industry must keep up with these changes if its marketing is to keep delivering.
2012 was a year of action against scraper pages. Much of Penguin was geared towards making it harder to copy content and profit from it. Keyword stuffing, even accidental, became grounds for a lower ranking. Likewise, a site with too many pages of too-similar content could see a drastic hit in ranking, a particular problem for stores with many near-identical product pages. Avoiding these techniques, and auditing existing pages against the new standards and guidelines, was an essential reorganization for affected sites.
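Both risk areas mentioned above, keyword stuffing and near-duplicate pages, can be checked mechanically before Google does it for you. Here is a minimal sketch of such an audit; the word-overlap measure is a rough signal only, and any thresholds you apply are your own assumptions, not Google’s published cut-offs:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that are exactly `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    return words.count(keyword.lower()) / len(words) if words else 0.0

def similarity(text_a: str, text_b: str) -> float:
    """Jaccard similarity of the two pages' word sets: a crude near-duplicate signal."""
    a = set(re.findall(r"[a-z']+", text_a.lower()))
    b = set(re.findall(r"[a-z']+", text_b.lower()))
    return len(a & b) / len(a | b) if (a | b) else 0.0

page = "cheap widgets cheap widgets buy cheap widgets today"
print(round(keyword_density(page, "cheap"), 2))  # 0.38, suspiciously high
```

Running both checks across a product catalogue flags the pages most likely to need rewriting: anything with an unusually high density for its target keyword, or an unusually high similarity to a sibling page.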
Focus on Quality Content
The other side of the Penguin updates is the motivation behind them. Google works hard to ban scrapers and low quality content, but what does it promote? The answer is high quality content. Pages that give users useful information should naturally rise to the top of search results, above pages that offer less information but know how to game the system.
Creating high quality content is more complicated than some might suspect. The rules begin at the very basics of the English language. Articles must have correct spelling and grammar. Formatting must make sense. Long walls of text should be broken up with paragraph breaks and subheadings. The easier it is for a searcher to scan for the information they want, the better.
High quality content extends to the site layout as well. Stuffing content with keywords no longer works, and can actively hurt a page’s ranking. Links need to flow naturally and point to other high quality sites. Meta details are still important, but the same rules against keyword stuffing apply to them. In general, it’s becoming harder to spoof the system.
Google AuthorRank and Google+
Social networks once had a measurable impact on a page’s ranking, but in 2012 Google lessened the influence that Facebook, Twitter and the like carry. Meanwhile, they are promoting Google+, their own social network. Any business needs a full and active Google+ account to take advantage of social networking.
Part of the reason Google emphasizes Google+ is because it’s a social network they control. They can use it as a resource for content creators, which ties into their upcoming AuthorRank system. Under AuthorRank, your site can gain benefits based on your own reputation. Conversely, content creators known to use black hat techniques will struggle much more to promote their content.
AuthorRank acts as a complement and a foil to the existing PageRank system. Traditional ranking scores a page against each query, so a page can hold a different position for every possible keyword, the vast majority of them irrelevant. AuthorRank is a single score that ties the content and the author together. If the content is high quality, the author gains a positive reputation, and that positive reputation benefits every page they write.
AuthorRank is not yet required for excellent SEO, but the smart SEO professional will begin implementing it early.
Here is a great blog post about AuthorRank: http://www.highervisibility.com/blog/seo-preparation-for-2013-build-up-your-author-rank/
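For implementing it early, the practical hook in 2012 is Google’s authorship markup: a rel="author" link from each page to the writer’s Google+ profile. A quick way to audit whether your pages carry that markup is a small parser; the sketch below uses only Python’s standard library, and the sample HTML and profile URL are illustrative, not real:

```python
from html.parser import HTMLParser

class AuthorLinkFinder(HTMLParser):
    """Collects href values from <a> and <link> tags that carry rel="author"."""
    def __init__(self):
        super().__init__()
        self.author_links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("a", "link") and "author" in (attrs.get("rel") or "").split():
            self.author_links.append(attrs.get("href"))

# Illustrative page fragment with authorship markup pointing at a Google+ profile.
html = '<head><link rel="author" href="https://plus.google.com/123456789"></head>'
finder = AuthorLinkFinder()
finder.feed(html)
print(finder.author_links)  # ['https://plus.google.com/123456789']
```

A page whose `author_links` list comes back empty has no authorship markup, and so can’t be tied to its writer’s profile at all.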
Quality Links: Incoming and Outgoing
Links have always been important, but now the quality of a link and of the site it comes from matters as much as the link’s mere existence. The days of networks of scraped blogs generating backlinks are gone; those kinds of low quality links now do more harm than good.
Analyzing your incoming backlinks is important, and if you can remove the low quality ones, you stand to benefit.
Maintaining a Constant Flow
Keeping up with the times is important in SEO in two ways. Evergreen content is good, but many searchers want constant relevance. A constant flow of new high quality content is essential for any SEO service. Google likes sites that keep themselves up to date with the latest useful information.
Likewise, keeping up with the trends of SEO is important for any SEO professional. Google makes many of its changes with little warning, and the field is truly adapt or die. For the large changes in the works, like a full implementation of AuthorRank, a head start is invaluable.