Keeping up with the Pandas: SEO & Content Development

December 11, 2014 - Digital State Marketing

It’s been a few months since Google introduced Panda 4.0 and the dust has well and truly settled. So what have we learned? Who have been the winners and losers? And, if your website has been hit, what can you do to reverse your fortunes?

“4.0” is the latest version of Google’s spam-busting, content-probing and anti-ad algorithm. A couple of days prior to its release, Google also updated its “Payday Loan” algorithm in another effort to more accurately target spammy SERP results. So, it’s been a busy time for search marketers, an anxious time for small “honest” brands and, frankly, a rough time for websites that attempt to cheat Google’s system.

photo courtesy of Shintaro Kakutani (Creative Commons)

For the past three years, Google has been regularly modifying Panda in a series of incremental updates, and its webspam team has become incredibly efficient at cracking down on undesirable “black hat” SEO tactics. Panda 4.0 is the newest tool to help eradicate spammy online content, punish websites plastered with meaningless advertising and, ultimately, ensure the best possible web experience for users.

Panda 4.0: what’s it all about?  

Google is at war against spam. Its algorithmic intricacies have always been kept under wraps; however, earlier this year the head of Google’s webspam team, Matt Cutts, announced that they were working on a “softer” version of Panda, intended to help smaller brands do better in search engine results.

4.0 is the largest update since Panda was first launched in February 2011. It affected approximately 7.5% of English-language search queries, seeking to prevent low-quality and thin content from ranking highly in search results.

Who are the winners and losers?

Small website owners have (or should have) been busy monitoring their organic search visits and Google Analytics statistics. There has been clear disruption in some industries. “ebay.com”, “ask.com” and “biography.com” all witnessed a 33% reduction in traffic (although eBay’s drop may have been the result of a manual action), while niche sites such as “emedicinehealth.com” and “medterms.com” experienced a whopping 500% increase in traffic.

Panda 4.0 also seems to have targeted press release sites. The press release distribution website “prnewswire.com”, for example, has shown a 63% drop in visibility, “prweb.com” experienced a 71% drop in traffic and “businesswire.com” suffered a 60% decline. So, fairly unpleasant reading for those concerned.

In short, PR sites have been hit hard because they tend to adopt spammy link-building tactics. They are often saturated with owned content and low-quality links; not the best way to make friends with Google! Indeed, last year Google showed its intentions by releasing a new webmaster guideline encouraging the removal (or “nofollowing”) of links with optimised anchor text in press releases.
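
For context, the usual way to comply with that guideline is to mark press-release links as non-endorsing with the rel="nofollow" attribute, which stops them passing ranking credit. A minimal sketch (the URL and anchor text are hypothetical):

```html
<!-- A press-release link with optimised anchor text, marked rel="nofollow"
     so it passes no ranking credit (URL and anchor text are illustrative) -->
<a href="https://www.example-brand.com/blue-widgets" rel="nofollow">cheap blue widgets</a>
```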

How has Panda evolved over time?

The first instalment of Panda was introduced in February 2011 in an effort to “encourage a healthy ecosystem”. It impacted 11.8% of queries by reducing the visibility of pages with poor, unoriginal and unhelpful content (especially content farms), and by promoting sites with original and engaging material.

In April 2011, Panda was released worldwide with version 2.0. It, too, focused on content, but its total impact was considerably broader, and it began to take into account data on sites that users had blocked.

Panda 3.0 arrived in November 2011 and since then Google has continued to roll out regular updates. Most have been relatively minor adjustments, only affecting a small percentage of search queries. There have been some exceptions, however. The June 2012 (3.7) update, for instance, seemed to affect rankings far more significantly than the 1.0% initially reported by Google. And, in July 2013, Google introduced a “softer” version of Panda: it appeared to reverse some of the previous penalties and added new signals for assessing the authority of niche websites, to prevent useful sites from being downgraded.

Panda 4.0 continues to emphasise the importance of quality content. Prior to Panda and Penguin – Google’s two big-hitting algorithms – it was much easier for websites to get results by tailoring their strategy for search engines rather than users. Black hat SEO tactics such as keyword stuffing, content automation and hidden text were far more commonplace, and search marketers were ridiculed for initiating a “clickbait epidemic”. Times have changed.

Has your website been hit by 4.0? If so, what can you do?

In September last year, after Google integrated Panda into its indexing processes, Matt Cutts offered some advice on how to determine whether your site has been affected by Google’s algorithm and, if so, how to recover. “If you think you might be affected by Panda, the overriding kind of goal is to try to make sure that you’ve got high-quality content”, he said. “The sort of content that people really enjoy, that’s compelling, the sort of thing that they’ll love to read that you might see in a magazine or in a book, and that people would refer back to, or send friends to, those sorts of things.”

Nowadays, the rewards for outwitting Google’s algorithm tend to be short-lived. Google is continually adapting and it’s the job of websites and search marketers to change with it. It has become a tired cliché, but websites really should focus on creating the best experience for users rather than trying to exploit some loophole in Google’s search engine.

photo courtesy of Denise Carbonell (Creative Commons)

“Content marketing” has emerged as an essential pillar of search marketing. In short, Google punishes duplicated content (whether onsite or offsite) and rewards original content. First and foremost, website content needs to be high quality, relevant, valuable and engaging. Rather than cramming content with keywords, it should be made informative, appealing and entertaining. It should be tailored for specific users; it needs to solve their problems, answer their questions and appeal to their tastes.

Secondly, companies should concentrate on becoming a topical authority within their own niche.  Authoritative content on a specific subject – rather than a basic overview – boosts website visibility. For instance, if you produce several articles on a certain subject you are considered more reputable and might become a “topical authority content site”. In other words, brands should determine their target audience and cater for it.

Thirdly, sites should look to improve, update or remove old, low-quality and duplicate content. It’s important to eliminate grammatical and spelling errors, and it’s a good idea to employ more reader-friendly publication methods. Smaller paragraphs, headings, bullet points, images and videos are all effective ways to engage readers with your content – it should be easy to find, navigate, read, engage with and share.

Panda 4.0 helps Google measure website authority. But what actually defines authority and is Google getting better at distinguishing it?

New signals are emerging all the time as search engines evolve and adapt, so the way authority is measured is constantly changing. Since the arrival of Panda in 2011, hundreds of reputable and authoritative sites have been (perhaps mistakenly) downgraded. The newest version of Panda employs up-to-date methodologies, however. It does not seem to reward short-term strategies and quickly gained authority (such as authorship and duplicated content), but long-term good practice. It also recognises that while authorship can be valuable, it can also be gamed, and so it does not always consider it the top factor when determining authority.

Websites that have been consuming lots of duplicated content have experienced huge ranking drops. In his research, online marketing consultant Glenn Gabe concluded that sites which syndicated content tended not to use an optimal technical setup. They had not used “rel=canonical to point to the original content on another domain (via the cross-domain canonical tag)”, he writes. “Some content had links back to the original content on third-party websites, while other pieces of content did not. Also, the content could be freely indexed versus noindexing the content via the meta robots tag.”
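
To picture the setup Gabe describes: a cross-domain canonical tag is a single line in the <head> of the syndicated copy, pointing search engines back to the original article. A minimal sketch (both URLs are hypothetical):

```html
<!-- Placed in the <head> of the syndicated copy on the partner's domain,
     telling Google that the original article on another domain is the
     version that should rank (URLs are illustrative) -->
<link rel="canonical" href="https://www.original-publisher.com/original-article/" />
```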

Gabe’s research highlights the importance of properly managing syndicated and “scraped” content. You shouldn’t use too much and, if you do, it needs to be properly attributed. In some circumstances, it might be better to reduce the number of indexed pages on a website. But, if you dilute your duplicated content and work on original material, you should see positive results.

Google also doesn’t seem to like sites that syndicate their original content to other partners, especially if attribution is not handled properly. And you don’t want someone outranking you because they are using your content! “If you’re going to syndicate your content, make sure the websites consuming your content handle attribution properly”, says Gabe. “In a perfect world, they would use rel=canonical to point back to your content. It’s easy to set up and can help you avoid an attribution problem.” Alternatively, a partner that republishes your content can “noindex” the duplicated pages to keep them out of Google’s index; the partner can still promote that content on its own site, but it won’t be found via search. Win-win!
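
The “noindex” alternative is likewise a one-line tag in the <head> of the republished page. A minimal sketch (“follow” is the common companion directive, letting Google still crawl the links on the page):

```html
<!-- Placed in the <head> of the republished copy: the page stays out of
     Google's index, but the links on it can still be crawled -->
<meta name="robots" content="noindex, follow" />
```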

Panda 4.0 has also continued to downgrade sites with low engagement levels. We all hate poor user interfaces, slow-loading pages, awkward usability, affiliate jumps, risky downstream links and thin content. It is therefore wise to implement adjusted bounce rate (ABR) through Google Analytics, because it takes into account the time users spend on a certain page, helping to measure engagement levels and pinpoint effective bits of content.
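
A common way to implement adjusted bounce rate is to fire a Google Analytics event once a visitor has stayed on the page for a set time; because the event counts as an interaction, the visit is no longer recorded as a bounce. A minimal sketch, assuming the page already loads the standard analytics.js tracking snippet (the 30-second threshold and the event names are illustrative, not prescribed):

```html
<script>
  // After 30 seconds on the page, send an interaction event so Google
  // Analytics stops counting the visit as a bounce. The threshold and
  // the category/action/label names here are illustrative.
  setTimeout(function () {
    ga('send', 'event', 'Engagement', 'Time on page', 'Over 30 seconds');
  }, 30000);
</script>
```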

What about keyword stuffing?

In the past, some websites ranked well for thousands of keywords in a competitive niche without having particularly strong content. Panda has reduced the visibility of sites that have few indexed pages but previously ranked well for lots of keywords. Conversely, sites with stronger content targeting long-tail keywords have experienced increased visibility – more good news for hard-working publishers.

Google wants to provide the best matches for search queries. In other words, if a user searches for something specific, they should get a specific match from a specialist website.

If your website has dropped down Google rankings because third parties are targeting specific queries more successfully, then it’s definitely time to reassess and strengthen your content strategy. The most successful websites target more than just head-terms and provide users with content they want (not what they think Google wants). Furthermore, you shouldn’t rely on your homepage to rank for all your target keywords; high quality, well indexed content does that much more effectively.

What should you do if you’ve been hit by Panda? 

Traffic typically fluctuates following an algorithmic change; however, if you believe that you’ve been negatively impacted, it’s time to act – quickly. If you’ve been positively impacted, don’t just rest on your laurels – build on the good work!

Carrying out a “Panda audit” is the best place to start. You need to thoroughly analyse your site to find and rectify problems: detect low-quality content, technical issues and usability problems, and resolve them. Panda is updated roughly monthly, so there are plenty of opportunities to recover without too much damage. Keep producing high-quality content, engaging with customers through social media and driving referral traffic.

How about if you’ve seen an increase in traffic? 

A strong SEO strategy underpins long-term results, but the most important thing is to understand how and why your site is doing well. Google Analytics is a great tool for pinpointing the content that works best. It also allows you to review your site’s keywords, helping you to identify how Google is driving traffic to certain pages.

Moreover, content should be frequently crawled and analysed. If you’re a website owner, make sure you patch up poor quality content and technical problems on a regular basis. Stay vigilant, cater for your target market and keep producing more of the same.

Final Thoughts

The story of Panda reflects the recent story of search engine marketing. The online world continues to evolve at an incredible rate, particularly with the emergence of mobile, semantic and more personalised forms of search over the last few years. Google has always championed both the best SEO practices and the best interests of users – and Panda 4.0 is its latest, and fanciest, tool to ensure it continues to do so. We can expect that, in future, even press release sites will have to offer the higher-quality content that consumers are looking for. Great news!

As the online marketplace becomes even more competitive it’ll become even tougher to rank highly in SERPs. A website’s SEO needs to be tailored for the multi-channel format of today, including social media, PR, PPC and mobile. However, more than anything else, Panda 4.0 stresses the importance of quality content. If you, as a website publisher, can identify your own niche and regularly produce tip-top content that is useful and relevant for your target market then you can’t go far wrong. After all, Google is, first and foremost, trying to promote organic, high quality and helpful content for users – and that can only be a good thing.
