6 changes Google hasn’t made (yet) – lessons to learn

July 16, 2014 - Digital State Marketing

The last twelve months have been an exceptionally busy time for Google. The search engine giant has invested heavily in both start-ups and established companies such as Nest Labs, DeepMind Technologies, Dropcam and Skybox Imaging. In May it released the latest version of the Panda algorithm in an effort to more accurately target low-quality, spammy results in the SERPs. Last September’s Hummingbird algorithm marked the latest step towards fully semantic online search. And the rise of “(not provided)” has made browsing more secure, but has also made it more difficult for websites to acquire keyword data.

Google continues to take significant steps towards conquering online spam, shifting the playing field for search marketers and website owners trying to promote their site. Those that offer the best user experience should rank well in the long term, but the most successful players have always been good at anticipating and responding to Google’s modifications and changes.

In this article, we’re looking at some SEO predictions which Google (perhaps surprisingly) has not implemented, focusing in particular on six “changes” discussed by SEO guru Rand Fishkin in a Moz Whiteboard Friday earlier this year. We’ll try to explain why they have not happened and what their absence tells us about the present and future of search marketing.

2013 was the year of the “user”

Google’s user-centric view has not wavered. Practically every innovation in the world of online search over the last eighteen months has been aimed at creating a better experience for browsers. Google has prioritised social media, personalised SERP results and introduced more semantic methods of search, for instance. As search marketers it is our job to ensure that online content is valuable and, with the recent shift towards mobile search, that it is accessible and fully optimised. And understanding why Google makes some changes and not others helps us to stay ahead of the game.

In April, Rand set out six areas where SEOs expected Google to change its approach, but where it has not. So, what are they?

1)      Links from “on-topic” sites matter more than “off-topic”

“A lot of people in the SEO field, and even outside the field, think that it must be the case that if links really matter for SEO, then on-topic links matter more than off-topic links,” says Rand. It is reasonable to think that, on a specialist subject such as “Who should bat at number three for England?”, a link from ESPNcricinfo should carry more weight than a link from a betting website. However, there does not appear to be any clear data to support that assumption. “Anyone who has analysed this problem in depth, which a number of SEOs have over the years – a lot of very advanced people have gone through the process of classifying links and all this kind of stuff – seem to come to the same conclusion, which is that Google seems to really think about links in a more subject- and context-agnostic perspective,” continues Rand.

If Google were to write off all off-topic links, lots of sites would seek to have the same industry authorities link to them. That, of course, would be hazardous because it would make it very difficult for Google to analyse websites from their links. “If they bias to these sorts of things, they would get a very insular view on what’s popular and important on the Web,” he argues. “And if they have this more broad view they can actually get better results.”

2)      Anchor text’s influence would eventually plummet

“Anchor text” is the visible, clickable text in a hyperlink. In most modern browsers, anchor text is blue and underlined. On our website it is just blue, but the cursor turns into a little pointing hand if you hover over it. The anchor text “our website” points to our homepage (/). Simple.

Search engines use anchor text to help determine the subject matter of the linked page. Sites can rank better if relevant sites link to them with descriptive anchor text pointing at a page that has been specifically optimised for that purpose. In fact, if lots of sites decide a particular page is relevant for a set of keywords, that page can rank well for them even if the terms do not appear in its own text. The system is open to abuse, which partly explains the origin of the Penguin algorithm and the practice of manual penalties for poor-quality links.
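
To make the idea concrete, here is a minimal Python sketch (standard library only, with made-up URLs and link text) of how a crawler might collect the anchor text attached to each link it finds. It illustrates the mechanism only; it is not a description of Google’s actual pipeline.

```python
# Minimal sketch: collect (href, anchor text) pairs from an HTML snippet.
from html.parser import HTMLParser

class AnchorTextCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self._current_href = None
        self._current_text = []
        self.anchors = []  # list of (href, anchor text) pairs

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._current_href = dict(attrs).get("href")
            self._current_text = []

    def handle_data(self, data):
        if self._current_href is not None:
            self._current_text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current_href is not None:
            self.anchors.append((self._current_href, "".join(self._current_text).strip()))
            self._current_href = None

collector = AnchorTextCollector()
collector.feed('<p>Read <a href="/guide">our keyword research guide</a> '
               'or <a href="/guide">click here</a>.</p>')
print(collector.anchors)
# [('/guide', 'our keyword research guide'), ('/guide', 'click here')]
# The first link's descriptive anchor text tells a search engine far more
# about the target page than the generic "click here".
```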

The importance of anchor text is decreasing, but not at the rate most SEOs anticipated. Although Google is incorporating new signals all the time, descriptive, keyword-rich anchor text still appears to be considerably more influential than generic anchor text.

Rand Fishkin believes that Google has not downgraded the importance of anchor text in links because, simply put, the results aren’t as good without it. “This speaks to the power of being able to generate good anchor text,” he says. “A lot of that, especially when you’re doing content marketing kinds of things for SEO, depends on nomenclature, naming and branding practices. It’s really about what you call things and what you can get the community and your world to call things. Hummingbird has made advancements in how Google does a lot of its text recognition, but for these tough phrases, anchor text is still strong.”

3)      302s and other redirects would be treated more like 301s

A “302” is an HTTP response status code that performs a temporary redirection. The most common status codes are “200”, returned when a page or resource is found, and “404”, returned when a requested resource cannot be found on the server.

A “301” is a permanent redirection code and the most common type of redirect. It is used when a page has moved permanently and tells search engines and browsers requesting that page to update the URL in their records.
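
To see the difference in practice, here is a toy Python server (standard library only, with made-up paths and port) that answers one path with a 301 and another with a 302. It is only a sketch of how the two status codes are signalled, not a recommendation for production redirect handling.

```python
# Toy illustration of signalling a permanent (301) vs temporary (302) move.
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old-page":
            # Permanent move: crawlers should update their records to the new URL.
            self.send_response(301)
            self.send_header("Location", "/new-page")
            self.end_headers()
        elif self.path == "/holiday-offer":
            # Temporary move: the original URL is expected to come back.
            self.send_response(302)
            self.send_header("Location", "/current-offer")
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"Hello")

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), RedirectHandler).serve_forever()
```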

“302s have been one of these long-standing messes of the Web, where a 302 was originally intended as a temporary redirect, but many, many websites and types of servers default to 302s for all kinds of pages that are moving”, says Rand.

Using 302 server redirects carelessly is a risky practice and, in short, search engines do not like these sorts of redirection techniques because they are often used by spammers to help their domains rank better in SERPs. 301 redirects are much better practice and help URLs retain their link popularity. If you set up a 302 or some other temporary redirect, Google assumes the original page is coming back, so the new page does not inherit the old page’s popularity and reputation.

Basically, there are too many 302 redirects on the web because they tend to be misused. They should not, for example, be used when changing the domain name of a site: Google can regard that as spammy practice and may penalise your new domain. And if you want several domains to point to the same place (such as common misspellings of your domain), you should use a 301 permanent redirect.
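
If you are unsure which status code a redirect is actually returning, a few lines of Python will show you. This sketch uses http.client so the redirect is not followed automatically; the hostname and path are placeholders.

```python
# Check which status code and Location header a URL actually returns.
import http.client

conn = http.client.HTTPSConnection("www.example.com")  # placeholder host
conn.request("GET", "/old-page")                        # placeholder path
response = conn.getresponse()

print(response.status)                 # e.g. 301 or 302
print(response.getheader("Location"))  # where the redirect points (None if no redirect)
conn.close()
```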

4)      Rel=canonical would become more of a “hint” and less of a directive

A rel=“canonical” is a tag/code snippet that can be inserted into the <head> of an HTML page to tell search engines how that page relates to others on the site. It helps search engines index content properly and award credit to the right URL by identifying duplicate and original pieces of content. “When rel=canonical was first launched, Google said that rel=canonical is a hint, but we won’t necessarily take it as gospel,” Rand says. “Yet, every test we saw, even from those early launch days, was that they are taking it as gospel. You throw a rel=canonical on a trusted site accidentally on every page and point it back to the homepage and Google suddenly doesn’t index anything but the homepage. It’s crazy!”

Google appears to regard rel=“canonical” as a straightforward directive. So it’s important to be careful when implementing it, because misuse can result in lots of pages being inadvertently de-indexed. The best advice is to use 301 redirects whenever you can.
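
A quick audit can catch the accidental sitewide canonical Rand describes. The following Python sketch (standard library only, with a placeholder URL) fetches a page and reports the canonical URL declared in its <head>, so you can flag pages whose canonical unexpectedly points at the homepage.

```python
# Fetch a page and report the rel="canonical" URL declared in its markup.
from html.parser import HTMLParser
from urllib.request import urlopen

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

page_url = "https://www.example.com/some-page"  # placeholder URL
html = urlopen(page_url).read().decode("utf-8", errors="replace")

finder = CanonicalFinder()
finder.feed(html)
print(finder.canonical)  # None if no canonical tag; investigate pages whose
                         # canonical unexpectedly points at the homepage
```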

5)      Shares from trusted/important/influential social accounts would have a more direct and observable impact on rankings

The best content on the web is shared on social media. People use Facebook and Twitter to find out about their favourite brands and products. “The common man’s link graph has become the social web and the social graph,” argues Rand. People “share”, “like” and “retweet” the highest-quality and most important content on the web – it’s relevant to them, so why shouldn’t it influence their search results?

Google gives some weight to Google+ (especially in personalised results), but despite a two-year period when Twitter fed very directly into search results, on the whole Google seems to have reduced the importance of social media signals. Rand’s explanation is that “the good social sharing, the stuff that sticks around, the stuff that people really feel is important is still, later on at some point, earning a citation, earning a link, a mention, something that they can truly interpret and use in their ranking algorithm.”

For Google, social media sharing acts as a kind of “tip-off” that a brand or piece of content is strong. Content that takes off socially will inevitably gain some popularity, which is then reflected in the link graph. Social signals are also more open to manipulation and, as Rand notes, Google does not have API-level access and partnerships with Facebook and Twitter, which explains its reluctance to incorporate their signals more fully into SERPs.

6)      Google would take more clean-up action on hyper-spammy keyword niches in PPC

That’s porn, pills and casino; not pay-per-click! In recent years Google has invested heavily in eradicating spam. It has introduced the Penguin, Panda and Payday Loan algorithms in an effort to target spammy search results, “black hat” SEO tactics such as keyword stuffing and unnatural link profiles, and ad-heavy pages. Spam is important enough that Google has its own dedicated webspam team.

However, Google has not taken much action on hyper-spammy gambling and pornography niches. “You can see a lot of the changes that Google’s making around spam and authority and signal interpretation,” Rand concludes. “One of the most interesting ones is that a lot of those hacked .edu pages or barnacle SEO that was happening on sub-domains of more trusted sites that had gotten a bunch of links is ending a little bit.” Google is taking steps in the right direction, though. For example, it is getting better at distinguishing the quality of links and the authoritativeness of pages that might be “clinging” to a domain but are poorly linked to internally on more trusted sites.

Our best guess is that a huge PPC clean-up operation is probably in the works.

It’s not that Google lacks the technology or tools to implement these changes. In most cases it probably doesn’t want to, because doing so would adversely affect SERP quality and, therefore, our experience as users.

The majority of changes Google makes are intended either to improve its search results for users, or to target spammers and those attempting to exploit the algorithm. Google has incorporated localised results, search history and signals from Google+ to make our results more personalised, while anti-spam initiatives such as Penguin, Panda and Payday Loan have sought to reward good SEO practice and punish “cheating” websites. SEOs need the strategic know-how to anticipate the direction Google will take next and, ultimately, get the right information onto the screens of the relevant users.

Google champions “white hat” SEO, and the most important thing websites can do is pay close attention to the priority areas that are evolving rapidly, particularly mobile and semantic search. It’s important to have some idea of why Google makes, and doesn’t make, certain changes, but if websites concentrate on providing the best experience within their niche, they should be able to depend on Google rewarding them. Remember: work hard, stay on top of the SEO game and greater organic search visibility is just around the corner.
