Over the past few months, Digital State Marketing has been creating articles on the theme of site usability. Last month the focus was on mobile optimisation and it raised an interesting point about the potential of a keyword-less future. So this month, we thought it would be interesting to take a short diversion from the main theme to look at semantic search and how this may affect the choice of keywords that your search engine optimisation company may recommend in the future.
A very basic summary of the concept of the semantic web is that the coding of websites should be more closely aligned. Historically, the web has grown through individuals creating content in whichever language they have learned, adding a healthy dose of their own interpretation along the way. The aim of the World Wide Web Consortium (W3C) is to create an internationally accepted common data format (for a more in-depth summary, check out the Wikipedia page).
So the aim is to enable computers to use the web to answer questions in the same manner that humans do. For example, when one person asks another, “where’s the best chippy in town?”, there is a great deal of presumed (common) knowledge on both sides, such as knowing that “chippy” is a colloquial term for a fish and chip shop. There are also geographical considerations, along with subjective opinion, involved in selecting the best option and providing directions.
A unified semantic web would mean that information is presented in such a way that a computer can recognise elements of this sort and provide a more nuanced answer. In response to the question above, code allowing the computer to identify the location and type of each shop, coupled with a review site’s scoring system (such as Yelp’s), could lead the computer to deliver a result for the chippy with the best reviews in town.
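To make the idea concrete, here is a rough sketch of how that sort of information might be expressed in machine-readable form, using the Schema.org vocabulary discussed below. The shop name, address, and rating are all invented for illustration:

```html
<!-- Hypothetical example: a fish and chip shop described with Schema.org's
     Restaurant type in JSON-LD. All details (name, address, rating figures)
     are invented for illustration. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Restaurant",
  "name": "The Codfather",
  "servesCuisine": "Fish and chips",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 High Street",
    "addressLocality": "Newcastle upon Tyne"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "212"
  }
}
</script>
```

A computer reading markup like this can identify the type of shop, pin down its location and compare its review score against other shops in the area — in other words, carry out the steps a person performs from common knowledge.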
However, this potential version of the semantic web is a long way down the line, as the sheer bulk of the web (and even the very nature of it) means that a unified system of fair play is unlikely to be fully accepted.
To bring this back to search engine optimisation: the major search engines have grouped together to set up a unified mark-up system for HTML, called Schema.org, with a view to allowing sites to implement structured data. The lure for site owners is that their pages become more accessible in the SERPs (admittedly a long step away from a unified coding practice, but a step in the right direction).
For example, an event listing can be marked up with Schema.org’s Event schema, which identifies details such as the event’s name, date and location for search engines.
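A minimal sketch of what Event markup can look like is shown below. This uses the JSON-LD format that Schema.org supports; the event name, date and venue are invented for illustration:

```html
<!-- Hypothetical example: an event marked up with Schema.org's Event type
     in JSON-LD. The event name, date and venue are invented for illustration. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Live Music Night",
  "startDate": "2015-07-18T19:30",
  "location": {
    "@type": "Place",
    "name": "The Town Hall",
    "address": "High Street, Newcastle upon Tyne"
  }
}
</script>
```

Schema.org can also be implemented as microdata attributes (itemscope, itemtype, itemprop) placed directly on the HTML elements that display the content; the vocabulary is the same in either case.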
So how might this affect the future of keyword choice for a search engine optimisation company?
Well, with site owners able to state explicitly what their content is about, it’s plausible that a more unified database would begin to build an extremely sophisticated picture of how all the content inter-relates.
Take, for example, the Google SERP for “Dolly Parton”. You will note that while there is data pulled from Wikipedia (a trusted online database that Google draws from), it also offers “people also search for” options. The database has categorised Dolly by her genre and period of activity in order to offer the user a range of artists with a similar profile. While this use is limited at the moment, the logical conclusion is that the database will grow to the point where everyday phrases also carry associated connotations, and synonyms will become key. The meaning of content will be judged as a whole piece rather than, as currently, as a collection of separate words.
As I say, this potential future is some way off, but it’s clear that search engines are seeking to achieve a deeper understanding – in part to provide more meaningful answers to searches. As members of the digital future, we need to ensure we are accurately representing ourselves for the best marketing purposes.
If you are looking for more information on implementing schema markup on your website, get in touch with the team at Digital State Marketing.