
Google: Guardian of the World Wide Web?

When Google was first launched as a search engine in 1998, it fulfilled a great purpose: it helped make the information stored on the world wide web accessible and easy to find. Google’s link-based ranking algorithm produced far more relevant, higher-quality search results than other search engines could manage at the time. This gave Google an advantage that it has capitalised on ever since.


But Google was not content to just passively crawl and index all the content it could find. The moment Google realised that webmasters cared about top rankings in Google – cared a lot, in fact – it also realised that webmasters would do their very best to adhere to whatever guidelines Google set out as ‘best practices’ for ranking highly in its search results.


Over the years Google has made many recommendations to webmasters about how websites should be structured, how content should be marked up and optimised, how sitemaps should be used, and so on. And, as Google’s market share in search grew and the search engine began to dominate the web as the default start page for almost every conceivable query, webmasters have put more and more effort into adhering to Google’s every guideline.


Here are some of the SEO guidelines Google has proclaimed that have had a profound impact on the way websites are built and structured, and on how content is written and shared:


* No content embedded in Flash, JavaScript, images, etc.: One of the oldest edicts of Google rankings has been not to ‘hide’ content inside code that Google can’t read. Flash-based websites, JavaScript-powered navigation, and text embedded in images are big no-nos on the web nowadays, simply because Google can’t read and index that content. (A sketch of crawlable versus hidden markup follows this list.)


* Websites need to load fast: For years Google has been saying that websites need short load times, and that slow-loading websites are likely to see a negative impact on their search rankings. So webmasters devote a lot of effort to making their websites fast to load. (Two common optimisations are sketched after this list.)


* Structured data: In recent years Google has been encouraging websites to implement structured data, especially schema.org, to mark up their information. The reward for implementing structured data is so-called ‘rich snippets’ in search results, which can provide a competitive advantage. (A JSON-LD example follows this list.)


* Responsive websites: For a long time a separate mobile site was deemed the best way to make a website work on smartphones, followed closely by separate mobile apps. But Google decided it preferred so-called responsive websites: sites that adapt their layout and structure to the size of the screen they’re being shown on. As a result, responsive design has become the de facto standard for building websites. (A minimal example follows this list.)


* Human-readable URLs: One of the eternal areas of friction between developers and optimisers is a website’s URL structure. Parameter-driven URLs are often easier to implement, but Google prefers human-readable URLs (and preferably one URL per piece of unique content), which leads to all kinds of extra hoops for developers to jump through when building complicated websites. (One common workaround is sketched after this list.)


* Nofollowed links: Since Google first introduced support for the rel=nofollow attribute back in 2005, the recommendation on when to use it has broadened significantly in scope. Nowadays webmasters are encouraged to nofollow every link they don’t personally vouch for, and can see their sites penalised if they don’t. (The markup is shown after this list.)


* SSL on all sites: The latest guideline – as yet unconfirmed – is that Google wants to see all websites use SSL encryption, and that sites without SSL will be ranked lower in Google’s search results. If this becomes official policy, no doubt SSL certificate providers will be laughing all the way to the bank.
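
What follows are a few rough HTML sketches of these guidelines in practice; every file name, URL, and value in them is invented for illustration. First, the difference between content a crawler can read and content it can’t: a navigation menu built from plain HTML links versus the same words locked inside an image.

```html
<!-- Crawlable: plain HTML text and links that any crawler can read -->
<nav>
  <a href="/products/">Products</a>
  <a href="/about/">About us</a>
</nav>

<!-- Hidden: these words exist only as pixels in an image;
     at best, the alt attribute hints at what they say -->
<img src="/img/menu-products.png" alt="Products">
```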
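
For load times, a minimal sketch of two widespread front-end optimisations: a single minified stylesheet, and a script that doesn’t block the page from rendering.

```html
<head>
  <!-- One minified stylesheet instead of many separate requests -->
  <link rel="stylesheet" href="/css/site.min.css">
</head>
<body>
  <!-- ...page content... -->

  <!-- Script at the end of the body, deferred so it
       doesn't block rendering of the content above -->
  <script src="/js/site.min.js" defer></script>
</body>
```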
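
For structured data, this is roughly what schema.org markup looks like when embedded in a page as JSON-LD; the author name and date are purely illustrative.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Google: Guardian of the World Wide Web?",
  "author": { "@type": "Person", "name": "A. Blogger" },
  "datePublished": "2014-08-01"
}
</script>
```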
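
For responsive design, a toy two-column layout that stacks into a single column on narrow screens; the 600px breakpoint and class names are arbitrary.

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .column { width: 50%; float: left; }

  /* On screens narrower than 600px, stack the columns full-width */
  @media (max-width: 600px) {
    .column { width: 100%; float: none; }
  }
</style>
<div class="column">Main content</div>
<div class="column">Sidebar</div>
```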
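
For URLs, one common workaround when a single piece of content is reachable at several addresses is the canonical link element, which tells crawlers which URL should count.

```html
<!-- The same product page might be served at
     /products.php?id=1337&cat=42 and at /shoes/trail-runner/ -->
<link rel="canonical" href="https://www.example.com/shoes/trail-runner/">
```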
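
And the nofollow guideline comes down to a single attribute on the link.

```html
<!-- An editorially vouched-for link: counts as an endorsement -->
<a href="https://example.org/good-article/">A piece worth reading</a>

<!-- A link the site doesn't vouch for, e.g. user-submitted:
     rel="nofollow" tells Google not to treat it as an endorsement -->
<a href="https://example.net/submitted/" rel="nofollow">User-submitted link</a>
```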

Nearly all of the guidelines listed above have had – or will have – a profound impact on standard practices in web design & development. And it would be fair to say that nearly all of these guidelines result in a better user experience for people on the web.


But Google’s motives are not exactly altruistic. It is of course a company devoted to profit maximisation, and these guidelines almost always have a benefit for Google itself:

  1. Full indexing of all content: by ensuring websites do not ‘hide’ content in Flash or scripting languages, and that content is marked up with structured data, Google can crawl and index more of the web and make sense of the content more easily.

  2. Faster crawling & indexing: fast-loading websites containing structured data, single human-readable URLs for each piece of content, and no separate mobile sites all ensure that Google’s crawling & indexing systems can operate more efficiently, crawling the web faster while using fewer resources.

  3. Clean link graph: by encouraging webmasters to use the nofollow attribute wherever there is doubt about a link’s editorial value, Google can outsource much of the filtering of the link graph to webmasters. The result is less spam in the link graph for Google to sort out itself.

The main issue with all of the above is how Google’s guidelines are becoming the standard way of doing things on the web. By virtue of its immensely dominant power in online search, few companies can afford to ignore Google and do things their own way.


And even when a company decides to focus its attention on building a powerful offline brand, thus reducing its reliance on Google, the search engine still finds ways to capitalise on the brand’s popularity, as evidenced by these examples of brand-name searches on Google:


[Image: Google insurance brand SERPs]


The inevitable end result is that Google’s ever-changing guidelines, no matter what their basis – be it improved usability or Google’s own profit-maximisation – will become web standards, and websites that fail to adhere will be ‘punished’ with lower rankings in Google’s search results. That in turn leads to lower traffic figures, sales, profits, etc. It’s a downward spiral with only one winner: Google itself.


Obviously this is not how the web was envisioned. The world wide web as invented by Tim Berners-Lee was intended as a platform of liberation, of free-flowing information and collaboration, imbued with an ‘anything goes’ mentality that has enabled tremendous innovation.

Increasingly, the web is becoming enslaved to the whims of a few big corporations, with Google leading the pack. Governments are not alone in threatening the very foundation of the web (though they certainly do, albeit in a very different way). The world wide web is being forced to slavishly adhere to the mood swings of a handful of mega-corporations that serve as the portals to the vast wealth of content published on the web.


The question is: are we content to let profit-seeking corporations decide for us how the web should be, or can we reclaim the web’s free spirit and anarchic roots and shape our own destinies online?
