Google: Guardian of the World Wide Web?

When Google was first launched as a search engine in 1998, it fulfilled a great purpose: it helped make the information stored on the world wide web accessible and easy to find. Google’s link-based ranking algorithm resulted in much more relevant and high quality search results than other search engines could produce at the time. This gave Google an advantage that it has capitalised on ever since.

But Google was not content just to passively crawl and index all the content it could find. The moment Google realised that webmasters cared about top rankings – cared a lot, in fact – it also realised that webmasters would do their very best to adhere to whatever guidelines Google set out as ‘best practices’ for ranking highly in its search results.

Over the years Google has made many recommendations to webmasters about how websites should be structured, how content should be marked up and optimised, how sitemaps should be used, and so on. And, as Google’s market share in search grew and the search engine began to dominate the web as the default start page for almost every conceivable query, webmasters have put more and more effort into adhering to Google’s every guideline.

Here are some of the SEO guidelines Google has proclaimed that have had a profound impact on the way websites are built and structured, and how content is written and shared:

* No content embedded in Flash, JavaScript, images, etc
One of the oldest edicts of Google rankings has been not to ‘hide’ content inside code that Google can’t read. Things like Flash-based websites, JavaScript-powered navigation, and text embedded in images are big no-nos on the web nowadays, simply because Google can’t read and index that content.

* Websites need to load fast
For years Google has been saying that websites need to have a short load time, and that slow loading websites are likely to see some sort of negative impact on their search rankings. So webmasters devote a lot of effort to making their websites fast to load.

* Structured data
In recent years Google has been encouraging websites to implement structured data, especially schema.org, to mark up information. The reward for implementing structured data is so-called ‘rich snippets‘ in search results that could provide a competitive advantage.
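
To illustrate, schema.org mark-up in JSON-LD form might look something like this – a minimal sketch with placeholder values only, not a complete or canonical implementation:

```html
<!-- schema.org mark-up in JSON-LD form; all values below are placeholders -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Article",
  "headline": "Example article headline",
  "author": {
    "@type": "Person",
    "name": "Example Author"
  },
  "datePublished": "2014-01-01"
}
</script>
```

Mark-up like this is what makes rich snippets possible: Google can read the structured properties directly instead of having to infer them from the page text.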

* Responsive websites
For a long time a separate mobile site was deemed the best solution for enabling a website to work on smartphones, followed closely by separate mobile apps. But Google decided it preferred to see so-called responsive websites: sites that adapt their layout and structure to the size of the screen they’re being shown on. As a result responsive design has become the de facto standard for building websites.
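
In practice, a responsive site usually comes down to a viewport declaration plus CSS media queries – a minimal sketch (the class names and breakpoint are hypothetical):

```html
<!-- One HTML document for all devices; the layout adapts via CSS media queries -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .content { width: 70%; float: left; }
  .sidebar { width: 28%; float: right; }
  /* On narrow screens, stack the columns instead of floating them */
  @media (max-width: 600px) {
    .content, .sidebar { width: 100%; float: none; }
  }
</style>
```

The same URL and the same HTML are served to every device, which is precisely what makes this approach cheaper for Google to crawl than a separate mobile site.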

* Human-readable URLs
One of the eternal areas of friction between developers and optimisers is the URL structure of a website. Parameter-driven URLs (e.g. example.com/product.php?id=123) are often easier to implement, but Google prefers human-readable URLs such as example.com/products/blue-widget – and preferably one URL per piece of unique content – which leads to all kinds of extra hoops for developers to jump through when building complicated websites.

* Nofollowed links
Since Google first introduced support for the rel=nofollow attribute back in 2005, the recommendation of when to use it has broadened significantly in scope. Nowadays webmasters are encouraged to nofollow every link they don’t personally vouch for, and can see their sites penalised if they don’t.
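
In mark-up terms the difference is a single rel attribute on the anchor; the URLs below are placeholders:

```html
<!-- A normal link, counted as an editorial endorsement by search engines -->
<a href="http://example.com/good-resource">A site I vouch for</a>

<!-- A nofollowed link: search engines are asked not to treat it as an endorsement -->
<a href="http://example.com/sponsor" rel="nofollow">A paid or unvetted link</a>
```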

* SSL on all sites
The latest guideline – unconfirmed as of yet – is that Google wants to see all websites use SSL encryption, and that sites without SSL will be ranked lower in Google’s search results. If this becomes official policy, no doubt SSL certificate providers will be laughing all the way to the bank.

Nearly all of the guidelines listed above have had – or will have – a profound impact on standard practices in web design & development. And it would be fair to say that nearly all of these guidelines result in a better user experience for people on the web.

But Google’s motives are not exactly altruistic. It is of course a company devoted to profit maximisation, and these guidelines almost always have a benefit for Google itself:

  • Full indexing of all content: by ensuring websites do not ‘hide’ content in Flash or script languages, and that the content is marked up with structured data, Google can crawl and index more of the web and make sense of the content more easily.
  • Faster crawling & indexing: fast-loading websites containing structured data, single human-readable URLs for every piece of content, and no separate mobile sites all ensure that Google’s crawling & indexing systems can perform more efficiently and crawl the web faster using fewer resources.
  • Clean link graph: by encouraging webmasters to use the nofollow tag where there is doubt about the editorial value of a link, Google can outsource much of the filtering of the link graph to webmasters. The result is less spam in the link graph for Google to sort out.

The main issue with all of the above is how Google’s guidelines are becoming the standard way of doing things on the web. By virtue of its immensely dominant power in online search, few companies can afford to ignore Google and do things their own way.

And even when a company decides to focus its attention on building a powerful offline brand, thus reducing their reliance on Google, the search engine still finds ways to capitalise on the brand’s popularity, as evidenced by these examples of brand name searches on Google:

[Image: Google insurance brand SERPs]

The inevitable end result is that Google’s ever-changing guidelines, no matter what their basis – be it improved usability or Google’s own profit-maximisation – will become web standards, and websites that fail to adhere will be ‘punished’ with lower rankings in Google’s search results. That in turn leads to lower traffic figures, sales, profits, etc. It’s a downward spiral with only one winner: Google itself.

Obviously this is not how the web was envisioned. The world wide web as invented by Tim Berners-Lee was intended as a platform of liberation, of free flowing information and collaboration imbued with an ‘anything goes’ mentality that has enabled tremendous innovation.

Increasingly, the web is becoming enslaved to the whims of a few big corporations, with Google leading the pack. Governments are not alone in threatening the very foundation of the web (though they certainly do so, albeit in a very different way). The world wide web is being forced to slavishly adhere to the mood swings of a handful of mega corporations that serve as the portals to the vast wealth of content published on the web.

The question is, are we content to let profit-seeking corporations decide for us how the web should be, or can we reclaim the web’s free spirit and anarchic roots to allow us to make our own destinies online?

Comments

  1. By on

    Hi Barry,

    Great post all in all, and some important points. Now, being the pain I am, I feel the need to come back with some points and questions… you know me well enough to know I always play devil’s advocate ;)

    You mention Google not wanting content embedded in things like Flash, JavaScript and images – now whilst Flash is the devil’s handiwork and should only be used for videos and games, I agree no real content should exist in it. JavaScript and images however CAN be indexed pretty well these days, unless you are doing something funky. A great example is the use of JavaScript for giant menus (which I hate, but they do exist) and for creating sliders etc – this content can all be indexed fairly easily; it’s only when you start to do things like use JavaScript for search that it falls flat on its face. Images, and the text within them, can be indexed in most cases where the image and text are clear enough – I’m pretty sure there was a labs feature (it may still be there) last year where you could search for images containing certain text. It’s just like being able to search by accent colours and various other filters; images with text can be indexed.

    The above are more frowned upon because of accessibility issues, and Google’s mantra is to always give the user a great user experience – having huge amounts of text on images or giant Flash banners isn’t remotely user friendly these days.

    Fast loading websites – I need to dig out the paper I once wrote on the fact that Moore’s Law on the web has limits. So far websites, in conjunction with increased bandwidth, mean that Moore’s Law pretty much holds up – in general. But over the next few years this will go backwards as people fill their websites with huge images that aren’t just massive in dimension but in file size (I see sites fairly often loading images that are 5MB, and about 300px wide!). Just because we have fast internet and can upload huge amounts doesn’t mean that we should, or that every other user in the world has the same access to a speedy web. It again comes back to user experience: Google in this respect doesn’t care about those of us on a 100Mb fibre connection, it cares about the person on 3G in the UK or the folks in Africa with satellite internet that is as slow as dial-up ever was. It’s the bigger picture, and looking after speed helps everyone. In addition, it also means there is less data transiting networks and more free bandwidth, which can speed up everyone’s internet (in theory).

    Google said – and I believe Amazon backed it up – at the Catalyst Conference last week that for every extra second of loading time you are likely to lose 7% of conversions on your website. Now if that isn’t a reason to speed up your site, I don’t know what is.

    SSL – honestly, I would love all websites with a “transaction” – even just leaving a comment like this – to be using SSL certificates. That tiny bit of extra security doesn’t slow a website or server down, whatever the myths would have you believe, and it can really help with security, no doubt help customers feel secure, and make a man-in-the-middle attack less likely (and given we all surf more and more on public wifi hotspots, that is a huge danger today).

    I don’t want to see it as a ranking factor, but next to the rich snippets the word “secure” wouldn’t go amiss.

    just my thoughts :)

    hopefully see you at Brighton?

  2. By on

    @Andy: yes, like I say in the post, a lot of these guidelines do result in a better user experience. But that’s not why Google pushes out these recommendations – there are lots of things websites should do to provide better experiences, but Google remains silent on most of them because it doesn’t benefit from those itself. Only when user experience and Google’s benefit align does Google make a recommendation.

    The worrying trend is that, with webmasters so eager to comply with Google’s whims, at some stage the line between what’s best for the web and what’s best only for Google gets very blurred. One might argue the line is already blurred beyond recognition – the rel=nofollow tag, for example… and that’s not a good thing.

    And yup, I’m definitely at BrightonSEO – see you there!

  3. By on

    Hi Again,

    I agree that some of this is for their own benefit, but some of it isn’t. Faster loading means, yes, they probably have less to store across their network, can serve faster results and probably get more ad clicks – but we all love fast results pages. At the end of the day Google is a private company making money for its shareholders, as any good business should – if you want free and whimsical but not as good, give DDG a go…

    I for one don’t jump up and down on every update that’s announced or every little trend; the bigger picture should always be the winner – the key to SEO and digital marketing (as you know) is that no one small thing fixes everything. Maybe we as digital marketing folk should be more concerned about education around that, and less about Google blurring the lines – everyone knows that if you are not paying for a service or product, you are the product (aka you will be sold ads).

    - see you next week!

  4. By on

    @Andy, re SSL, you need to remember that SSL does prove to be a serious issue for people on satellite connections and even dial-up. The extra handshakes at every move have a significant impact on their ability to do anything on the web. Not everyone has a broadband connection capable of streaming 3 movies on Netflix, playing COD, and uploading videos of their kids to Facebook at the same time. Some of us are happy just to be able to see a page load at all, let alone load fast.

    SSL has its place, but it’s like locking the door to the house every time you step outside to check the grill ‘just in case’.

    Then you have JavaScript. I recall a time when Google said: put your ad links in JavaScript, we can’t read those anyway, if you would rather not nofollow them (or can’t nofollow them). Then Google learned how to read JavaScript.

    We crossed a line not too long ago, and every single one of my clients has developed a rather strong dislike of Google and is actively seeking alternatives for everything they do these days. I wonder how much longer it will take before the rest of the internet follows suit?

  5. By on

    SSL as an extra isn’t that much – one extra handshake isn’t a big deal. What is more of a big deal on connections such as the one you mention is huge images and extra DNS checks, which can take a while. Site speed is always important. My want for SSL is born of a desire for security, and as I say, long term I’d want everything to use it; short term I’d be happy if all big services, forums and ecommerce sites did.

  6. By Aaron Bradley on

    “The question is, are we content to let profit-seeking corporations decide for us how the web should be, or can we reclaim the web’s free spirit and anarchic roots to allow us to make our own destinies online?”

    That is indeed the right question, Barry. Unfortunately whether or not we’re “content” with this situation is essentially irrelevant; what matters is whether we’re able to do anything about it if we’re not “content.” And I fear the answer is that we’re able to do very little, because we live in a world where we let “profit-seeking corporations decide” a vast number of things in our lives, and it’s extremely difficult to carve out an exception (as in, “well, we’re gonna let corporations successfully lobby governments to cut corporate tax rates, water down environmental protections and decimate social safety nets – but we’re gonna reclaim the Internet for the people, goddammit!”).

    So freeing ourselves of the enslavement (and I applaud your choice of words here) wrought by mega corporations like Google requires broad-based political action to help shape the future of the Internet (by those people living in at least nominal democracies who are permitted to take that action). As people in the face of such political challenges prove themselves time and time again to be passive, self-interested and easily swayed by the marketing of those mega corporations (oh the contextual irony), I’m not optimistic. And even those principled and committed individuals who are willing to fight the good fight find themselves facing well-funded behemoths (guess which company spent the most of any tech firm on lobbying Washington last year – you probably know, Barry, but I direct any of your curious readers to the company featured in your headline).

    Having said all of that Europeans can at least take solace in the fact that a net neutrality law is being drafted for the EU, which at least proves in principle that we can collectively shape the internet for the benefit of citizens, as per TBL’s original vision. But I still can’t summon up much optimism when it comes to the likelihood of political activism impacting the internet environment in the United States.

    I very much appreciate your balanced enumeration of both the good and the bad that results from Google’s “best practices” guidelines. In many if not most cases the lure of profit has actually resulted in a search engine that serves most people exceptionally well: Google has done everything in its power to make sense of the content on the web and understand users’ needs so that it may produce more effective advertising (and in that way we all owe a small debt to PPC technology for funding and providing incentive for the task of indexing and organizing the world’s publicly-available digital information).

    On the flip side ultimately – as you note – when Google’s dictates don’t result in a “win-win” situation, the likeliest outcome is “Google wins.” In advertising, yes, we get the results like those you picture (though I can’t help but note that whether or not it’s Google or the other brands on top there has less to do with our vision of the ‘net than our vision of capitalism – that is, are we “content” to have mega corporations like Google or Amazon wipe out all competition, or do we legislatively carve out a role for smaller players? – cf. my related note on passivity and self-interest:). And Kafka would have a hard time bettering recent Google directives on outbound linking – though I wonder (since nofollow is, after all, a directive to search engines that other data consumers don’t have to respect) about the impact of these strictures outside the search ecosystem. Certainly nothing that has a chilling effect on linking can be good.

    Silver lining? If Google’s practices ever get you down, you can at least be grateful it ain’t Baidu!

    Anyway, thanks for such a thought-provoking piece Barry!

    • By on

      @Aaron, you paint a gloomy picture there, but I fear you may be 100% spot on. Having said that, as the old adage goes, all it takes for evil to triumph is for good people to do nothing.

      • By on

        As these things go, I don’t actually think the situation with Google is too grim.

        While still with their eyes on the monetary prize, their approach has always been to dominate their market space by building a superior product – rather than, say, by relying on marketing – and that approach has very obviously been successful.

        And whether for altruistic reasons or not, they’ve been champions of open source solutions, and have taken the lead in working collaboratively with other search engines. And they’re the only search engine to have taken anything like a principled stand in the face of Chinese government censorship of websites and search results.

        As for the proliferation of ads and Google-allied sponsored results in the search results, there’s no search marketer outrage that I am less sympathetic towards. Yank the organic results altogether and then I’d be upset, but advertising is the price we pay Google for indexing and organizing and making accessible the world’s digital information. The knock-on complaint that this pushes smaller organizations out of the SERPs isn’t really germane to what Google is doing – yet again that’s a function of the economic systems in which Google operates, not of a uniquely Google-esque approach.

        Finally, the experience of the EU is cause for hope, and a good object lesson that people can accomplish good things through collective action. It’s hard to be optimistic about the effectiveness of political activity in North America as we bear witness to the sale of democracy in the United States (“we” being the world – I’m in Canada), but it’s nonetheless encouraging to see some good things happening across the pond.

  7. By Steve on

    Andy, SSL is not much of an extra until you try to do things like online banking over a satellite connection. I was shocked at how staggeringly slow it was. I agree, security is important.

    But needless security becomes like security theater at the airport. People develop a false sense of security because “everything” is secure and they stop taking responsibility for protecting themselves.

  8. By on

    “Nowadays webmasters are encouraged to nofollow every link they don’t personally vouch for, and can see their sites penalised if they don’t.”

    That’s not the full story. Webmasters are now encouraged [bullied] to no-follow every link that smells like it could have been paid for and/or has an anchor that looks like a keyphrase.

    Example 1:
    I have on one of my personal blogs a couple of do-follow anchor text links in the sidebar that all go to friends’ websites. One of them is a link to my in-laws’ B&B in France. To Google this looks like a classic paid link. But it’s not. Now, in the ‘new Google world’ I have already changed this anchor text slightly so it’s not just saying ‘B&B Sarlat, France’ but ‘My in-laws’ B&B IN SARLAT, FRANCE’. This is already ridiculous, that Google is making me do this. I seriously even thought for a second about no-following it. But then I of course thought ‘fuck that, it’s ridiculous’.

    Example 2:
    Client calls me up and tells me that a friend has a relevant website and wants to feature an infographic that we have done for them because he really likes it and it would be great info for visitors to his website. Friend asks me if he can link back to the client’s website and I tell him to only use brand anchor text, not any relevant words. I basically told him to compromise user experience for the sake of making sure that Google can’t moan about that link. But this link is still looking ‘shady’ to Google and to be ‘safe’ I should actually ask to have it no-followed. Ridiculous.

    No-follow was introduced by Google so you could write a post about how crap something on a webpage is and link to it without that link being recognised as a positive signal. Clever.

    Then Google realised that they actually can’t properly tell a bought/engineered link from a link given by merit and from then on the whole ship went down, dragging everybody with it, collateral damage left right and centre.

    Having said all of this, it’s their playground, and if you want to play with them on their field you have to play by their rules or be prepared to get sent off. I accept that, as I can’t change it. I am very tired of pushy/lazy SEOs and spammers moaning about getting penalised. Play with fire in a stupid way and you get burned. Google’s mission statement is ‘organising the world’s information’, not ‘ranking your commercial website #1’. And they are better at that than anyone out there. Fact. Have you looked at Yandex search results recently? They are crap.

    The main question is ‘how can global web search be in the hands of a corporate turbo-capitalist company’? How the fuck did that happen? All of us let that happen. So we have to blame ourselves.

  9. By on

    ps: Barry, as you no-follow my comment signature link, does that mean that you don’t personally vouch for me although you know me and personally criticize the whole no-follow issue?

    The whole issue in a nutshell.

    • By on

      @Ingo: fair comment, and thanks for reminding me that WordPress automatically nofollows blog comments. Just installed a plugin that rectifies that – I keep my comment threads tight anyway, so no reason to nofollow comment links.

      • By Ingo Bousa on

        Prompt reaction! You’re on the ball Barry : )

        So, that’s probably more work for you now, because you need to moderate the blog comments. But that’s what you would have done in the first place anyway, no? I have seen most blogs put no-follow as a default on their comment sections, and I think it’s nothing but a lazy shame.
