

  • Full Steam Ahead Until 2016

    To say I’m a busy man would be a bit of an understatement. I’ve signed up a few more new clients recently which keep me very busy indeed, and on top of that I’ll be attending and speaking at a host of events over the next few months. First there’s the big one: Pubcon in Las Vegas from October 5th to 8th. I was honoured to be accepted as a speaker and will be part of two sessions: a technical SEO session with Dave Rohrer and Michael Gray, and a live site review session with Derek Wheeler, Russ Jones, Kevin Lee, and Greg Boser. I’ve always wanted to go to Pubcon, and to go as a speaker is, well, pretty awesome for me. The wife and I are taking the opportunity to do a wee bit of sightseeing as well, but I can’t hang around too long, as on October 15th I’ll be speaking at the Digital DNA conference in Belfast. There I’ll be part of a workshop together with Mark Haslam from Loudmouth Media. Mark will espouse the virtues of PPC advertising, while I will be waving the banner for SEO. Once the dust has settled from that one, I’m heading to Dublin to attend the next Learn Inbound event on October 21st, which will feature a truly awesome lineup including Rand Fishkin, Wil Reynolds, Gianluca Fiorelli, and my good friend Bas van den Beld. I’m positively giddy about that one – such a superb lineup, right here on the Emerald Isle! Recovering from the inevitable hangover will take a wee while, so I’m happy that my next event isn’t until November 12th, when I’ll be delivering an SEO workshop for the European Cancer Leagues and hopefully contributing a bit to making cancer charities more successful online. The following week I’m heading to Milan, which will be my first trip to Italy. Alas, the tourist sightseeing will have to wait as I’m there to present at Search Marketing Connect 2015, delivering a workshop on technical SEO. Joining me there are some of Europe’s top SEO speakers, including Jan-Willem Bobbink, Alex Moss, Aleyda Solis, and Gianluca Fiorelli, to name but a few.
Closing the month will be TEDx Omagh back in Northern Ireland, which boasts a pretty interesting lineup including one of the SEO industry’s more notorious troublemakers, Josh Bachynski. That ought to be a fun event to attend. In between all of that I’ll be working on all my client projects – it is easy to forget that’s what it’s all about: delivering value for clients through effective SEO. The events are great, but the client work is what pays the bills, so that must always come first. I think I’ve earned a wee bit of rest and holiday time in December, so no more events until 2016. For now, at least. Who knows what’s around the corner?

  • In memory of Dana Lookadoo

    Getting up this morning, I’d intended to write about tonight’s 2015 DANI Awards, which I felt privileged to judge and be a part of. I’d also intended to write about this blog’s shortlisting for the Blog Awards Ireland 2015 and encourage public voting. But all of that flew right out the window when I heard of the passing of Dana Lookadoo. I can’t lay claim to any sort of close relationship with Dana. Like many in the SEO industry, I knew her only from afar. I first got to know her through the SEO Training Dojo where we discussed SEO and all things digital. Dana would be especially active on our shared Skype group chat, where her wit and intellect brightened up every day. We emailed a few times and chatted on social media about SEO and related matters, but no more. Yet I felt, like so many others, that she was a powerful and bright presence in our industry and community. When Dana suffered a severe cycling accident in 2013, we all held our breath. But she survived, and began a long and tortuous road to recovery. She found solace in her faith during this time, something which is entirely alien to me, but it gave her strength and courage and for that I can only be grateful. A private Facebook group was created to keep up to date on Dana’s progress, offer support and advice, and provide financial aid where possible. I’d like to think Dana drew a lot of strength and encouragement from that group, from our continued messages of support and love, showing how much we all cared about her. Unfortunately Dana’s journey of recovery was full of pain and countless setbacks, and when her trickle of Facebook postings faded to an almost complete silence, many feared the worst. Though we still hoped, and prayed – each in our own way – that she might bounce back and emerge stronger than ever. Alas, that was not meant to be. Dana Lookadoo passed away on Monday, and yesterday her husband Ed broke the news on the private Facebook group. The SEO industry mourns her today. 
All who knew her, even from afar, will miss her presence greatly. There is talk of a scholarship to be set up in her name, and when I’ve details I’ll update with ways to donate to it. Rest peacefully, Dana. Your pain has gone away, but ours has just begun. The world is slightly less bright without you in it. Update: the website for the Dana Lookadoo Memorial Scholarship Fund is now live. Donation details can be found there.

  • Belfast Bloggers Meetup

    Last week a fresh Belfast Bloggers Meetup was held at Farset Labs. I’d been to the very first meetup, where I spoke (i.e. ranted) about SEO for bloggers, but I hadn’t managed to attend one since. This time I was asked to speak on a topic of my choosing, and I decided to make it a bit of a mix. Before my talk, two other speakers held the stage, starting with Brian O’Neill from the famous Northern Irish political blog Slugger O’Toole. As the technical guy behind the scenes, Brian shared fascinating insights about how such a popular blog is run and managed, how they moderate their comments, and the hosting requirements for their website (turns out it still runs on a shared server). Next up was Brian John Spencer, a man I’ve admired for a long time and finally got a chance to meet. A political cartoonist, blogger, and writer, Brian spoke about his awakening to blogging as a platform to make his voice heard and write about his political views and ideals, not content to let others speak on his behalf with their warped visions of what his culture should look like. Brian is an exceptionally well-read man, which was evident from the litany of quotes he peppered his talk with, from John Hewitt to George Orwell, Christopher Hitchens and Daniel Dennett. His talk was hugely inspiring and motivated me to put even more passion and drive into my writing. He also drew one of his signature caricatures of me while I was speaking – it’s awesome! Lastly I was up, and my talk went from the negatives of Google’s mission statement to monetising blogs by selling links, and I fielded a number of questions from the attendees as well, who were keen to learn more about SEO and blogging. Before the event some of the speakers and attendees were interviewed by Northern Visions TV, a local TV station, and the resulting video has been made available online.
It includes some snippets from my interview as well. The quality and participation of the bloggers meetup seem to grow with every event, so I’ll make it a point to attend every one from now on. You can keep up to date with the meetup by following @BelfastBloggers on Twitter.

  • Protect your Staging Environments

    A lot of web design agencies use online staging environments, where a development version of a website resides for clients to view and comment on. Some agencies use their own domain to host staging environments, usually on a dedicated subdomain. There is a risk involved with online staging environments: Google could crawl & index these subdomains and show them in its search results. This is a Bad Thing, as often these staging websites contain unfinished designs and incomplete content. Public access to these staging websites could even damage a business if it leads to premature exposure of a new campaign or business decision, and could get you into legal trouble. Today, whilst keeping tabs on some competitors of mine, I came across this exact scenario. The name has been redacted to protect the guilty – I’ve sent them an email to notify them of this problem, because I want to make sure their clients are protected. A business shouldn’t suffer because of an error made by their web agency.

How To Protect Your Staging Sites

Protecting these staging environments is pretty simple, so there really isn’t an excuse to get it wrong.

Robots.txt blocking

For starters, all your staging environments should block search engine access in their robots.txt file:

    User-agent: *
    Disallow: /

This ensures that the staging website will not be crawled by search engines. However, it doesn’t mean the site won’t appear in Google’s search results; if someone links to the staging site, and that link is crawled, the site could still appear in search results. So you need to add extra layers of protection. You could use the ‘noindex’ directive in your robots.txt as well:

    User-agent: *
    Disallow: /
    Noindex: /

This directive basically means Google will not only be unable to access the site, it’ll also not be allowed to include any page in its index – even if someone else links to it. Unfortunately the ‘noindex’ directive isn’t 100% foolproof; tests have shown that Google doesn’t always comply with it.
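As a quick sanity check, you can verify the effect of a blanket Disallow rule with Python’s standard-library robots.txt parser. This is just an illustrative sketch – staging.example.com is a placeholder hostname:

```python
from urllib.robotparser import RobotFileParser

# Parse the blanket Disallow rule from the robots.txt above
rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

# No crawler should be permitted to fetch any URL on the staging host
blocked = not rp.can_fetch("Googlebot", "http://staging.example.com/any-page")
print(blocked)  # True
```

Running this against your live staging subdomain’s actual robots.txt is an easy way to catch a misconfigured file before Google does.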
Still, it won’t hurt to include it in your robots.txt file.

Htaccess login

The next step I recommend is to put a password on it. This is easily done on Apache servers with an .htaccess login. Edit the staging site’s .htaccess file (or, if there isn’t one, create it first) and put the following text in the file:

    AuthType Basic
    AuthName "Protected Area"
    AuthUserFile /path/to/.htpasswd
    Require valid-user

Then create a .htpasswd file in the path you’ve specified in the .htaccess file. The .htpasswd file contains the username(s) and password(s) that allow you to access the secured staging site, in the [username]:[password] format. For example:

    john:4ccEss123

However you probably want to encrypt the password for extra security, so that it can’t be read in plain text. A tool like the htpasswd generator will allow you to create encrypted passwords to include in your .htpasswd file:

    john:$apr1$jRiw/29M$a4r3bNJbrMpPhtVQWeVu30

When someone wants to access the staging site, a username and password popup will appear. This will make your staging environment much more secure and will prevent unauthorised access.

IP address restriction

Lastly, as an additional layer of protection, you can restrict access to your staging sites to specific IP addresses. By limiting access to the staging sites to users coming from specific networks, such as your own internal office network and the client’s network, you can really nail the security down and make it impervious to access for all but the most determined crackers. First of all you’ll want to know your office’s external IP address as well as that of your client’s. This is pretty simple – you can just Google ‘what is my ip address’ and it’ll be shown straight in search results. Have your client do the same and get their office’s IP address from them. Check that you’re both using fixed IP addresses, though – if you’re on a dynamic IP address, yours could change and you’d lose access. Check with your internet service provider to make sure.
Once you’ve got the IP addresses that are allowed access, you need to edit the staging website’s .htaccess file again. Simply add the following text to the .htaccess file (the IP address here is a placeholder – use your own):

    order allow,deny
    allow from 203.0.113.10
    deny from all

This directive means that your webserver will allow access to the site for the specified IP addresses (and you can have as many as you want there, one per line) and deny access to everyone else. With those three security measures in place, your staging environments won’t be so easily found any more – and certainly not with a simple ‘site:’ command.

How To Remove Staging Sites from Google’s Index

Once you’ve secured your staging environments, you’ll also want to remove any staging websites from Google’s search results in case they already show up. There are several ways of doing this:

Use Google’s URL removal tool

In Google Search Console (formerly known as Webmaster Tools) you can manually enter specific URLs that you want removed from Google’s search index. Simply create a new removal request, enter the URL you want deleted from Google’s index, and submit it. Usually these requests are processed after a few days, though I’ve seen them handled within a few hours of submitting them. The downside of the URL removal tool is that you need to do it manually for every URL you want deleted. If entire staging sites are in Google’s index, this can be a very cumbersome process.

Noindex meta robots tag

Another way to get pages out of Google’s index is to include a so-called meta robots tag with the ‘noindex’ value in the HTML code of every page on your staging site. This meta tag is specifically intended for crawlers and can provide instructions on how search engines should handle the page.
With the following meta robots tag you instruct all search engines to remove the page from their indices and not show it in search results, even if other sites link to it:

    <meta name="robots" content="noindex">

When Google next crawls the staging site, it’ll see the ‘noindex’ tag and remove the page from its index. Note that this will only work if you have not blocked access in your robots.txt file – Google can’t see and act on the noindex tag if it can’t re-crawl the site.

X-Robots-Tag HTTP Header

Instead of adding the meta robots tag to your website – and running the risk of forgetting to remove it when you push the site live – you can also use the X-Robots-Tag HTTP header to send a signal to Google that you don’t want the site indexed. The X-Robots-Tag header is a specific HTTP header that your website can send to bots like Googlebot, providing instructions on how the bot is allowed to interact with the site. Again you can use the Apache .htaccess file to configure the X-Robots-Tag. With the following rule you can prevent Google from crawling and indexing your staging site:

    Header set X-Robots-Tag "noindex, nofollow"

With this rule, your Apache webserver will serve the ‘noindex,nofollow’ HTTP header to all bots that visit the site. By having this .htaccess rule active on your staging site, but not on your live site, you can prevent your staging websites from being crawled and indexed. Note that, like the meta noindex tag, the X-Robots-Tag header only works if bots are not blocked from accessing the site in the first place through robots.txt.

410 Gone status code

Finally, another approach is to serve a 410 HTTP status code. This code tells search engines like Google that the document is not there anymore, and that there is no alternative version, so it should be removed from Google’s index. The way to do this is to create a directive in your .htaccess file that detects the Googlebot user-agent, and serves a 410 status code.
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} googlebot [NC]
    RewriteRule .* - [R=410,L]

This detects the Googlebot user-agent, and will serve it a 410 HTTP status code. Note that this too will only work if there’s no robots.txt blocking in place, as Google won’t quickly remove pages from its index if it doesn’t find the new 410 status code when trying to crawl them. So you might want to move the staging site to a different subdomain and secure it, then serve a 410 on the old subdomain. The 410 solution is a bit overkill, as Google will remove a page from its index after a few weeks if it keeps getting a 401 Unauthorized error and/or if it’s blocked in robots.txt, but it’s probably worth doing just to get the site out of Google’s index as soon as possible.

Security Through Obscurity = Fail

In summary, don’t rely on people not knowing about your staging servers to keep them safe. Be proactive in securing your clients’ information, and block access to your staging sites for everyone except those who need it. These days, you simply can’t be safe enough. Determined crackers will always find a way in, but with these security measures in place you’ll definitely discourage the amateurs and script kiddies, and will prevent possible PR gaffes that might emerge from a simple ‘site:’ search in Google.
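The IP restriction described earlier can be illustrated with a small Python sketch. This is a simplified model of an allow-list check, not Apache’s actual access-control logic, and the network range is a placeholder:

```python
from ipaddress import ip_address, ip_network

# Placeholder office network - substitute your own fixed IP range
ALLOWED = [ip_network("203.0.113.0/24")]

def is_allowed(client_ip: str) -> bool:
    """Return True if the client IP falls inside an allowed network,
    mimicking an 'order allow,deny' rule: listed networks get in,
    everyone else is denied."""
    addr = ip_address(client_ip)
    return any(addr in net for net in ALLOWED)

print(is_allowed("203.0.113.42"))   # True  (inside the office range)
print(is_allowed("198.51.100.7"))   # False (everyone else is denied)
```

Modelling the rule like this is a handy way to double-check which visitors your .htaccess allow-list would actually admit before you deploy it.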

  • The Slow, Agonising Death of Google+

    You have to feel sorry for the engineers behind Google+. When it was first launched, many people (myself included) felt that Google had finally mastered social media. Here was a robust social platform that copied many popular features from public favourites Facebook and Twitter and added a range of newfangled goodies like Hangouts and Circles that made it more attractive. And on the surface of things, it all seemed to go well for Google+. The launch was a global news event, and initial growth numbers were amazing (before we found out that most of those figures were utterly farcical). The digital community embraced Google+ wholeheartedly, and because all of our colleagues and peers were active G+ users we felt it would just be a matter of time before our non-digital friends and family would also embrace Google’s social platform. That didn’t happen, of course. As the hype died down, we soon began to realise that the general public stayed away from Google+ almost entirely. Google kept hinting at usage figures that indicated Google+ should be bigger than Twitter, and some studies claimed it could even outstrip Facebook. But in our hearts we knew this was all a lie. Most of our friends and family outside the industry were on Facebook and some even on Twitter. But almost none were active on Google+, despite the stellar numbers emerging from Mountain View. Something was amiss. The fanfare with which Google+ was announced soon died down to a soft murmur of sporadic tweaks and increasingly ridiculous user figures that seemed plucked from thin air. Then the mistakes began to pile up, from the initial error of insisting on real names to, later on, the colossal mistake of forcing all YouTube accounts onto Google+. After this, things got quiet. Very quiet. We didn’t hear much about Google+, and the platform wasn’t being developed further. The first suspicions that it was destined for an early grave began to emerge.
The sudden departure of Google+ chief Vic Gundotra was surely a sign of the platform’s impending doom. Since then, the nails have continued to be hammered into Google+’s coffin with alarming frequency: from the revelation of the platform’s disastrous usage numbers to the spinoffs of Photos, Streams and Hangouts into separate apps, the deletion of Google+ author snippets from search results, the notable absence of Google+ mentions at official Google events, and the lower priority of the Plus button. Yet where Google has previously shown remarkably little restraint in killing off some of its failed children (Buzz, Wave, Reader, to name but a few), Google+ has yet to face the official axe. Despite the increasing obviousness of the platform’s imminent demise, Google has refused to pull the plug. It’s still there, a slowly decomposing zombie, a husk of a platform that serves as a continuous reminder of the potential it failed to live up to. It is a long, drawn-out death that must surely be painful for the engineers who birthed the platform and those who continue to work on it today. When will Google finally show mercy to its decaying offspring and put Google+ out of its misery?

  • Google is under attack, Search Engine Land to the rescue

    Yesterday the Wall Street Journal published a piece of investigative journalism [paywall] about the US Federal Trade Commission’s case against Google, which petered out with barely a sizzle in 2013. That’s what being the largest corporate lobbyist in Washington DC gets you. According to a leaked document, many people inside the FTC found Google to be engaging in fierce anticompetitive behaviour, and wanted to pursue the case further. Some examples of Google’s nefarious activities uncovered by the Wall Street Journal include blacklisting of competing websites, favouring of its own properties (well, duh), and illegally restrictive contract policies. The image below, from WSJ’s Twitter, illustrates some key elements where FTC staff found Google in breach of the law, but where the eventual settlement with Google failed to act decisively. Now that it’s revealed the FTC’s case against Google probably should have gone much further and the search engine was let off very lightly indeed, there is a strong case to be made for more far-reaching litigation against Google in Europe. But Google needn’t worry, because the waters are already being muddied by Google’s own propaganda machine, primarily in the form of its biggest cheerleading blog Search Engine Land and its sister blog Marketing Land. Greg Sterling’s initial piece on Search Engine Land starts casting doubt on the importance of the leaked FTC document straight in its subheader. The rest of the piece is fairly toothless, happily emphasising that the FTC refused to litigate against Google and instead settled the case. Unsurprisingly there’s no mention of the manifold objections against that settlement from various parties, nor of Google’s abundant lobbying efforts in the nation’s capital. But Greg does make a point of quoting Google’s chief counsel, once again reiterating that the FTC decided not to pursue.
Apparently thinking that Greg’s initial piece wasn’t pro-Google enough, Danny Sullivan then publishes a more in-depth piece on Marketing Land. The main headline starts encouragingly, but Danny quickly takes on his favourite role of Google defender and starts casting doubt on almost every aspect of the Wall Street Journal’s piece and the FTC document. In the process Danny tellingly reveals that he does not understand how antitrust investigations work, as he repeatedly says that what Google did was also being done by other search engines. Anyone with even a casual understanding of antitrust law will realise that this is entirely irrelevant: the rules change when you become a monopoly, which Google definitely is – even Eric Schmidt has had to admit that. What makes for acceptable (if immoral) competitive behaviour in a more egalitarian marketplace becomes illegal under antitrust law when you’re a monopoly. In all fairness, Danny probably understands this but still feels it important to point out that “Google wasn’t doing anything that rivals weren’t also doing”, thus casting unwarranted doubt on the FTC staff’s conclusions. Danny then goes on to link to and quote liberally from earlier posts he wrote about Google, all with his favoured pro-Google slant of course, and then adds several postscripts to further clarify Google’s defence and make abundantly clear that no, really, the WSJ piece’s most damning evidence was just part of a ‘test’. He concludes by liberally paraphrasing Google’s hastily penned PR spin. I have no doubt that when Google’s more polished official press release on this matter is released, probably in the course of today, Marketing Land and/or Search Engine Land will publish it almost entirely and make a big fuss of how it disproves the accusations made in the WSJ article and FTC document.
Fortunately Search Engine Land and Marketing Land are just enthusiasts’ blogs rather than proper news organisations, so we can hope that few policymakers will actually read their distorted propaganda. But the SEO industry will lap it all up, as they’ve always done, which can help set the tone for future debates on this issue. I’d advise everyone not to rely on a single blog or news site to inform your opinions. Read multiple viewpoints from different trustworthy sources and make up your own damn mind.

  • Friends of Search 2015

    Last week I had the pleasure of attending the second Friends of Search conference in Amsterdam. I was a speaker at last year’s inaugural event, and I liked it so much I promised I’d be back as an attendee for the next edition. As it turned out, the organisers asked me back as a panelist anyway! The evening before the event most of the speakers and organisers got together and shared a few drinks & nibbles, and this is when I finally met Michael King aka iPullRank. Mike and I have had our share of disagreements over the years, and this seemed a good opportunity to put all that behind us and instead bond over beers and cheesy selfies. Kicking it with @badams — MyCool King (@iPullRank) February 18, 2015 The next day Michael kicked the conference off with his keynote about where SEO is heading, and how to get ahead of the game. His key takeaway: don’t focus on catching up with Google; instead focus on maximising your value for the user. That’s where Google is heading, so you need to already be there when the search engine gets clever enough to properly enforce its user-centric approach. Following Michael was a presentation from a Google guy who spoke, unsurprisingly, about ads. Once again Google’s effective use of double-speak and propaganda tactics was made clear in how the Googler equated ‘search’ with ‘advertising’, never once mentioning organic search. It’s a subtle game Google is playing, but it sure is effective. All in all that talk was entirely forgettable and probably the least interesting of the whole day. That one blip was more than made up for by Richard Baxter’s awesome talk about ‘the aggregation of marginal gains’. He made an analogy between SEO and Formula 1, where small improvements made along the way result in a massive uplift at the end. Richard listed numerous small SEO tweaks you can use to squeeze every bit of performance out of a website. Next up was Cindy Krum, whose title of ‘mobile marketing evangelist’ is entirely appropriate.
Her talk about mobile SEO had a whole load of useful stats and actionable tips, including one I’d not heard before: using the Vary HTTP header for dynamically served content. Mobile SEO ranking factors from @Suzzicks at #FOS15: — Barry Adams (@badams) February 19, 2015 Another stand-out talk for me was Ian Lurie’s presentation about strategic thinking for SEO. So many SEOs believe they have a strategy for their clients, but all they really have is a collection of tactics. Ian clarified what a strategy actually is and how you go about formulating one. With the right strategy in place, success in SEO is much more likely and you can avoid many of the pitfalls that befall those who just follow the latest hypes. Another talk later in the day that I really enjoyed was Pascal Fantou’s ‘tweak geeks’, where he showed a number of technical tricks and Google Analytics hacks that can really wreak havoc with your competitors. I wouldn’t necessarily recommend using these hacks, of course, but they’re very interesting to test for, ahum, research purposes. Mwhahahaha. The superb Lisa Myers gave a strong talk about creative campaigns, showing a number of examples of successful linkbait using creative thinking and fresh angles. This was a very valuable talk for me, as so many speakers only dive into the theory – Lisa showed actual real-world case studies, demonstrating the true effectiveness of what her agency does. At the end of the day we had a panel discussion where I got to contribute modestly to the day, answering questions from our moderator Bas van den Beld as well as from the audience, together with Ian Lurie and Lisa Myers. I had at least one noteworthy contribution to that panel, best summarised by Dennis Sievers’s tweet: Best SEO tool? Common fucking sense, according to @badams #fos15 — Dennis Sievers (@resiever) February 19, 2015 Afterwards the evening’s proceedings began with free beers at the venue’s bar.
Suffice to say it was another late one, with special thanks to David, Ruud, and Jan-Willem. Friends of Search 2016 is definitely on my agenda once again – if it’s half as good as the 2015 edition, it’ll be worth every penny.

  • Learn Inbound January 2015; an Awesome Start

    Believe it or not, yesterday I found myself attending an ‘inbound marketing’ event. If you know me, you’ll know I despise that phrase and have an almost allergic reaction to it. Yet there I was, an audience member at an event organised by the Learn Inbound community of marketers in Ireland. Headed up by HubSpot’s Siobhán McGinty and the Digital Marketing Institute’s Mark Scully, Learn Inbound held their inaugural event in Dublin yesterday evening, and I made the trek down to attend it. This was not done on a whim – it takes quite a lot to overcome my innate revulsion for ‘inbound marketing’. But I’ve been following Mark Scully on Twitter for a while and find his views and content to be outstanding. And then there was the speaking roster. For a very first event, Siobhán and Mark sure managed to get some great speakers: Matthew Barby, one of the digital industry’s brightest young talents, Stephen Kenwright, head of search at Branded3, and Aleyda Solis, who really needs no introduction. Matthew kicked things off with a great talk about effective content marketing for SMEs, running us through his process. He showed us a range of tools as well, and it wouldn’t be the last time that evening that BuzzSumo got mentioned. What I especially liked about Matthew’s talk was when he talked about building a content delivery team. So many organisations go through the motions of developing a content marketing plan, but then fail to deliver on it. Matthew’s talk didn’t shy away from that and he managed to show us the value of putting a delivery team in place. Next up was Stephen Kenwright, a well-known figure in the industry, who gave us an entertaining and highly insightful talk about how your content marketing should align with your SEO efforts. Stephen almost didn’t get to the event, courtesy of endless flight delays, but after nine hours at the airport his flight managed to take off just in time for him to make it. 
Some of his key points were not to chase after keywords, but after the consumer – i.e. don’t let keywords dictate your SEO, but distil the actual search intent from your keyword research and let that drive your content. Using the Google Search Quality Guidelines as his touchstones, Stephen showed how focusing on user intent and tapping into your expertise allows small businesses to compete with big organisations in SEO. Last but definitely not least was Aleyda Solis, who gave a great talk about SEO for small businesses and showcased a whole range of tools and tactics that SMBs can leverage to improve their visibility in search. She too demonstrated that you don’t need huge budgets to win in search, but you do need to be smart and use all the opportunities at your disposal. I especially liked her tip on agile marketing: being able to outmanoeuvre big companies by adopting an agile and iterative approach to your marketing efforts. Big businesses can rarely make quick changes to their website, so as a small business you can respond much quicker and test things out at a vastly greater speed. That gives you a competitive advantage over the big boys. The event concluded with a great Q&A where the attendees got the opportunity to ask the panel some in-depth questions. Then we were rushed out of the building by an angry security guard who was miffed the event had overrun a bit. I really enjoyed the event and it was great to catch up with Aleyda, as well as meet Mark & Siobhán, Matthew, and Stephen. I’ve been to paid conferences that were less well organised and informative. There are three more Learn Inbound events planned for 2015, with the next one scheduled for April 15th. The speakers for it have already been announced, and none other than Richard Baxter is on the panel. I’m definitely not missing the opportunity to heckle him.

  • What Every Web Developer Should Know About SEO

    The problem with SEO is that it is often controlled by marketers. Marketers aren’t inherently bad people, but when you get a bad one (of which there are many) any information you receive around SEO is going to be filled with buzzwords and soft outcomes. From a development point of view, SEO is the concern of how well a robot can read and understand your content. As we will see, a robot being able to read your content easily is normally a good thing for humans too. The following sections explain several topics that are clearly within the developer’s remit; a good understanding of their impact for both humans and robots will help in any project you work on.

Site Speed

How fast your site loads – and is perceived to have loaded – is a highly technical challenge. Assets need to be as small as possible for transmission while maintaining high quality. You should care about how many network requests are being made per page load. You need to care about perceived page load, so getting content onto the screen as quickly as possible. The order in which things come down the network is important. A global internet means not everyone is accessing your site on a broadband connection. Mobile internet means you can’t guarantee the transmission of data will even complete if it takes several cycles.

Why Site Speed is good for SEO

Site speed has been listed as one of Google’s ranking factors. Naturally, the faster the site, the higher potential score you will get for this one part of their algorithm. According to Moz’s breakdown of website speed and ranking, the key factor is the time it takes for the first byte of information to come across the pipes. If a search engine’s crawlers can download the contents of your page quickly, they are going to do it more often than if it takes seconds per request. When people are researching for an article they are writing, they are more likely to stick around and read a page that responded quickly.
Content that responds quickly is therefore absorbed by more people and has a greater chance of being linked to by someone.

Why we should care about Site Speed anyway

Even if you don't care about SEO, you can't argue that slower is better; there are several studies showing that faster page loads are better for everyone. Take this KissMetrics write-up, for example. Slow speeds can also indicate a query that is taking too long or a memory leak happening somewhere; if so, your site may not be using your server's resources efficiently, and you may be spending money on a package you don't actually need.

Redirects

Redirects are the hoops your server jumps through when a browser asks for a page at a particular URL but the server knows it lives at a different location. There are several things that need to be considered:

• Over the lifetime of your website, potentially thousands of other sites will link to pages you had long since forgotten about.
• You can do redirects at various levels, and each one comes with maintainability issues.
• If done wrong, they can have a negative effect on your site.
• They can be broken for months before someone notices.
• Each redirect has an implied latency.

Why Redirects are good for SEO

Search engines like there to be one canonical place for everything, so if you have two paths that lead to the same content, this is confusing for them. If instead, any time someone requests one of those paths you automatically redirect them to the other, the search engine doesn't have to worry about several places. This comes into play heavily when content moves completely, perhaps between domains. Doing redirection well ensures that any past page authority is transferred to the content's new home.

Why we should care about Redirects anyway

Nobody likes dead links, and they can easily appear when something major about the structure of your site changes (domain name, internal structure).
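Server-side, a permanent redirect is nothing more than a status code and a Location header. A sketch of the idea using Python's standard library (the legacy paths in the mapping are made up purely for illustration):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping of legacy paths to their new homes.
LEGACY_PATHS = {
    "/old-article": "/blog/new-article",
    "/products.php?id=7": "/products/purple-beans",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in LEGACY_PATHS:
            # 301 tells browsers and search engines the move is permanent,
            # so any accumulated page authority follows the content.
            self.send_response(301)
            self.send_header("Location", LEGACY_PATHS[self.path])
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/plain; charset=utf-8")
            self.end_headers()
            self.wfile.write(b"Hello")

# Illustrative usage:
# HTTPServer(("127.0.0.1", 8000), RedirectHandler).serve_forever()
```

In practice you would configure this in your web server or framework rather than hand-roll it, but the mechanism is the same at every level.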
If a user goes to your site and gets a 404, they are not going to try subtle variations of the URL to get to the content; they will move on to the next site. Even if the link isn't dead, people don't like jumping between five different URLs before getting to the content, and if done poorly this can result in multiple network requests, which is inefficient.

Status Codes

Status codes are the codes returned from your server after a request has been made, and as a developer you need to make sure you are returning the correct code at any given moment. If you return a status code of 500 but meaningful content is still returned, will a search engine index it? Will other services? Search engines care a lot about the 3xx redirection status codes, and if you have used a CMS to build your site it sometimes isn't apparent which codes are being used where.

Why Status Codes are good for SEO

The status code returned is one of the primary signals a search engine has for deciding what to do next. If it gets a 3xx redirect notice, it knows it needs to follow that path; if it gets a 200, it knows the page has been returned fine; and so on. Making sure all your content returns a 200 code and all your redirects appropriately use the 301 code means search engines will be able to efficiently spider and rank your content.

Why we should care about Status Codes anyway

Search engines are not the only things that might care about the content on your site; browsers, plugins, and other sites (if you have built an API) all could potentially care about what code is returned. They will behave in ways you might not expect if you return invalid or incorrect codes.

Semantic Markup

Semantic markup is markup that has inherent meaning associated with it; a simple example would be knowing that the <h1> element is going to be the overarching heading for the section you are in.
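The status-code behaviour described above is easy to audit: a short script can follow a redirect chain by hand and record every code it sees, which quickly exposes pages returning 200 where they should 301, or chains hopping through several URLs. A rough sketch (the function name is mine, and this is nowhere near a production crawler):

```python
import http.client
from urllib.parse import urlsplit

def status_chain(url: str, limit: int = 5) -> list:
    """Follow redirects manually and return every status code seen along the way."""
    codes = []
    while limit > 0:
        parts = urlsplit(url)
        conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                    else http.client.HTTPConnection)
        conn = conn_cls(parts.netloc)
        target = parts.path or "/"
        if parts.query:
            target += "?" + parts.query
        conn.request("GET", target)
        resp = conn.getresponse()
        codes.append(resp.status)
        location = resp.getheader("Location")
        conn.close()
        if resp.status in (301, 302, 307, 308) and location:
            # Relative Location headers are resolved against the current host.
            url = location if "://" in location else f"{parts.scheme}://{parts.netloc}{location}"
            limit -= 1
        else:
            break
    return codes
```

A healthy page yields `[200]`, a well-behaved moved page yields `[301, 200]`, and anything longer is a chain worth flattening.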
There are some very subtle things that should be considered when choosing markup:

• When content should use sectioning elements such as <article>, <section>, and <aside>.
• When it makes sense to additionally use semantic attributes, for example those suggested by schema.org.
• Be prepared to make CSS changes to accommodate the default styles; remember there is a difference between design and function.
• Don't use an element just because you can, in place of the one that actually fits the content.
• You have to realise that all elements come with an inherent semantic value (even if that value is to state "I have no semantic value").

Why Semantic Markup is good for SEO

Semantic markup is excellent for SEO because you are literally giving the content on your page meaning that a search engine can easily understand. When you use the schema.org suggestions for a review, search engines will know that when you say 3/5 at the end, you mean you have scored it 3 out of 5, and they will potentially show that number of stars on their search results page. Semantic markup also lets you group and link content. The old way of thinking was that a page could have one <h1> element, normally reserved for the name of the site; now, because of the likes of <article> and <section>, we can have groupings that make sense. This means search engines have a much easier time parsing longer articles.

Why we should care about Semantic Markup anyway

We should care about this anyway because search engines are not the only things looking at our site. Assistive technologies such as screen readers can use semantically marked-up documents much more easily. For example, when you mark up content with an <aside> element, some assistive technologies know to leave it out of the main content when reading aloud to a visually impaired user. Maybe your user can't concentrate on large articles with lots of information; by semantically breaking down that information, they can clip what they need and view it how they like to view things. Search engines aren't the only robots out there looking at your site.
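It only takes a few lines to act as one of those robots yourself and pull the heading outline out of a page. A sketch using Python's standard html.parser, purely as an illustration of how cheap semantic structure is to consume:

```python
from html.parser import HTMLParser

class HeadingOutliner(HTMLParser):
    """A tiny 'robot' that extracts the heading outline from an HTML page."""

    def __init__(self):
        super().__init__()
        self.outline = []      # (tag, text) pairs, in document order
        self._current = None   # heading tag we are currently inside, if any

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self._current = tag
            self.outline.append((tag, ""))

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current:
            tag, text = self.outline[-1]
            self.outline[-1] = (tag, text + data.strip())

parser = HeadingOutliner()
parser.feed("<article><h1>Purple Beans</h1><p>...</p><h2>Growing</h2></article>")
# parser.outline is now [('h1', 'Purple Beans'), ('h2', 'Growing')]
```

A document with a sensible heading hierarchy yields a useful outline; a page built entirely from anonymous divs yields nothing at all.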
Other services could hit your site and look for the likes of a CV; if you have used the correct markup and semantics, that becomes an easy task.

URL Structures

URL structures are what you see when you look in the address bar. Getting these structures right requires some thought and some technical knowledge:

• Do I want a deep, nested structure or a shallow one?
• Are the structures consistent across my site?
• Are the structures meaningful to anything but the site's code?
• Is there a logic to them that a new developer could follow and add to?

Why URL Structures are good for SEO

A good URL structure is good for SEO because it is used as part of the ranking algorithm on most search engines. If you want a page to rank for "purple beans" and your URL contains those words, search engines will see that as a good sign that the page is going to be dedicated to the discussion of purple beans. The URL will also appear in search results, and if it makes sense, people are more likely to click on it than if it is a jumble of IDs and keywords. A good URL will even serve as its own anchor text: when people share the link, often they will just dump it out onto the page, and if the structure makes sense, it will allow your page to rank for those terms even without someone setting the link up correctly.

Why we should care about URL Structures anyway

Outside of the context of search engines, we encounter URLs all the time, and as users of the web we appreciate it when things are kept simple. Your users will appreciate it when a URL coming off your site just makes sense; if they can look at a URL and remember why they have it in a list without needing to click into it, that is a big win. Over the lifetime of a website you will be surprised how much of your own admin will involve looking at the structure of your URLs. If you have taken the time to do them right, it will make your life much easier.
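One common piece of this puzzle is turning an article title into a clean, readable URL segment. A sketch of a simple slug generator (the exact rules here are a design choice of mine, not something the article prescribes):

```python
import re
import unicodedata

def slugify(title: str) -> str:
    """Turn an article title into a clean, readable URL segment."""
    # Normalise accented characters to their closest ASCII equivalents.
    ascii_title = (unicodedata.normalize("NFKD", title)
                   .encode("ascii", "ignore")
                   .decode("ascii"))
    # Lowercase, collapse runs of non-alphanumerics into single hyphens.
    return re.sub(r"[^a-z0-9]+", "-", ascii_title.lower()).strip("-")

print(slugify("Purple Beans: A Buyer's Guide!"))  # purple-beans-a-buyer-s-guide
```

Whatever rules you choose, the important thing is that they are applied consistently, so that every URL on the site follows the same predictable logic.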
A note about JavaScript and SEO

I wanted to end by mentioning JavaScript very briefly. A lot of the websites you create are going to be JavaScript-driven, or at least rely on it very heavily. There are various schools of thought on whether this is a good thing or not, but the fact is JavaScript has happened! It used to be that search engines couldn't even follow a link that used a JavaScript onClick function; they have come a long way since then and can do an excellent job of ranking sites that are made completely in JavaScript. That being said, search engines are not perfect at this task yet, so the current advice still has to be that if you want something to be seen by search engines, you should make sure there are as few things as possible blocking them from seeing it.

  • My Predictions for Organic Search in 2015 and Beyond

    Recently I was asked by Danny Denhard to contribute to a collective post on the direction organic search will be heading in 2015. My thoughts were published alongside those of leading SEO industry experts like Stacey MacNaught, Kevin Gibbons, Dan Sharp, Paddy Moogan, Matt Beswick, Paul Rogers, Patrick Hathaway, Stephen Kenwright, Carl Hendy, Simon Penson, Justin Butcher, and Michael Briggs. It's a great collection of different perspectives on the future of search, and you can see some areas of overlap where many of us believe search and SEO are heading. Every individual contribution is worth reading. Below are my own thoughts, though I recommend you read the whole piece (it's also available as a downloadable PDF). If there's one complaint I can make, it's that the contributions to Danny's article are very heavy on XY chromosomes. As Stacey's contribution is so excellent, I can't help but feel that a greater dose of female perspective would have made it an even more awesome article. I'd love to have read the predictions of Aleyda Solis, Nichola Stott, Samantha Noble, Hannah Smith, or any of the other exceptionally talented women in our industry. Anyway, here is my contribution:

With the number of high-profile changes and algorithm updates in 2014, what do you think the most important thing will be to achieve success in 2015?

First of all, you need to have a product or service that Google can't (easily) steal or copy. Increasingly we see Google wanting to be the end destination, rather than the gateway, which is why they're using websites' content to provide answers directly in their search results (Knowledge Graph), or building rival services of their own (Google Maps, Flights, Hotel Finder, credit card comparison, etc.). So make sure you have something that Google can't (yet) take over, otherwise you'll find yourself out of business very quickly. Secondly, you need to diversify your acquisition channels.
Even if you have a unique product, there's no guarantee Google will show you in its search results. Organic search is likely to be your biggest traffic driver for the foreseeable future, but if Google can't take over what you do, they'll damn sure try to force you to buy AdWords ads. So don't rely too much on a single traffic source. Diversify your digital marketing channels, and do them all as well as you can.

Where or what do you think the biggest challenge will be in 2015?

Aside from preventing yourself from becoming obsolete when Google tries to move in to your niche, your biggest challenge is to do effective SEO whilst steering clear of Google penalties and algorithmic filters. There has always been a conflict between what works to drive organic visibility and what Google recommends you do. In recent years this conflict ignited into full-on war when Google started putting the onus of cleaning up its search results firmly on webmasters rather than on its own webspam team, and they've been liberally handing out penalties ever since. The problem is of course that effective SEO often breaks Google's guidelines. So, to avoid getting penalised, you have to be smart about what you do, how you do it, and the tracks that you leave for Google to discover. That'll be your biggest challenge. I also expect Google will keep shifting the goalposts, and tactics that work fine now will be re-designated as spam at some point down the line. Preparing for that can also be quite challenging, and you'll need contingency plans for when it happens.

Thinking about how your industry or your clients' industries are going, what's the best piece of advice that you give all clients or prospective clients in the coming weeks/months?

As per the first point, diversification is key. I'm an SEO guy through and through, and I'm advising my clients not to rely purely on SEO.
Yes, organic search is and will remain the strongest driver of growth for nearly all websites out there, but due to the increasingly adversarial attitude Google is adopting towards the web, you can't rely on organic search indefinitely. Google wants you to buy ads, so you'd better suck it up and start an AdWords campaign. When those paid visits arrive on your website, work hard to convert them into customers; use CRO and UX to make your website deliver tangible results for your business. And when you've won a customer, do whatever you can to keep them: use email marketing and social media effectively to retain business, so you don't have to keep paying Google for the privilege of sending you new customers.

What do you predict will be the biggest or hardest-hitting change in 2015?

I'm not sure if 2015 is the year, but I'm convinced that in the near future Google will start using brand sentiment as an alternative to link-based metrics in its ranking algorithms. A positive brand sentiment, as evidenced through positive customer reviews and mentions online, will become a crucial factor for businesses that want to gain visibility in search. If and when this is rolled out, expect to see a massive shift in search visibility for some major brands, as well as for many smaller players. Sentiment analysis is, however, a notoriously difficult nut to crack, but I'm seeing all kinds of interesting technologies appear in this space, so it's definitely something to keep an eye on and prepare for.

Lastly, if you were Google, what would you do to improve quality and search results?

If I were Google, I'd make damn sure I recognised my place in the online ecosystem. Google has broken the unofficial agreement they had with the web: that they could take websites' content to show in their search results, and in return they would send relevant traffic back to those websites.
Now Google believes that they shouldn't be the middleman but instead the final destination, using whatever means are at their disposal to keep people on their own sites so they can harvest more personal data and show more ads. Google still takes all your content, but increasingly it doesn't send traffic to your site; it wants you to pay for that traffic through AdWords advertising. It's a destructive development for the rest of the web, causing great harm to the online businesses whose websites Google used to build its empire in the first place. Google needs to realise its position in the ecosystem and stop chasing profit maximisation to the detriment of everything and everyone else. Read all contributions in the full article here.

  • Awards and Events

    You can tell it's awards season, because just after the DANI Awards concluded (with a win for me, I'm pleased to say) the UK Search Awards are kicking off. As the UK Search Awards celebrate the true cream of the crop in the national SEO and search marketing industry, it's with no small degree of pride that I can say two of the projects I headed up at The Tomorrow Lab are on the shortlist:

• Best Use of Content Marketing – Digital Printing
• Best Low Budget Campaign – Path XL

The awards ceremony will be held in London on November 6th and, as usual, I'll be attending, primarily to seize the opportunity to catch up with all my friends in the SEO scene. Should The Tomorrow Lab actually win one of these prized awards, I'm sure the resulting celebrations will be appropriately enthusiastic. In other news, after a very successful 6th SAScon event in June this year, the organisers are once again putting together a mini-event, this time dubbed SAScon BETA. Once again I find myself privileged to be part of the event, and in contrast to the usual digital marketing insights, this time the speaker brief is much wider. I'm preparing an especially rant-y talk aimed at wearable technology and what it's doing to us, so if you're open to being scared witless about the implications of those gadgets you're putting in your pockets and on your wrists and heads, book your tickets now and brace yourself for a truly epic rant. Closer to home, on October 28th, in collaboration with VIEWdigital, I'm giving a 2-hour seminar entitled Measuring Website Performance with Google Analytics and Webmaster Tools. In this seminar I will go beyond the basics and show participants a range of interesting tricks and reporting tips to get useful, actionable insights from Analytics and Webmaster Tools that can be used to improve your website's performance. Attendance at this seminar costs £49 and places are limited, so don't wait to book your place.
Then the next day I’m part of the panel at the SEO Masterclass organised by the Sunday Business Post. Together with Joanne Casey, Mark Haslam, and Barry Hand, I’ll be educating you on the ins and outs of effective SEO. With such a top line-up of speakers, this masterclass is unmissable for those wanting to come to grips with the esoteric arts of SEO. Here too places are limited, so book while you can.

  • DANI Digital Industries Person of the Year

    On Friday the 4th annual DANI Awards took place, celebrating the best in the digital industry in Northern Ireland. For me the awards are one of the highlights of the year, where so many of my friends and colleagues in the local digital industry get together to have a great time. This year it was extra special for me, as I was shortlisted for the final award to be handed out that evening: Digital Industries Person of the Year. It was between myself, Louise McCartan from Search Scientist, and Victoria Hutchinson from Ardmore. As you can deduce from my smug grin in the following photo, I won. Photo credit: Darren Kidd / Press Eye Being awarded such recognition for what I love doing is pretty awesome, but – as with everything in life – it's never a purely self-made achievement. In my 17 years of working in the digital industry, so many people have helped me out and given me such great support, advice, and opportunities that it would be impossible for me to thank them all. Nonetheless, whilst committing the unforgivable sin of leaving out so many who should be mentioned, I do want to highlight a few people who've been there for me over the years and without whom I'd never have come this far. First and foremost I want to thank the team – past and present – at The Tomorrow Lab and the Pierce Partnership, who are arguably the best collection of industry experts out there and who I've enjoyed working with immensely. Next, a big shout-out to everyone involved in the Digital Exchange networking group, who've had to endure more than a few of my rants, and who welcomed this vocal and slightly obnoxious immigrant warmly into their midst. And the lovely folks from the Ulster University's DMC programme: you're all awesome, and it's my honour and privilege to contribute, however modestly, to the education of the next crop of digital marketing superstars.
There are so many more to mention, but the list would go on forever and I'd still manage to leave out names that deserve thanks, so I'll just conclude with the most important people in my life: my friends, who keep me honest; my family – Mom, Dad, Marlies, Monica, and Jackson – who always have my back; and my wife Alison, who is the reason for everything I do. Ever since I arrived here a bit more than five years ago, Northern Ireland has been incredibly welcoming and kind to me. This wee country punches way above its weight, and I'm immensely proud to call it my home. Northern Ireland has brought out the best in me, and I'm nowhere near done yet. :) Onwards and upwards!

bottom of page