
  • Optimize Your Content For The Online Purchase Funnel

    I’ve spoken about the importance of a well thought-out clickflow that guides your website’s visitors from page to page. This clickflow is the basis of turning visitors into customers. An internet user goes through several phases of information requirements in the process of searching for and purchasing a product online. There are many similar versions of this purchase-funnel model, and for this blog post I’ll stick with a simple one: Discovery, Consideration, Purchase.

    In the first discovery phase the user types generic terms into Google, hoping to find websites that offer generic, high-level information about the type of product she’s looking for. You’ll attract visitors in this phase by optimizing your product- and category-overview pages for this type of information. Include generic terms that users will search for and make sure your website contains the right type of high-level information about your products. Don’t get too technical here; detailed specifications are appropriate for the second phase.

    In the consideration phase you can present much more detailed information about your products. These webpages should sit a level below the overview pages in your website’s navigation tree, and here it’s important to be more specific. In this phase a user wants to see specifications and comparisons between the products she found in the discovery phase. Ideally you’ll want a system that compares products based on defining features and functions. You’ll also want to include statements that may persuade a user to choose you as a vendor: mention your company’s and product’s unique selling points.

    In the third purchase phase you’ll want to repeat these persuasive arguments on your website so that the user is affirmed in her decision. It’s also important to make the purchase process as smooth and simple as possible, so that she doesn’t encounter any obstacles before completing the purchase.
    The order form needs to be clearly indicated and easy to use. Remember: you should strive to help your customers instead of merely selling to them. There are many more aspects of your website that can help you turn visitors into customers, but a solid clickflow and content optimized for the user’s purchasing process are the essential ingredients. Get these right and you’ll have a solid foundation to build on.

  • Moving domains without using 301 redirects – only works temporarily?

    I discovered a bit of an anomaly with Google today. Like many people I do ego searches now and again to see how my websites and social media profiles rank on Google. For the past year and a half, my website www.barryadams.co.uk has been the primary search result for the ‘barry adams’ query on google.co.uk.

    Before I launched barryadams.co.uk I had the same website on a different domain: www.greatwebsitesblog.com. When I bought the barryadams.co.uk domain name, I thought I’d try a wee experiment to see if I could change domains without having to 301-redirect every URL. So I simply pointed it to the same hosting environment as my www.greatwebsitesblog.com site, changed settings in WordPress to make www.barryadams.co.uk the site’s primary address (including rel=canonical tags on all pages), and submitted a Change of Address notice in Google Webmaster Tools.

    And almost immediately it worked. Over the next few weeks all my old listings in Google search results pointing to www.greatwebsitesblog.com were replaced with the same listings pointing to www.barryadams.co.uk. On top of that it accomplished my primary goal: the www.barryadams.co.uk site started to rank for searches on my own name. And that’s the way it stayed for a year and a half, until this morning, when I did another ego search and found the old domain as the number one result instead. This despite the fact that the site has had a rel=canonical tag pointing to www.barryadams.co.uk since August 2011, despite the fact that I submitted a Change of Address notice then too, and despite the fact that all my social profiles list the new domain as my personal website. It seems Google has rewound time and decided that the old domain name should really be shown instead of the new one. I then checked how my site showed up in Google’s SERPs for a number of keywords that have been sending me solid traffic the last wee while: nothing wrong there, all results are shown with my new domain name.
    So it’s only my own ego search that returns the old domain name. Now, I thought I took good care to change all my social profiles and links to point to the new barryadams.co.uk domain name when I made the switch, but due to some historic guest blogs and defunct website profiles I still have a few mentions pointing to the old greatwebsitesblog.com domain. Yet overall, barryadams.co.uk has been a much more actively promoted domain, and as a result has established a stronger backlink profile. Even the number of ‘barry adams’ anchor texts is now in favour of the new domain. So I’m not really sure why Google has decided to revert to my old domain on ‘branded’ searches for my name, yet kept the new domain on generic searches for content. All I can think of is that there must be some old profile link somewhere that outweighs all the new ones, and thus manages to skew Google’s SERPs for my name towards the old domain. Maybe my old profile on Search Engine People is the culprit.

    What I learned: I thought I’d discovered a straightforward way to move a site across domains without having to use 301-redirects, by simply pointing the new domain name to the existing site and implementing rel=canonical tags and a Change of Address notice in Google Webmaster Tools. However, it looks like Google takes those hints into account only temporarily; after a while the slate is wiped clean and a domain name will rank on its own merits for specific search queries. So it’s best to just stick with 301-redirects, as these seem to be of a more permanent nature.

    Update: Fili Wiese pointed out that the Google Support page on the Change of Address feature clearly states that ‘changes will stay in effect for 180 days’. So yes, it’s definitely a temporary effect.
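
    For anyone who does stick with 301s, the heart of a domain move is a one-to-one mapping from every old URL to its new counterpart, preserving path and query string. Here is a minimal sketch in Python; the host names are taken from this post, and the function name is my own invention:

```python
from urllib.parse import urlsplit, urlunsplit

OLD_HOST = "www.greatwebsitesblog.com"
NEW_HOST = "www.barryadams.co.uk"

def redirect_target(old_url):
    """Map a URL on the old domain to its equivalent on the new domain,
    keeping path and query intact so every old URL gets a 1:1 301 target."""
    scheme, host, path, query, fragment = urlsplit(old_url)
    if host != OLD_HOST:
        return None  # not a URL we should redirect
    return urlunsplit((scheme, NEW_HOST, path, query, fragment))

print(redirect_target("http://www.greatwebsitesblog.com/about/?ref=x"))
# http://www.barryadams.co.uk/about/?ref=x
```

    The actual 301 response would be issued by your web server; this sketch only computes the target each old URL should point at.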

  • The ‘learn to code’ fallacy

    A persistent theme that recurs every now and then in the online tech world is that ‘everyone should learn to code’. There are a slew of blog posts claiming that to be a great SEO you need to know how to code, and a recent VentureBeat article once again proclaimed programming the most vital skill for successful entrepreneurship. This is, of course, utter bollocks.

    Don’t get me wrong, I have a great deal of respect for coders. I work with some highly talented web developers and not a day goes by that I don’t marvel at their skills. But their skills are not essential for being a great SEO, and they’re not essential for being a great entrepreneur.

    First of all, I think a lot of this ‘learn to code’ hype stems from the perception that many of today’s most admired entrepreneurs started out as basement coders. From Mark Zuckerberg to Larry & Sergey, today’s biggest tech companies are the creations of coders. Therefore, the reasoning goes, you need to be a great coder to be a successful entrepreneur. And that is, of course, blatantly wrong. Such seemingly logical reasoning is premised on a thorough misunderstanding of what makes a successful tech company. The primary reason these companies are successful – in addition to exorbitant amounts of luck – is not that they were built by coders, but that they solved a problem. Google solved the online search problem more elegantly than anyone had up to then, and that made it the most popular search engine in the world. Facebook solved the mess that was social media and turned it into a smooth, nearly frictionless experience. And that made it the biggest social network on the planet.

    The fact that Page, Brin and Zuckerberg are coders is secondary to their most important trait, the aspect of their personalities that is directly responsible for their success: they’re problem-solvers. Problem-solvers come in all shapes and sizes. Some of them are coders, but many aren’t.
    It’s true that problem-solvers tend to be drawn to writing code, as it allows them to create technical solutions to problems, but the ability to code is a symptom – not the cause – of their problem-solving ability. And not all coders are great problem-solvers. I would argue that the ability to analyse a problem – and I define ‘problem’ loosely here, encompassing everything from shoddy user interfaces to unintuitive online interactions – and devise an elegant solution is the real hallmark of successful entrepreneurship. For every Zuckerberg you can easily find several highly successful entrepreneurs who couldn’t code a simple ‘hello world’ script if their life depended on it. But I can pretty much guarantee you that all great tech entrepreneurs have one thing in common: they’re very good at analysing and solving problems.

    The next time someone claims that the ability to write code is the key to success in our modern world, call them out on their bullshit. Because that idea is based on a deep misunderstanding of what makes technology successful.

    Be sure to also read this post from Richard Shove on the matter.

  • SEOs: Google is not your friend

    Search engine optimisation is an adversarial industry. By ‘adversarial’ I don’t mean that SEOs fight amongst themselves (though some certainly do), but that the business itself is one of conflict: conflict between search engine optimisers on one side, and search engines on the other. Many seasoned SEOs will already know this, but many younger and inexperienced search engine optimisers may not fully grasp this particular fact yet: Google is your enemy.

    SEO is about making websites perform better in search engine results. At its core, SEO is an attempt to do better than Google. An SEO is basically saying: “how Google has ranked these sites is not correct. Let me fix that.” Google doesn’t appreciate that. In a perfect Google world, there would be no SEO. Websites would rank because Google – and Google alone – finds them most relevant.

    Matt Cutts is the Google employee who most directly and visibly deals with the SEO industry. Through blog posts and comments, webmaster videos, conference appearances and interviews, Matt is spreading the Google gospel among the SEO crowd. In case you didn’t know, Matt Cutts is the head of Google’s webspam team. Let that sink in for a moment. The Google guy most involved with the SEO industry is responsible for dealing with spam in Google’s web search. That right there tells you all you need to know about how Google perceives SEOs. We’re spammers. We’re evildoers who pollute Google’s immaculate search results with our vile schemes and devious tactics. Google sees us as its enemy. And in many ways we are. SEOs believe websites need to be optimised to show up for the keywords that are relevant to them. Google thinks it is perfectly capable of determining for itself which website should show up for which keyword. It’s a continuous struggle, a tit-for-tat that won’t end until search engines themselves cease to exist.
    (The engineering discipline that concerns itself with fighting spam in search engines is called AIR – Adversarial Information Retrieval – which underlines the adversarial nature of the SEO business.)

    But, you say, what about all the help Google is giving to SEOs? Their documentation, their videos, their blog posts? They must like us, they’re actually helping SEOs! No, they’re not. What Google is trying to do through its ‘support’ – be it Matt Cutts’ webmaster videos, the SEO Starter Guide, or anything else – is make SEOs build spam-free websites. Google has realised it can’t kill SEO, so it’s decided to try and convert the industry instead. Google is trying to ‘educate’ SEOs in the error of their ways and convert them to the Gospel of Google. Basically, Google wants SEOs to do the hard work for it: delivering websites that are easily crawlable and spam-free, so that Google can more easily decide which is the most relevant result. And it’s working. Whole swaths of the SEO industry listen to Cutts’ every word, strictly adhere to every Google guideline ever published, and try their very best never to offend Google. These SEOs are trying to be Google’s friend.

    Hey, guess what? Google is not your friend. Google is your enemy. No matter how nice you try to be in your SEO practices, how strictly you adhere to the big G’s guidelines, Google will always see you as the enemy. To them you are vermin, a blight on the purity of the world wide web. If Google had its way, SEOs would be eradicated from the internet. So don’t for one second think that Google is your friend. It’s not. Google hates SEOs. No matter how affable Matt Cutts is – and he seems like a genuinely nice and smart guy – he is not your friend. His role within Google is to make you obsolete. Real SEOs aren’t chummy with Google. Real SEOs aren’t invited to the Googleplex for coffee. Real SEOs don’t make cameos in Cutts’ webmaster videos.
Google is too scared that somehow it’ll inadvertently reveal something which a real SEO could abuse. Don’t let the friendly façade fool you. If you engage in SEO, Google really doesn’t like you. Don’t ever lose sight of that.

  • W3C compliance – is it a requirement?

    A recurring recommendation from SEO agencies is W3C compliance. I’ve written about the benefits of W3C-compliant code before, but my perspective has changed a bit over time, and I feel it’s important to point out that full W3C compliance is not a definitive requirement for an effective website.

    W3C compliance basically means that the HTML and CSS code a website is built with fully complies with the standards set by the World Wide Web Consortium (W3C for short). The W3C is an international standards organization, founded by the inventor of the web, that develops the standards on which the world wide web runs. You’d think that making sure your website’s code complies fully with these standards is pretty important. And it is, up to a certain point.

    You can easily find out if your website’s code is W3C compliant: simply submit your website URL to the W3C Validator tool and you’ll get an overview of all the ‘errors’ in your code. And you’ll almost certainly get a lot of ‘errors’. It’s very unlikely your website’s code complies with all of the W3C’s standards. I say ‘errors’ because often they’re not really errors. The W3C standards are extremely strict, with no room for interpretation, so every little niggle in your code, every small deviation from those strict standards, will generate an error in this validation tool.

    W3C compliance for browsers

    Most web browsers are flexible pieces of software, built to deal with a wide range of different sorts of HTML and CSS code, and will probably render your website perfectly regardless of how many errors the validation tool shows. Often web developers have to use shortcuts and non-compliant code to make something work in a particular way on a website, and while this results in validation errors it doesn’t hinder the website’s functionality at all. Quite the contrary: sometimes you have to break the W3C’s rules to get something to work exactly how you want it in every web browser.
    W3C compliance for SEO

    There is also the misconception that search engine crawlers require a website’s code to be 100% W3C compliant, or else they will rank the site lower in the SERPs. A lot of SEO agencies recommend you make every webpage on your site fully W3C compliant. This is often a costly endeavour, and quite unnecessary. Search engine crawlers, like browsers, are sturdy and flexible pieces of software that can index almost any type of code, regardless of the errors it contains. For proper crawling and indexing, a search engine needs to be able to distinguish the different elements of a webpage – style, navigation, and content – and to interpret the meaning of the content, which it does by analysing the content itself and the mark-up code used to style it. Clean, compliant HTML and CSS help in this process. Compliant code makes it easier for search engine crawlers to identify what the content on a webpage is, and what that content means. But 100% compliance, meaning zero errors in the W3C validation tool, is not only hard to achieve (especially if your website has advanced functionality) but unnecessary as well. The code just needs to be sufficiently well-structured and tidy for search engines to be able to distinguish style, navigation, and content.

    So bad code is OK?

    No, bad code is not OK. It’s still a good idea to strive towards compliant code. A website with hundreds of W3C validation errors is not a good thing. It’s likely that these errors cause the site to display differently in some web browsers (or worse, not work at all), and they can cause all sorts of trouble for both users and search engines. But if your website’s code only shows a couple of handfuls of non-critical errors, especially if they’re only small warnings, there really is little need to fix them.
For on-site optimisation your time and resources are better spent on making sure your website’s title tags, content, and other factors are fully optimised.
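
    To see just how forgiving parsers are in practice, here is a toy sketch using Python’s standard html.parser module. The markup is deliberately non-compliant (an unquoted attribute, unclosed tags) and entirely invented for illustration, yet the parser extracts the text content without complaint, much as a browser or crawler would:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the text content of a page, shrugging off markup errors."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        text = data.strip()
        if text:
            self.chunks.append(text)

# Non-compliant markup: unquoted attribute, unclosed <li> and <p> tags.
sloppy_html = "<ul><li class=nav>Home<li>Products</ul><p>Our main content"

parser = TextExtractor()
parser.feed(sloppy_html)
print(parser.chunks)  # ['Home', 'Products', 'Our main content']
```

    A W3C validator would flag every one of those shortcuts; the parser, like real-world crawlers, recovers the content anyway.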

  • Site Migration SEO Concerns – The Results

    I wrote a post summarising my research into the SEO aspects of a site migration, and I feel the time has come to look back at the migration and the lessons we learned from it. The site migration was a two-step process: we updated the design and we added new sections with fresh content. We decided to follow the recommendations outlined in my site migration blog post pretty much to the letter.

    Content: We phased in the new content one batch of pages at a time. We put a couple of new pages live, linked to them from the homepage, and waited for them to be indexed & cached. Then we put the next batch of new pages online.

    Design: The design changes were implemented gradually as well. The old and new designs weren’t radically different – the new design was more a tweaked & modernised version of the old one – so we felt it would be fine to have the old and new designs co-exist on the site for a while. We first ran a Google Website Optimiser A/B test to make sure the new design yielded at the very least a similar conversion rate. When this was confirmed, we migrated pages to the new design one at a time. The URLs all remained the same, so we didn’t have to do any 301-redirects. We used the Duplicate Content tool to ensure the HTML code and content of our key pages with high SERP rankings matched at least 90% between the old and new designs, so we wouldn’t get hit with a ranking penalty when we put the new version up. When a page was updated with the new design, we waited for it to be indexed & cached in Google and checked how its SERP rankings were affected.

    The end result was a site with a fresh design and new sections added, with minimal impact on SERP rankings. We did see some fluctuations in rankings, but these fell well within the normal daily and weekly ranking variations. We also noted that the new content started ranking fairly soon for relevant keywords, despite no direct links coming in to those pages.
    This is most likely due to the incoming link value generated across the rest of the site spilling over to the new content. It was a long and labour-intensive process, and in hindsight I’m not sure the rankings would have been impacted massively if we had just switched the site over in one go. But as organic search generates a significant portion of the site’s traffic and revenue, it was definitely better to be safe.
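
    The Duplicate Content tool used for the 90%-match check isn’t specified here, but the underlying idea, measuring how closely two versions of a page match, can be approximated with Python’s standard difflib. A rough stand-in sketch, with invented sample pages:

```python
import difflib

def page_similarity(old_html, new_html):
    """Return a content-overlap ratio (0.0 to 1.0) between two page versions."""
    return difflib.SequenceMatcher(None, old_html, new_html).ratio()

old_page = "<h1>Blue Widgets</h1><p>Our blue widgets are the best on the market.</p>"
new_page = "<h1>Blue Widgets</h1><p>Our blue widgets are the best on the market today.</p>"

# Aim for at least a 0.90 match on key ranking pages before swapping in a redesign.
print(round(page_similarity(old_page, new_page), 2))  # 0.96
```

    A dedicated tool will be smarter about separating template code from body content, but a simple ratio like this is enough to flag pages whose content drifted too far during a redesign.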

  • The Search Neutrality Debate

    With the volume of content I’m generating that can be interpreted as attacks on Google, it may appear I’m on an anti-Google crusade. But I hope you’ll believe me when I say that’s really not the case. I’m just concerned about the future of the internet and the power large corporations such as Google, Microsoft and Apple wield over it.

    There has been a lot of publicity recently about the concept of search neutrality: the idea that search engines should provide neutral, unbiased results and not favour their own properties or those they have beneficial relationships with. This debate has been raging for years, but it received a recent boost when Ben Edelman, assistant professor at Harvard Business School, published a survey which appears to show that Google does tend to prefer its own properties in its search results. Since the publication of the Edelman survey a growing mountain of criticism has emerged, with many claiming the study was deeply flawed. The consensus amongst my SEO peers does seem to be that the study is misleading and inaccurate. However, the study focuses solely on the organic search results provided by Google, and that is only part of the picture. There are other aspects of the search result pages – such as the OneBox – where Google blatantly promotes its own services, and which might require proper scrutiny.

    Adding to the search neutrality debate, Matt Cutts, head of Google’s webspam team, published a blog post yesterday in which he refers to an essay by James Grimmelman, associate professor at New York Law School, which seeks to dissect the concept of search neutrality. I’ve left two comments on Matt’s blog – both ‘awaiting moderation’ at the time of writing, now approved and visible to all – but I felt my rebuttal of the essay, as well as of the points Matt makes, deserved a separate blog post. First, I don’t think the search neutrality debate is one about webspam, as Matt seems to suggest.
    I think we can all agree that spammy websites are bad and need to be ousted from the SERPs. Nor do I think the debate is about making the search algorithms public. These algorithms are Google’s intellectual property and thus deserve full protection. Making the search algorithms public would only play into the hands of spammers and would likely ensure the search results were dominated by spam shortly afterwards, so that is definitely not a solution. In my opinion the search neutrality debate is about Google and other search engines giving preference to their own properties over those of their rivals, and I think it expands beyond the listing of organic results and should encompass other elements on the SERPs, such as the OneBox and paid ads.

    Having read Grimmelman’s essay, I have to admit I’m not terribly impressed by it. The author starts off with his eight principles of search neutrality, which I think should be labelled ‘elements’ instead – search neutrality encompasses several of the listed principles (though not all eight, in my opinion), and by defining each individually I think Grimmelman is muddying the waters somewhat. It also makes it much easier for him to subsequently shoot them all down, having narrowed each down to easily falsifiable premises. Additionally, Grimmelman erects straw-man arguments for some of the definitions. For example, for the objectivity principle the author states: “The unvoiced assumption here is that search queries can have objectively right and wrong answers.” This is a weasel phrase and a misrepresentation of the objectivity principle. Also, by ignoring the interplay of the eight principles – and by including principles which are at best circumstantially applicable to search neutrality but should not form part of a serious debate (equality and transparency) – the author distorts the actual issue at the core of the matter.

    Search neutrality is an important and topical issue.
    There are genuine concerns about dominant corporations abusing their power to consolidate their positions in the marketplace, and these concerns deserve proper investigation and debate. In my opinion Google is abusing its power and does not always have the best interests of its users in mind. After all, Google is a publicly traded corporation and as such has one overruling legal imperative: to maximise shareholder value. Users don’t necessarily factor into that.

  • Search Engine Advertising: a Step By Step Guide

    Part 1 – Choosing your keywords
    Part 2 – Writing good ads
    Part 3 – Create landing pages that convert

    Sometimes, no matter how hard you try, it’s just not possible to get your website listed high in the natural search results. Your competition is too fierce, you’re new in the market, or you have a new product launch and can’t wait until search engines index your new content – there are a thousand reasons why regular search engine optimization (SEO) isn’t the right thing for you. That’s not to say your website shouldn’t be optimized for search engines. Doing good SEO is never a bad thing, and will help your website in many different ways. But when SEO isn’t enough, you can choose to invest in search engine advertising – also known as Pay Per Click (PPC). Advertising on search engines can act as a supplement to (or even a replacement of) SEO, as it gives you high listings in search engines for relevant keywords. The downside is that these are sponsored results, and as such will yield significantly lower clickthrough rates than high ‘organic’ search rankings. Nonetheless search engine advertising, as done through Google AdWords and Microsoft adCenter, can be a very efficient and cost-effective marketing channel to generate more traffic and business for your website. In this series of articles I’ll walk you through the necessary steps to create and perfect a PPC advertising campaign. I won’t use any single search engine as an example, so my tips and advice will be generic enough to apply to any search engine marketing campaign.

    Search Engine Advertising – Step 1: Choosing Your Keywords

    I’ll start with what is arguably the most important step of your search engine marketing campaign: selecting the right keywords to advertise on. Picking the right keywords isn’t as easy and straightforward as it might initially seem. You probably know your own business inside and out and have a solid grasp of the lingo and terminology used in your industry.
    But do your customers share that lingo? As I’ve blogged about before, there are dangers to using business jargon. When your customers search for ‘barcode scanners’, advertising on ‘imaging device’ is probably not a good idea. It’s important to do good research into the search words your potential customers are using to find your website and those of your competitors. One tool you can use for this is Google’s Keyword Suggestion tool. Just select your language and region, type in one or more keywords, and get a list of related and alternative keywords that people are using in Google’s search engine. You can sort the suggested keywords by popularity, expected traffic and competition.

    Another way of using the same tool is to let it do a quick scan of your website or product page and find relevant keywords itself. Instead of using the ‘Descriptive words or phrases’ option, you select the ‘Website content’ option and put in the URL of your website or product page. Google will then look at the content, determine which keywords fit it best, and give you a list of suggested keywords. A possible problem here is that you may not use the right keywords on your website. It’s smart not to simply accept Google’s suggestions at face value, but to decide for yourself which words you want to advertise on.

    Google’s tool isn’t the only one. There are many tools out there that can help you find the best keywords to advertise on. Each search engine has its own tool, and there are other free and paid tools around to help you compile the best list of keywords for your search engine marketing campaign. Do a search for ‘keyword discovery’ or ‘keyword suggestion’ and you’ll come across dozens of websites and tools to help you further. The initial list of keywords you get this way probably isn’t sufficient to start your campaign with.
    PPC is a popular means of advertising, and in most search engines the position of your ad is determined by, among other factors, how much money you can spend on it. With a limited budget it’s not smart to focus on big, popular keywords that all your competitors also advertise on. Because of the popularity of those words there will be plenty of competition, and that means you’ll have to pay a high price to get your ad to the top of the search results. And getting to the top is important: the lower your ad is shown, the fewer users are inclined to click on it.

    This means you either need to spend a lot of money getting your ad high on search results pages for popular keywords, or you can choose to focus on more specialized, less popular search words. These more specialized keywords are called ‘long-tail’ keywords. They’re usually a bit longer than regular keywords, consisting of two, three or even four separate words. They’re not the words users tend to start with when they search, but users who do use these longer keywords tend to have a pretty clear idea of what they’re looking for. That means the traffic you get from these long-tail keywords is more likely to actually buy from you. And because these long-tail keywords aren’t used as much, you’ll have to spend less money to get your ad listed high. So while you may get less traffic, you might end up with much more bang for your buck.

    An example: say you have an online furniture store. You sell a lot of different furniture, but you specialize in colonial-style wooden furniture. You could advertise on keywords such as ‘furniture’, ‘sofa’, ‘cabinet’, ‘chair’, and so on, but these are all big, popular search words with a lot of competing advertisers. A smarter strategy would be to focus on more specialized long-tail keywords such as ‘colonial furniture’, ‘modern antique cabinet’, ‘classic style sofa’, etc.
    These words are less popular, which means less traffic but also a much lower cost to advertise on. And people using those search words already know approximately what they want, so if you send them to the right offer on your website you’re much more likely to turn them into customers. The next article in this series focuses on writing good advertisements for your PPC campaign.
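
    The long-tail trade-off above is easy to put into numbers: what matters isn’t the cost per click but the cost per customer won. The figures below are purely hypothetical, chosen only to show the arithmetic:

```python
def cost_per_customer(cpc, conversion_rate):
    """Average ad spend needed to win one customer: cost per click / conversion rate."""
    return cpc / conversion_rate

# Hypothetical figures for a head term vs. a long-tail term.
head = cost_per_customer(cpc=1.50, conversion_rate=0.01)       # e.g. 'furniture'
long_tail = cost_per_customer(cpc=0.40, conversion_rate=0.04)  # e.g. 'colonial furniture'

print(round(head, 2), round(long_tail, 2))  # 150.0 10.0
```

    Even with far less traffic, the cheaper clicks and higher intent of the long-tail term can make each customer an order of magnitude cheaper to acquire.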

  • SEO for Google News – Ranking Factors and Recommendations

    Many news organisations receive most of their website traffic from Google News – the dominant news destination for users online. Google News is a very different animal from Google Search, and SEO for Google News needs a very different approach. It’s vital for news publishers to have a good understanding of how Google News operates and what can be done to optimise their presence there. There’s an extensive FAQ on Google about how to get your site included in Google News, so I won’t discuss that topic here. Instead I’ll focus on what you can do to maximise your exposure once you’re included. The following ranking factors are distilled from my own experiences with a large regional news site, as well as various online sources including interviews with Google staff, Google’s own FAQs and videos, research papers and patents published by Google, and analyses performed by other SEO professionals. Google guards its ranking algorithms fiercely, so we don’t know how many other ranking factors come into play, nor what weight each factor has in the overall ranking algorithm.

    Google News Ranking Factors

    Original Content
    An article that is unique to a publisher has a much higher chance of ranking than an article that is taken from a news syndication feed or republished from another source. Note that Google strives to show every article under the original publisher’s banner, so content republished from other sources (such as AP and Reuters) is much less likely to show up in Google News as part of your site than your own original content. AP and other news agencies are also working hard to ensure they capture the web traffic for their own content. Additionally, if you have content that refers to an original source (e.g. “The New York Times reported that…”), Google News can detect this and will rank the original NY Times article higher.
    Timeliness: A bit of a no-brainer: news articles that are recent and tie in with current events are preferred over older articles.

    Coverage of recent developments: Nowadays Google News is able to detect updates to an already indexed article. News articles that are updated to reflect ongoing developments in a story are preferred over static stories.

    Cluster Relevancy: Google divides news articles into clusters centred on a single topic (an algorithmic feature Google calls Aggregated Editorial Interest). The more relevant an article is to its cluster, the higher it is likely to rank.

    Local source & content: If a story has a location element, Google News tends to prefer articles from publishers geographically close to the story's focus that create their own local content for it. For example, for an event in Belfast covered by both the Belfast Telegraph and a national newspaper, the Belfast Telegraph's coverage is likely to rank higher in Google News.

    Publisher Reputation: This is a complex ranking factor that in turn depends on a number of factors. One important factor for determining publisher reputation is the volume of original content per news edition that the publisher produces. A publisher that produces a lot of original content across different news editions is seen as more reputable than niche content producers and news aggregators. Google News defines 'editions' as separate categories of news, such as sports, politics, and entertainment, but also as its own country-specific versions (news.google.com, news.google.ca, news.google.co.uk, etc). It's important to note that publisher reputation is mostly independent of a website's PageRank (PR is said to be applied 'delicately' to Google News), and that reputation can differ per edition. Thus it is possible for a news site to have a great publisher reputation for politics in news.google.ca, but a very poor publisher reputation for sports in news.google.com.
    Clickthroughs: An article with a high CTR is seen as more relevant – every click counts as a 'vote' for the article – and is thus more likely to rank higher.

    On-page Optimisation: The concepts of generic SEO apply to Google News as well. Factors such as search-engine-friendly URLs, good title tags, use of header tags, strong body content, and optimised code all factor into Google News rankings.

    Images: The thumbnail images that accompany stories in Google News are usually JPEG images, have a relevant caption and alt text, and aren't clickable (i.e. the image doesn't link to another page). The latter is because Google News wants the best image to be part of the article, so users don't have to perform an additional click to see the best possible image.

    Personalisation: Recently Google has started to personalise Google News based on collaborative filtering, much like Amazon.com's recommendations system. An example: user A reads articles 1, 2, 3 and 4 on Google News; user B reads articles 1 and 3. Google News then personalises the News page for user B and shows articles 2 and 4 more prominently, as it suspects user B will want to read these as well. A research paper on Google's implementation of collaborative filtering in Google News has been published and can be read here: http://www2007.org/papers/paper570.pdf

    Google News Search Patent

    In 2003 Google filed for a patent for "systems and methods for improving the ranking of news articles". The patent was granted in 2009.
    Much has changed since 2003, so rankings in Google News very likely work differently nowadays from what this patent describes, but we can still learn a few things from the patent's ranking factors:

    - Number of articles produced by the source
    - Article length (longer = better)
    - Breaking news score
    - Clickthroughs
    - Human opinion (awards won, survey results, etc)
    - Newspaper circulation numbers
    - Editorial staff size
    - Number of associated news bureaus
    - Inclusion of original named entities (people/places/organisations)
    - Number of topics the source produces content for
    - International diversity of audience
    - Writing style (spelling, grammar, reading level)

    Some of these factors are likely still part of the Google News ranking algorithm in some form or another, such as clickthroughs, number of topics, and breaking news score. Others are unlikely to be part of the current workings of Google News (circulation, staff size). The full patent text is available here: http://www.faqs.org/patents/app/20090276429

    Recommendations

    From these ranking factors follow a number of recommendations to keep in mind when creating content for your news site, as well as for any technical changes you make to the site:

    Publish unique content: Strive to publish as much unique, original content as possible.

    Publish & update fast: By being early with breaking news, and by keeping on top of new developments, you increase your chances of ranking high in Google News. Minor article tweaks can be interpreted as a developing-story update, and are thus encouraged if applied inconspicuously.

    Develop editorial specialities: You can increase your publisher reputation in specific news editions by developing a speciality for a certain type of news. For example, you could strive to cover your regional politics better than anyone else, and thus increase your chances of outranking big news publishers in Google News for local political news.
    Optimise your site for general SEO: As with any other site, it pays to optimise things like title tags, URLs, header tags, etc.

    Images should be JPGs and non-clickable: By making sure all images used on your site are JPGs, and that images included in an article are not linked, you can increase your visibility in Google News. A good caption for your images also helps.

    Google News Sitemap: While having a Google News sitemap doesn't help your rankings in Google News, I still consider it essential to have one, if only to ensure all your content is found and indexed by Google's news spiders.

    Note that these recommendations come from my point of view as an SEO specialist, and I reckon they would benefit from a journalistic perspective. Also note that this document is a snapshot of the state of Google News as it exists now. Google rolls out updates and tweaks all the time, so these ranking factors are likely to change over time. If you're a publisher in need of SEO and want to improve your visibility in Google News, Polemic Digital offers specialised SEO services for publishers.
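    To illustrate the sitemap recommendation, here is a minimal sketch of a Google News sitemap generator in Python's standard library. The URL, publisher name, and article data below are hypothetical placeholders, and the tag set is a simplified subset – consult Google's own News sitemap documentation for the authoritative schema.

```python
# Sketch of a Google News sitemap generator (illustrative only; the
# article data is hypothetical and the schema is a simplified subset).
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
NEWS_NS = "http://www.google.com/schemas/sitemap-news/0.9"


def build_news_sitemap(articles):
    """Build a news sitemap from dicts with loc/publisher/language/date/title."""
    ET.register_namespace("", SITEMAP_NS)
    ET.register_namespace("news", NEWS_NS)
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for a in articles:
        url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = a["loc"]
        news = ET.SubElement(url, f"{{{NEWS_NS}}}news")
        pub = ET.SubElement(news, f"{{{NEWS_NS}}}publication")
        ET.SubElement(pub, f"{{{NEWS_NS}}}name").text = a["publisher"]
        ET.SubElement(pub, f"{{{NEWS_NS}}}language").text = a["language"]
        ET.SubElement(news, f"{{{NEWS_NS}}}publication_date").text = a["date"]
        ET.SubElement(news, f"{{{NEWS_NS}}}title").text = a["title"]
    return ET.tostring(urlset, encoding="unicode")


sitemap_xml = build_news_sitemap([{
    "loc": "https://example.com/news/story.html",  # hypothetical URL
    "publisher": "Example Gazette",                # hypothetical publisher
    "language": "en",
    "date": "2010-11-12",
    "title": "Example headline",
}])
print(sitemap_xml)
```

    Submitting such a file via Webmaster Tools helps the news spiders find every article, even ones buried deep in your archive.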

  • Keep Your Forms Short And Simple

    As a result, users are less inclined to type a lot of information into a website's form. Whether it's a contact form or an order form, users will be reluctant to give you their information. Many research studies show that elaborate web forms turn users away: every field you add to a form makes it more likely a user will not fill it in and simply go somewhere else. Form fields like address and phone number especially throw up barriers for users concerned about their privacy. It's therefore important to keep the forms on your website as short and simple as possible.

    A mistake I often see is that companies base their forms on their own internal wish-list of customer information. Sales people in particular want as much information on their customers as they can get their hands on. This usually leads to long forms that request a lot of information from users, often with little to no reward for the user who fills it all in. It's necessary to use forms on your website, as a form makes it easier for a user to get in touch with you. But when a form asks for much more information than you'd ask for if the customer simply phoned you, you're not likely to get a lot of submitted forms.

    Whenever you create a form for your website, keep these guidelines in mind to ensure your visitors will feel comfortable filling it in and giving you their information.

    Only ask for the absolute bare minimum. For generic contact forms, the name, email address and message fields are enough. For online order forms, only ask for the minimum information you need to properly complete the order process. Any additional field risks a potential customer turning away and going to a competitor.

    Reward your users for giving you their information. If you really, really need to ask a lot of information from your users, give them a reward that fits the amount of information you've requested.
    This reward can be a free downloadable ebook or white paper, a chance to win a prize like an MP3 player, or another reward that fits your target group. Make sure this reward is clearly indicated on the form itself.

    Give your form proper context and explanation. Don't just put a form up on a web page without any explanation. The best forms are short and simple and clearly indicate to the user what happens with their submitted information.

    Encourage your users to submit the form. Action words such as "submit now", "learn more", and "sign up today" encourage your users to fill in the form and make them feel good about doing so.

    Include a privacy policy. Link to your privacy policy and be sure it states that you will never give your users' information to any third party. Your privacy policy needs to be in plain language as well – hiding your intent behind cryptic legalese will not engender any trust. It also helps to state clearly on the form itself that you won't share your users' information.

    Use a "thank you" page. When a user submits the form, send them to a "thank you" page where you confirm what you will do with their information, such as replying to the inquiry, giving them the link to the downloadable reward, or enrolling them in the prize draw.

    Measure the submission rate. Track how many submissions you receive compared to how many page views the form itself gets. If the submission rate is very low, you'll need to tweak your form further. A submission rate of 20% is a very good figure for generic contact forms, so don't be surprised if your form does a lot worse than that.

    Use a simple CAPTCHA to ensure your submitted forms actually come from humans instead of automated spam robots. Some CAPTCHAs are overly complex and difficult to read even for humans, which leads to real people abandoning your forms instead of just spambots.

    Simple forms pay off in the long run.
    You may generate some additional work for yourself or your sales people with the limited information you receive, but it will result in many more points of contact with your clients, and eventually in more paying customers.
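    The submission-rate measurement described above boils down to a simple ratio. A quick sketch (the traffic figures are hypothetical, not from the article):

```python
# Sketch: computing a form's submission rate from analytics counts.
# The view and submission figures below are hypothetical examples.

def submission_rate(form_page_views: int, submissions: int) -> float:
    """Fraction of form page views that ended in a submitted form."""
    if form_page_views == 0:
        return 0.0
    return submissions / form_page_views


views, submitted = 1200, 150  # hypothetical monthly figures
rate = submission_rate(views, submitted)
print(f"Submission rate: {rate:.1%}")  # prints "Submission rate: 12.5%"
```

    Tracked over time, this single number tells you whether a form tweak (dropping a field, adding a reward) actually moved the needle.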

  • Build Conference – a must-attend for web designers

    I had the honour and privilege to be present at yesterday's Build conference, an annual (web) design conference hosted in Belfast's Waterfront venue. Organised by local Northern Irish talent Andy McMillan, Build is one of those conferences that provides nourishment for the design-geek's soul: cool schwag, great talks, and more Mac logos than I was comfortable with. It even boasted a caffeine monitor that kept track of the amount of caffeinated beverages consumed by conference delegates.

    From Click To Tap – Keegan Jones & Tim Van Damme

    The first talk was by Keegan Jones and Tim Van Damme, who both look and talk like stereotypical web geeks. They spoke about design for mobile, specifically mobile apps, and gave some great tips on how to make the best use of limited screen real estate and what to keep in mind when you embark on your mobile web/app journey.

    More Perfect Typography – Tim Brown

    This presentation by soft-spoken – but very intense – Tim Brown appeared to be one of those typical design-obsessive things, but somewhere halfway through the talk it suddenly clicked for me. Tim Brown makes the case that web design should start with a choice of type, as this not only colours the content (try reading a piece of text in Times New Roman, and then in Comic Sans, and see how differently you interpret it) but can also help you scale your entire design. By using your chosen font's optimal size as a starting point and then scaling up with, for example, the Golden Ratio (1:1.618), you can create a design that somehow fits well and feels right.

    The Shape Of Design – Frank Chimero

    Where the first talk was given by typical web geeks, Frank Chimero is a typical design geek – tweed jacket, hip tie, and Apple-addicted. His talk was a somewhat rambling affair about the role of a designer and what the perceived and real added value of design is.
    It all boiled down to that wearying old mantra that we have to be authentic and real and somehow try to 'tell stories', whatever that means. Don't get me wrong, it was an entertaining talk, just not particularly innovative or insightful.

    Adding By Leaving Out – Liz Danzico

    I didn't take a lot of notes during this talk, which is very appropriate, as Liz Danzico talked about the power of omission. Liz spoke about how silence can carry a lot of meaning and how white space is an active element of a design rather than a passive background. While interesting and thought-provoking, the talk lacked concrete advice – which was probably intentional, as Liz likely meant to inspire rather than lecture.

    Conquer The Blank Canvas – Meagan Fisher

    Meagan Fisher, a self-proclaimed owl-obsessive, laid out her four-step design process in this talk. She seemed a bit nervous on stage (and who wouldn't be, being stared at by 300+ geeks and nerds), but she really didn't have any reason to be, as her talk was probably the most fascinating and insightful one – for me at least. Not only did she give us great insight into how she manages her design process and deals with each facet, her slides were also the most visually astounding. The talk delivered a double-whammy: Meagan's design process gave the audience very useful tips and insights, and her slides served as a rich source of design inspiration as well.

    Due to other obligations I missed the last talk of the day, Dan Cederholm on handcrafted CSS, but as Dan's reputation precedes him I have no doubt it was a superb talk. While these talks form the core of the Build conference, they only take up one day of what is an elaborate and highly entertaining week full of activities, including workshops, a pub quiz, lectures, and even a film showing at Queen's Film Theatre. All in all, Build is a conference every self-respecting (web) designer should try to attend.
Some speakers have already been confirmed for the 2011 edition, and if you are at all involved in web design I highly recommend you try to be there.
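    As an aside, the modular type scale Tim Brown described earlier in this post is easy to sketch: start from the body font's optimal size and multiply repeatedly by a chosen ratio. The 16px base size here is my own assumption, not from his talk.

```python
# Sketch of a modular type scale: repeatedly multiply a base font size
# by a ratio such as the Golden Ratio. The 16px base is a hypothetical
# starting point, not a value from Tim Brown's talk.
GOLDEN_RATIO = 1.618


def type_scale(base_px: float, steps: int, ratio: float = GOLDEN_RATIO):
    """Return `steps` font sizes in px, from body copy upward."""
    return [round(base_px * ratio ** i, 1) for i in range(steps)]


print(type_scale(16, 4))  # → [16.0, 25.9, 41.9, 67.8]
```

    The appeal of the approach is that every heading size is derived from the body type rather than picked arbitrarily, so the whole design scales from one decision.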

  • Search Engine Advertising: a Step By Step Guide – Part 3

    Part 1 – Choosing your keywords
    Part 2 – Writing good ads
    Part 3 – Create landing pages that convert

    In part 1 of this guide to search engine advertising we discussed how to choose the right keywords to advertise on. In part 2 we showed how to write effective ads. Now, in the final part, we'll tackle landing pages.

    Search Engine Advertising – Step 3: Create Landing Pages That Convert

    You're targeting the right keywords and your ads generate a lot of clicks, so now you have all these extra visitors coming to your website. You're paying for these visitors, so you want to turn as many of them as possible into customers. The best way to do this is not to send them to your website's homepage, but to a custom-built landing page.

    Imagine you're an average internet user. You're looking for a new sofa for your living room, and you do a Google search for sofas. You see an ad on the search results page which appeals to you, so you click on it. You end up on a general furniture website's homepage, and now you have to look for their sofa section. Chances are you don't have the patience for this, and you use the back button to return to the search results and try a different website. So you click on a second ad that seems interesting, and this time you land on a webpage all about sofas. It shows you pictures of sofas, it has a nice offer for a discounted sofa, and there are many links from this page to various different categories of sofas. This site appeals much more to you, and you're likely to stick around longer and maybe even order a sofa from these guys.

    The first advertiser you clicked on made the classic mistake of sending PPC traffic to the website's homepage. From the homepage a web user needs to start his search all over again, navigating your website until he finds what he was looking for in the first place. The user has to go through more clicks and invest additional effort, something internet users are notoriously unwilling to do.
    It's much better to send the user straight to what they want to see, which is what the second advertiser does. This way the user doesn't have to find his way through your website; he immediately sees content that is relevant to his search query.

    Elements of a good landing page

    1. Relevance: First and foremost, the page you link to from a search engine advertisement needs to be relevant. Just like the ad, the landing page needs to contain the keyword you advertise on. If you advertise on the sofa keyword and your ad contains the word sofa, you can't send users to a landing page discussing kitchens or chairs – you need to send them to a page that talks about sofas. Make sure the actual keyword you advertise on is clearly visible on the landing page, preferably in a headline. This tells a visitor that the landing page is relevant to the search query they typed in to start the whole process. If you use several different phrases to say the same thing, you'll probably have to make a different landing page for each, or use dynamic HTML code to show the exact keyword the user searched for. When you advertise on many different types of keywords, you will have to create a lot of different landing pages. It's a lot of work, but it pays for itself in a higher conversion rate, higher revenue, and more return on your advertising investment.

    2. Clickpath: Sometimes the landing page can be the conversion page. If you offer a downloadable ebook or a small specialized item, your landing page can also be the page where users place an order. But often you'll need to give your users additional information to guide them to a conversion – product options, specifications, accessories, etc. Guide them through the sales funnel, from general overview to detailed information to actual conversion. An important aspect of the clickpath is that you shouldn't make it too easy for users to diverge from it.
    Take away your regular site navigation if you can, and try to keep the visitors of your landing page in a clickflow that guides them to a conversion. If users sidestep your clickpath and instead go to your site's homepage or another page, chances are you'll lose them there.

    3. Calls to action: Once you get a user to click on an ad and arrive on your landing page, don't leave them hanging. You need to take them by the hand, as it were, and show them where to go and what to do. Use action words like 'learn more', 'click here', and 'order now' in your content and in your links to additional pages.

    4. Easy conversion: It should be as easy as possible for a user to place an order. Don't ask them for information they don't really want to give up. Keep your forms short and simple and only ask for the very basic information you need to complete the order.

    5. Persuade: Selling is the art of persuasion. Employ tried-and-proven persuasion methods such as testimonials, special offers, guarantees, authoritative sources, instilling confidence, and more. Include them on the landing page itself and on every subsequent page you send your users to.

    6. Fast loading: Your landing page should load very quickly. If a user has to wait before your landing page is displayed properly, the urge to click that back button will grow. Make your landing pages lean and efficient to optimize loading times.

    7. Measure: It's not as simple as putting your landing page out there and waiting for the money to come pouring in. It's imperative that you know what users are doing on your landing page. Do they stay and read your content, or do they leave? What links do they click on? Do they convert into customers right away, or do they bookmark the page and come back later? Do they follow the whole clickpath, or do they leave prematurely – and if so, where do they tend to leave your site the most? All these things and more can be measured and analyzed with a good web analytics package.
    A good place to start is Google Analytics, a free service that contains all the web analytics functionality you'll need. Use the data you gather to make informed decisions about what to improve on your landing page (and your website as a whole).

    8. Experiment: Creating good landing pages is never an exact science. All aspects of a landing page, from the headline to the color of the buttons, can have an impact on the conversion rate. Experiment freely, but do it in a controlled manner. Tools like Google's free Website Optimizer allow you to perform extensive tests with all kinds of different aspects of your landing page to optimize your conversions. Don't test too many things at once – experiment with one or two changes at a time, no more. Allow your test to run for enough time before you make up your mind. And once you find a landing page setup that works well, use that as the basis for a new round of further tests. Never stop testing and improving.

    9. Conform to the guidelines: Last but certainly not least, be sure to read the editorial policies and guidelines of the search engine you advertise on. Google, Yahoo and Live all have strict policies about advertising on their search results pages. There are guidelines your ads and your landing pages need to conform to, or you risk paying more for each click or, worse, not getting your ads shown at all.

    Conclusion

    Creating good landing pages for your Pay Per Click campaign is not an easy and straightforward task. To do it well you'll need to invest a lot of time and effort in building and perfecting your landing pages. But it's never a wasted effort. Again and again the results show that good landing pages turn many more visitors into paying customers, and help earn back the money you invest in search advertising several times over.
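    On the "allow your test to run for enough time" point: before declaring a landing-page variant the winner, it's worth checking whether the difference in conversion rate is statistically meaningful. A minimal sketch using a two-proportion z-test (the visit and order counts below are hypothetical, not from this article):

```python
# Sketch: is landing page B really converting better than A?
# Two-proportion z-test using only the standard library.
# The traffic figures below are hypothetical examples.
import math


def z_score(conv_a, views_a, conv_b, views_b):
    """z-score for the difference between two conversion rates."""
    p_a, p_b = conv_a / views_a, conv_b / views_b
    pooled = (conv_a + conv_b) / (views_a + views_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    return (p_b - p_a) / se


# Variant B: 70 orders from 1000 visits, vs A's 50 orders from 1000.
z = z_score(50, 1000, 70, 1000)
print(round(z, 2))  # z ≈ 1.88: below 1.96, so not yet significant
# at the 95% level - keep the test running before picking a winner.
```

    This is exactly the kind of discipline that stops you from acting on noise when you change only one or two page elements at a time.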
