- Technical SEO is absolutely necessary
- Newsletter: SEO for Google News
I’ve been neglecting this blog for a few months, which has been entirely unintentional. With the COVID-19 pandemic, our lives have all drastically changed and priorities have shifted. Blogging couldn’t have been further from my mind. But now things have settled into a new routine, and my creative itches have once again emerged.

The problem was, I didn’t know what I should blog about. So recently I put out a feeler on Twitter to ask my followers what I should write about, and one reply in particular stood out. I thought that was an excellent suggestion and it really got me thinking.

I’ve been crazy busy with work these past few months (I’m counting my blessings, I realise not everyone has been as lucky), and most of those projects have been for publishers. SEO for news is an area I’ve been fortunate enough to build a speciality in; first through a full-time job working for the Belfast Telegraph back in 2010, and then since I started my own consultancy business through a series of SEO consulting projects with major publishing organisations like News UK, FOX, Future Publishing, Investing.com, and many more. While I work with all sorts of companies, the publishing industry is the one I enjoy working with the most. So, as a speciality in SEO, news publishing is something I want to double down on.

SEO for news publishers is an area I’m deeply excited about. While it shares the same foundational SEO best practices with every other niche, there are some areas of SEO that are unique to publishers. The news-specific elements of Google’s ecosystem are quite different from their regular search results, and this requires a different approach to optimisation.

The SEO blogosphere is cluttered, with hundreds of websites regularly writing about all areas of SEO. Except when it comes to Google News. This is a speciality in SEO that is massively under-serviced in terms of content.
There are a handful of good pieces out there, but these tend to focus on the very basics only – for example, on how to get a website into Google News – and there’s very little (if any) content out there that discusses some of the more advanced concepts and challenges that news publishers face when it comes to maximising their search traffic.

Which is odd, because SEO is so incredibly valuable for publishers. For most news websites, Google is by far the largest driver of traffic – primarily through Top Stories carousels, but increasingly also via Google Discover. This lack of useful and up-to-date information about SEO for publishers is hindering the media landscape. Publishers are struggling to attract readers and are missing opportunities to claim visibility in Google. This is something I want to help change.

Over the years I’ve learned a lot about how publishers can use SEO to boost their traffic. I’ve worked with development and product teams to improve websites for better crawling & indexing of news content, I’ve trained journalists and editors to optimise their content for improved visibility, and I’ve consulted on a range of projects aimed at integrating SEO practices into newsroom workflows. And I have learned new things from each and every project, and continue to learn every day.

It’s time I started sharing this knowledge. But rather than just blog an occasional article and hope it reaches the people who might be able to use it, I’ve decided to take a different approach. The firehose of the SEO blogosphere is not conducive to publishing meaningful content on such a relatively narrow niche. So I’m going to be doing this via a newsletter. Inspired by great email newsletters from people like Aleyda Solis and Louis Grenier, I want to try and build a community around an email newsletter specifically about Google News.
I’ve chosen the Substack platform for this, as it has all the bells & whistles I could possibly need – and it takes very little effort for me to set up and manage, so I can focus on the actual content.

The first few topics I have in mind are around things like technical optimisation for articles, best practices for syndicated content, and integrating SEO into newsroom workflows. As time goes on I’ll try to cover as many different areas as possible, all around the overarching topic of maximising traffic to news publishers from various search sources.

If this sounds interesting to you and you want to read this content, you can sign up using the form below or directly on www.seoforgooglenews.com. The first issue was sent out on 18 November, and I hope many more will follow. If you sign up, I promise I will only use your email address to send you this newsletter – it will never be shared with anyone else in any capacity.

Let me know what you think of this newsletter experiment in the comments below. Is it something you’ll sign up for? If not, why? Any and all feedback is welcome.
- Perfecting XML Sitemaps
[This article was originally published in August 2019 on Search News Central]

An XML sitemap is a file, in XML format, that lists URLs on your site that you want search engines like Google to crawl and index. XML sitemaps have been a staple of good SEO practice for many years now. We know that sites should have valid XML sitemaps to help search engines crawl and index the right pages. Yet despite the ubiquity of XML sitemaps, their exact purpose isn’t always fully understood. And there’s still a lot of confusion about the ‘perfect’ setup for an XML sitemap for optimal crawling and indexing. In this post I’ll share my own best practices I’ve learned over the years for fully optimised XML sitemaps, focusing on standard sitemaps for webpages.

The Basics

I’m not going to explain the basics of XML sitemaps too much, as those have been covered many times over on many other blogs. I’ll just recap the essentials here:

- XML sitemaps should adhere to the official protocol, otherwise Google will not see it as a valid file and will ignore the sitemap.
- XML sitemaps should only contain the canonical URLs on your website that you want search engines to crawl and index.
- You can submit your XML sitemap to Google and Bing directly through Google Search Console and Bing Webmaster Tools, as well as reference it in your site’s robots.txt file.
- Search Console and Webmaster Tools will report on the URLs included in your XML sitemaps, whether they are indexed and if there are any errors or warnings associated with them.
- There are separate XML sitemap types for webpages, images, videos, and news articles. In this article we’ll focus only on XML sitemaps for standard webpages.

Sitemap Attributes

XML sitemaps support multiple attributes for a listed URL. The three main attributes for every listed URL are the last modified date (<lastmod>), the priority from 0.0 to 1.0 (<priority>), and how often the content on the URL is expected to change (<changefreq>).
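For illustration, a single-URL sitemap with all three of these attributes populated looks like this (the example.com URL and values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/some-page/</loc>
    <lastmod>2019-08-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```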
Many XML sitemaps will have all three of these attributes defined for every URL listed in the sitemap. However, most search engines – Google included – only pay attention to one of those attributes: the <lastmod> date. When a URL has a <lastmod> date that is more recent than the last time the URL was crawled by the search engine, it’s a strong indicator that the URL should be re-crawled to see what has changed. As such, I always recommend making sure the <lastmod> attribute is accurate and updated automatically when a page on the site is changed in a meaningful way. Most XML sitemap generators, like the Yoast SEO plugin for WordPress, will ensure the <lastmod> attribute is automatically updated in the XML sitemap when a page is changed in the site’s backend.

The other two attributes, <priority> and <changefreq>, are seen as too ‘noisy’ to be used as proper signals. Often these are set incorrectly or manipulated to try and trick search engines into crawling pages more frequently than necessary, so they tend to be ignored by most crawlers. I tend to recommend leaving out these attributes entirely. It makes the XML sitemap’s file size smaller, and results in less clutter which makes sitemaps easier to troubleshoot.

Sitemap Size

In Google’s support documentation on XML sitemaps, they say a sitemap file can’t contain more than 50,000 URLs and must be no larger than 50 MB uncompressed. If your site has more than 50,000 URLs, you can break them up into separate sitemaps and submit a so-called sitemap index – an XML sitemap that lists only other XML sitemaps. And you can submit 500 sitemap index files, each listing a maximum of 50,000 individual sitemaps. Which means the total amount of URLs you can submit to Google via XML sitemaps is: 500 x 50,000 x 50,000 = 1,250,000,000,000 (one trillion two hundred fifty billion). That’s more than enough for even the most excessive websites.

However, in my experience it’s not ideal to fill XML sitemaps up to their maximum capacity.
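To tie these recommendations together, here’s a rough Python sketch (not from the original article; all URLs, file names, and dates are made-up placeholders) that writes lean sitemaps – only <loc> and <lastmod> per URL – in capped batches, plus a sitemap index referencing them:

```python
# Sketch: build lastmod-only sitemap files in capped batches, plus a
# sitemap index, staying well under the 50,000-URL / 50 MB limits.
# The example.com URLs, file names, and dates are all placeholders.
from xml.sax.saxutils import escape

XMLNS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """entries: iterable of (url, lastmod) pairs, lastmod in YYYY-MM-DD format."""
    out = ['<?xml version="1.0" encoding="UTF-8"?>', f'<urlset xmlns="{XMLNS}">']
    for url, lastmod in entries:
        # Only <loc> and <lastmod> - no <priority> or <changefreq>
        out.append(f'  <url><loc>{escape(url)}</loc><lastmod>{lastmod}</lastmod></url>')
    out.append('</urlset>')
    return '\n'.join(out)

def build_index(sitemap_urls):
    """Build a sitemap index that lists only other sitemap files."""
    out = ['<?xml version="1.0" encoding="UTF-8"?>', f'<sitemapindex xmlns="{XMLNS}">']
    for url in sitemap_urls:
        out.append(f'  <sitemap><loc>{escape(url)}</loc></sitemap>')
    out.append('</sitemapindex>')
    return '\n'.join(out)

def chunked(entries, size):
    """Split a list of entries into batches of at most `size` items."""
    return [entries[i:i + size] for i in range(0, len(entries), size)]

entries = [(f'https://www.example.com/page-{i}/', '2019-08-01') for i in range(25_000)]
batches = chunked(entries, 10_000)  # 3 files: 10,000 + 10,000 + 5,000 URLs
sitemaps = [build_sitemap(b) for b in batches]
index = build_index(f'https://www.example.com/sitemap-{n}.xml'
                    for n in range(1, len(batches) + 1))
```

The batch size here is an arbitrary parameter; the point is simply that each generated file stays small, lists only canonical URLs with an accurate last-modified date, and is discoverable through a single index file you can submit to Search Console.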
For larger websites with hundreds of thousands or millions of pages, ensuring Google crawls and indexes all URLs submitted in XML sitemaps is quite challenging. Cramming every XML sitemap full with 50,000 URLs often leads to incomplete crawling and indexing, with only a small fraction of the submitted URLs included in Google’s index. I have found that limiting sitemaps to only 10,000 URLs leads to more thorough levels of indexing. I’m not sure why – I suspect that smaller lists of URLs are easier for Google to process and crawl – but it’s been proven time and again that smaller sitemaps lead to higher degrees of indexing.

As a result, I always urge large websites to use smaller XML sitemaps – but not too small! Some huge websites limit XML sitemaps to 1000 URLs, which means you end up with thousands of individual sitemap files. This too brings complications, as Google Search Console will only list 1000 sitemap files in its Sitemaps reports. If you have more than 1000 individual XML sitemap files, you will not be able to get a complete gauge of their performance in terms of indexing from Google Search Console.

A happy medium is to limit XML sitemap files to 10,000 URLs each. I’ve found that this is a good compromise on size, in that it ensures a higher degree of crawling and indexing than a 50,000 URL sitemap, but at the same time doesn’t create reporting limitations in Google Search Console. A maximum of 10,000 URLs per XML sitemap seems to be a good middle road between indexing and reporting. This was first explored by Nick Eubanks, and I’ve seen similar good results from this 10k limit on XML sitemaps.

Sitemaps by Content Type

When analysing indexing problems on websites, XML sitemaps can be very useful. However, if all URLs on a website are simply heaped together in XML sitemaps regardless of the purpose of each URL, then troubleshooting SEO issues becomes more challenging.
A great way to make XML sitemaps more useful and helpful is to separate them out by content type, so that there are different XML sitemap files for different types of pages. For example, on an ecommerce site you should have different XML sitemap files for your static content pages (about us, terms & conditions, etc), your category and subcategory pages (hub pages), and your product pages. Or, alternatively, you can also create separate sitemap files for each category of products, so that you can quickly see which product categories are well-indexed and which ones aren’t. Combining the two approaches also works, where you have separate XML sitemaps for each category’s hub pages and product pages.

For news publishers, I recommend separate XML sitemap files for news sections, and to list your articles in different XML sitemaps. This is because we want to make sure Google has indexed every section page on the site (as these are important for new article discovery), whereas achieving 100% indexing for all individual articles on a news site is extremely difficult. Keeping articles in separate XML sitemaps from section pages means you can troubleshoot potential issues more effectively and get better data on the index performance of both types of pages. Additionally, news publishers should have a news-specific XML sitemap that only lists the articles published in the last 48 hours. This aids Google with discovering your newly published and recently updated articles.

Discovery vs PageRank Flow

One common misconception about XML sitemaps is that they can replace a regular crawl of the website. Some people think that by having a good XML sitemap, the website itself doesn’t need to be fully crawlable. After all, they reason, the URLs we want Google to crawl and index are listed in the XML sitemap, so the website doesn’t need to have crawlable links to these URLs. This is entirely wrong. The primary mechanism through which search engines discover content is still crawling.
Your website needs to have a good internal link structure that enables crawlers (and your website’s visitors) to find all your important pages with as few clicks as possible. And, more importantly, links enable the flow of PageRank (link value) through your site. Without PageRank, your website’s pages aren’t going to rank in search results.

XML sitemaps in no way replace internal links. XML sitemaps don’t distribute any link value, and they don’t guarantee indexing and ranking of your website’s pages. Sitemaps are a supplementary signal for Google and support a website’s internal linking and canonicalisation – they are not intended to replace a proper crawlable website. You should always make sure your website is fully crawlable, and that all URLs listed in your XML sitemap can also be discovered by simply clicking on links on your site. If a URL is listed in a sitemap but doesn’t have any links pointing to it, Google is very unlikely to crawl the URL and even less likely to rank it in its search results.

In a Nutshell

Well-crafted XML sitemaps can help your website’s crawling and indexing by search engines, but for me the main purpose of sitemaps is to help troubleshoot SEO issues on your site. The data reported in Google Search Console on XML sitemaps is the real reason you want to have good sitemap files. Keep your sitemaps relatively small and focused with no unused attributes and no more than 10,000 URLs. Separate them out for different content types, and always make sure that URLs listed in your sitemaps are also fully discoverable through a web crawl.

Good luck and if you have any comments or questions about XML sitemaps, use the comments below and I’ll try to respond as best I can.
- Online Technical SEO Training Course
I’ve been delivering my technical SEO training course in person for several years now. It’s been a very rewarding experience, with full classrooms and great feedback from the participants. Delivering these training courses in person has always felt like a competitive advantage. The interactive element of my training is part of the appeal. I encourage my students to ask any questions they want, either during the training, in the breaks, or after the session. I always try to set the ground rule that there is no such thing as a stupid question, and I want every participant to feel empowered to ask whatever they want to make sure they get maximum value from the training.

We’ve toyed with delivering this training in an online format for a long time. Now that most of us are stuck at home, it feels like the right time to take the plunge and see if we can do this. So my technical SEO training is going online!

I want to preserve the interactive element of my training as much as possible, which means it’ll be delivered live. No pre-recorded videos or anything like that, it’ll be exactly like a classroom except it’ll be done via Zoom. And instead of one long day, we’ll spread the training out over two half-day sessions. The first online training will be delivered on 27 & 28 August 2020 in two morning sessions (UK/Ireland time). The training content will be the same as my classroom training, though I might tweak it a bit to facilitate the online format – I expect I’ll be able to cover a bit more ground online, so I am likely to put some more content into the training.

Because it’ll be delivered live, we want to keep the number of participants limited to encourage interaction and make sure everyone gets maximum value from the sessions. So if you’re interested in the training, make sure you book your spot soon!
- Introduction to PageRank for SEO
- Google Guidance for News Coverage
In December of last year Google made some drastic changes to Google News, specifically to how it selects websites that feature in the news vertical and related areas of Google’s search ecosystem such as the Top Stories carousel and the Discover feed. Previously, news publishers had to apply to be included in Google News and there was a manual verification process. In the current Google News, sites and articles are automatically selected and publishers do not need to apply to be included, as per the official support documentation.

This seismic shift in Google’s approach to news publishers was hidden among the support documentation surrounding the new Publisher Center, which replaced the old partner dashboard that Google News-approved publishers had access to. In this new Publisher Center, publishers can control certain aspects of their sites’ visibility in Google’s news-related elements, such as their branding and topical focus areas. Additionally, getting through the new Publisher Center’s approval process means a site can be included in the Newsstand app on Android mobile devices, and opens up additional monetisation opportunities.

I’ve gotten a lot of questions from publishers asking whether they need to go through the verification process in the new Publisher Center to be included in Google News. The answer is no, you don’t need to be verified in the Publisher Center to show up in Google News or other news-focused areas of Google. However, I do recommend going through the process to ensure your site is properly categorised and branded in Google’s news ecosystem.

Since that initial launch of the new Publisher Center and the abandonment of the manual approval process for Google News, their support documentation for publishers has steadily expanded to provide more details on the new approach to news content in Google.
In addition to clearer guidance on how sites can now be considered for Google News, the official webmaster blog has also published information about best practices for publishers regarding news coverage of current events. In this new guide, Google is of course emphasising their AMP standard (my thoughts on which can be read here; nonetheless I do recommend publishers implement AMP lest they cut off their nose to spite their face).

Google highlights the importance of adding article structured data to your AMP articles, especially the article’s publication date:

“We also recommend that you provide a publication date so that Google can expose this information in Search results, if this information is considered to be useful to the user.”

When you’re live-streaming a news event, Google wants you to use the BroadcastEvent structured data markup and submit your relevant content through their Indexing API.

“If you are live-streaming a video during an event, you can be eligible for a LIVE badge by marking your video with BroadcastEvent. We strongly recommend that you use the Indexing API to ensure that your live-streaming video content gets crawled and indexed in a timely way. The Indexing API allows any site owner to directly notify Google when certain types of pages are added or removed.”

This public acknowledgement of the Indexing API leads me to believe Google will be putting more focus on that technology. I wouldn’t be surprised if later in 2020 Google will allow news publishers (and perhaps all publishers) to start tapping into the API to get their content quickly into Google’s index. While potentially subject to abuse, a public indexing API makes perfect sense for a search engine that operates at the scale Google does; it basically moves the effort of discovering new content from Google’s crawlers to publishers’ technology stacks. So, essentially, it’ll save Google money.
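To make the BroadcastEvent guidance a bit more concrete, here’s a sketch of roughly what the JSON-LD for a live-streamed video might look like – a VideoObject whose publication is a BroadcastEvent with isLiveBroadcast set to true. All names, URLs, and dates below are made up, and the exact set of required properties should be checked against Google’s structured data documentation:

```python
# Sketch: assemble JSON-LD for a live-streamed video, marking it with
# BroadcastEvent so it may qualify for the LIVE badge. Every value here
# is a placeholder - this is an illustration, not a complete spec.
import json

markup = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "Example live stream",
    "description": "Live coverage of an example news event.",
    "thumbnailUrl": "https://news.example.com/thumb.jpg",
    "uploadDate": "2020-01-10T11:45:00+00:00",
    # The nested BroadcastEvent is what signals the live stream
    "publication": {
        "@type": "BroadcastEvent",
        "isLiveBroadcast": True,
        "startDate": "2020-01-10T12:00:00+00:00",
    },
}

# This string would go inside a <script type="application/ld+json"> tag
json_ld = json.dumps(markup, indent=2)
```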
Lastly, in this webmaster blog post Google advises publishers to ensure their AMP articles are also updated in Google’s AMP cache whenever changes are made to the articles. This is obviously something Google struggles with, as once an article is cached in the AMP cache it’s not always updated when the publisher’s version changes. Hence Google needs publishers to tell it when an AMP article has changed. This latest webmaster blog from Google is quite technical and focused on a narrow niche (news publishers). It shows Google wants publishers to become more technically adept at maximising their content for News visibility. It’s an area of news SEO that I also focus heavily on and hope to share more of my insights and experience with in the coming months at relevant events and through this blog. I feel Google is close to perfecting its topical evaluations of news publishers when it comes to which sites to trust and for which topics. Yet the technical realities of news SEO are still somewhat lagging behind Google’s envisaged ideal scenario. Publishers will need to ensure their websites are constantly improved and stay abreast of the demands Google places on their technologies. No matter how good your news content is, it’ll only be surfaced in Google search if the search engine can properly process it. This is not something you just want to take for granted. Google’s technology keeps changing and progressing, which means your news site needs to do the same.
- How SEO for News can help all websites
- Are Boris Johnson’s PR People Manipulating Google Search?
Anyone remember the ‘Boris bus’? The pledge plastered across a red London bus to give £350 million to the NHS after the UK leaves the European Union? Here’s a reminder.

For a long time, when you searched for ‘boris bus’ in Google you’d see many references to this Brexit campaign promise. So many references, in fact, that it became a bit of an embarrassment for Boris Johnson, as so far it has seemed to be a rather empty promise. Hence why, in a June 2019 interview, Boris Johnson’s admission that he likes to ‘paint buses’ as a hobby raised some suspicion – primarily because it seemed to be a carefully crafted proclamation designed to game Google’s news algorithms.

First highlighted by the folks at Parallax in Leeds, this tactic did seem to have the intended effect initially when the ‘boris bus’ search result changed to show the interview’s statement rather than the big red Brexit campaign bus. Then there was a bit of a backlash as some people caught on to the perceived deception, and news outlets like the Daily Mail wrote about it and these stories started to dominate Google’s results. Ironically, doing the same search today yields results about the bus’s manufacturer going into administration. So that initial attempt to game Google’s search results seems to have misfired a bit.

Yet, this doesn’t seem to have discouraged the people behind Johnson’s PR spin machine. This week, it seems, the PR folks responsible for scripting Johnson’s public statements are giving it another attempt. Take these two search results for ‘boris model’, screenshotted a few hours apart by TheAndyMaturin.

Once again this seems carefully crafted to shift public attention away from an embarrassing story for Boris Johnson, using language designed to make it into article headlines that then replace existing headlines covering a different story altogether.
Keywords in Headlines

This is not particularly difficult to do in Google, especially in Google News which supplies content to the Top Stories boxes you see in regular Google search results. The news-specific part of Google’s algorithms is focused on speed, i.e. surfacing recent articles, and therefore loses some of its accuracy in terms of topical targeting in favour of simple keyword matching. By having a relevant keyword in an article headline on an official Google News-approved publisher’s website, Google is likely to show that article in its news boxes – especially when the only alternatives are articles older than 48 hours, which is the primary window of opportunity for articles to show up in Google News.

Google Steers The Public Debate

It seems Johnson’s PR people have a keen sense of Google’s importance in steering the public debate, as it is among the primary sources of news for the general populace. Moreover, these PR people know how to play the game to their advantage, and have journalists at the UK’s major outlets dancing like puppets by serving up the right words to put into their headlines.

Wherever you stand on the morality of this tactic, it is effective. While those of us working in digital industries tend to be able to spot these efforts rather easily, most of the public won’t notice these shenanigans and will simply consume the headlines they’re shown. Basically, it’s an effective means of burying embarrassing stories in favour of more innocuous articles. Smart use of language gets certain key terms into headlines for Google to then show in their search results.

You could possibly write off the first ‘boris bus’ attempt as a coincidence, but this latest instance seems to show a pattern of deliberate manipulation. Especially considering searches for the actual person involved in the scandal are diminishing, leaving an opportunity to claim Google search real-estate for less focused searches.
All is fair in love and war, and UK politics is certainly in a state of war right now. Update: Folks have pointed out to me that this may in fact be the third such instance, as this one is somewhat suspicious too.
- My Digitalzone’18 talk about SEO for Google News
Last year I was fortunate enough to deliver a talk at the Digitalzone conference in Istanbul. Among a great lineup of speakers on SEO, social media, and online advertising, the organisers asked me to speak about my specialist topic: SEO for Google News. In my talk I outlined what’s required for websites to be considered for inclusion in the curated Google News index, and how news websites can optimise their visibility in Google News and especially the associated Top Stories box in regular search results. You can view the recording of my entire talk online here.

Since I delivered that talk in November 2018, there have been numerous changes to Google News – specifically to how Google handles original content and determines trust and authority. SEO for news publishers remains a fast-moving field where publishers need to pay constant attention to the rapidly evolving technical and editorial demands Google places on news sites. If you’re a publisher in need of help with your SEO, give me a shout.
- How to do a Technical SEO Audit
Since late 2017 Andrew Cock-Starkey, better known as Optimisey, has been organising regular meetups in his native Cambridge where he gets SEOs from all over the world to come and give a talk. While the meetups aren’t huge, usually having a few dozen attendees, Andrew records the talks and puts them online for anyone to watch for free. It’s a great way to share knowledge around the SEO industry, so when Andrew asked if I wanted to come over and do a talk I couldn’t say no. Sharing my experience and expertise with the industry is important to me, as that’s how I learned much about SEO myself.

Hence, earlier this year I made the trip to Cambridge and did a talk about my approach to technical SEO site audits. The video of that talk is free to watch, and I hope people find it useful and worthwhile. There’s also a full transcript available on the Optimisey website if you prefer to read text rather than watch a video.

Make sure you check out some of the other Optimisey meetup videos, which include awesome talks from people like JP Sherman, Marie Haynes, Jennifer Hoffman, Chris Green, Stacey MacNaught, Kevin Indig, and many others.
- SEO Strategies for Growth: One-Day SEO Training in Belfast on 18 September
I’ve been delivering specialised technical SEO trainings for a few years now, as well as countless bespoke SEO trainings for agencies and in-house teams. Now I’ve teamed up with Growth Marketing Live to deliver a special one-day SEO training as part of their conference, where I’ll teach SEO best practices that deliver lasting growth.

This SEO Strategies for Growth training day is intended for marketers who want to learn how to apply SEO to enhance their business growth through organic search traffic. It’ll be a full day of training that is accessible and actionable. The goal is to empower the participants to apply what they’ve learned to their own sites straight away to help grow their traffic from Google search. All the relevant areas of SEO will be covered, from basic on-page SEO to linkbuilding and technical optimisation. These are the topics we’ll cover in the training:

- SEO in the wider Digital Marketing mix: where does SEO fit in compared to other channels such as paid search and social media.
- On-Page Optimisation: how to optimise your webpages for maximum visibility.
- Linkbuilding and Content Marketing: becoming a trusted source of information that Google can confidently rank high in its search results.
- Technical SEO Basics: ensuring your website can be properly crawled and indexed by Google.
- Structured Data Markup: how to enhance your content with schema.org markup to get rich search snippets in Google.
- Load Speed & Mobile SEO: optimising your website experience for mobile users.
- Crawl Optimisation: ensuring large scale websites can be efficiently crawled.
- International SEO: how to make sure Google ranks your international content correctly across the globe.

The early bird price for this SEO training is £349, which also includes a ticket for the conference on the following day. Places for this special one-off training day are limited, so book your spot now on the Growth Marketing Live website.

P.S. there are still a few seats left for my upcoming Technical SEO Course in Dublin!
- Preventing Saturation and Preserving Sanity
Over the past few years I’ve spoken at a lot of conferences. I’m not quite as prolific as, for example, the amazing Aleyda Solis, but there have been significant periods where I spoke at an event at least once every month.

I enjoy speaking at conferences. A large part of my enjoyment comes from sharing my knowledge and meeting with people in the industry. I get to hang out with old friends and make new ones, and the privilege of going up on stage to have hundreds of people listen to me is one I never take for granted. Thanks to conferences I’ve been able to travel to amazing places and meet up with awesome people. The past few years I’ve travelled to cities like New York, Las Vegas, Paris, Istanbul, Milan, Bonn, Amsterdam, and numerous places in the UK and Ireland – all thanks to events I was invited to speak at.

But I also dislike going to conferences. The travel is never fun (I’m a grumpy traveller at the best of times), I rarely get a good sleep in hotel beds, and my nutrition takes the usual hit. I also feel a lot of pressure to deliver a good talk, one that entertains and informs and is hopefully worthwhile and unique.

And then there’s the socialising bit. At heart, I’m an introvert pretending to be an extrovert. I’m not great at socialising but I make an effort, because I do enjoy hanging out with people I like – and fortunately the SEO industry has plenty of fun people to hang out with. I’ve made several great friends in the industry over the years, thanks to conferences and the surrounding social activities. But there’s only so much I can handle. My reservoir of social interaction is limited, and conferences drain that reservoir very quickly.

I’ve been very lucky that my wife and business partner Alison joins me at many events, and helps make socialising so much easier for me. Contrary to me, she actually likes people in general and enjoys chatting to new folks.
She’s been an incredible support for me over the years as our business has grown and my conference speaking gigs became more numerous and more international. All in all, despite the fun bits and all the support I’ve received, it’s been taking a toll on me. The travel, the lack of sleep, the pressures of delivering, the socialising, and of course the time away from actual paid work – speaking at conferences comes at a price, and it’s one I’m increasingly reluctant to pay.

I’ve already agreed to a number of events for the remainder of 2019, and I’m genuinely looking forward to each and every one of these:

- Optimisey Cambridge SEO Meetup
- SMX Munich
- BrightonSEO
- eComm Live
- The Tomorrow Lab Presents
- Digital Elite Day
- Digital DNA
- SearchLeeds
- Nottingham Digital Summit
- State of Digital
- Chiang Mai SEO

Some are events I’ve never spoken at but have wanted to, and others are recurring events that I always enjoy being a small part of. So I’m committing to these events and will work damn hard to deliver great talks at every single one. After that, I’m pulling on the brakes.

For a long time I felt that speaking at conferences was a way to prove myself, to show that I knew my stuff and wasn’t half-bad at this SEO malarkey. The bigger the stage, the more I felt affirmed in my knowledge and experience. That aspect of it has lost its lustre for me. I don’t feel I’ve anything left to prove. I’ve become increasingly confident in my own abilities as an SEO, and feel I’ve gotten a good handle on my imposter syndrome.

Also, I sometimes feel that by speaking at a conference I’m taking up a spot that could’ve gone to someone else, someone who is still building their reputation or who has more worthwhile content to share. And, let’s be honest, there’s enough white guys speaking at conferences. If I take a step back from the conference circuit, maybe that’ll allow someone else to take a step up. So from now on I’ll keep my speaking calendar a lot emptier.
I’m not retiring from the conference circuit entirely – I enjoy it too much – but I’ll be speaking much less often. I’ll be on stage at a small handful of events every year at most, and mainly outside of the UK (with one or two exceptions). This will hopefully free me up to focus on my paid client work, as well as my SEO training offering. And I’ll keep showing my face at events like BrightonSEO, as for me those feel more like regular SEO family gatherings.

It’s a selfish move of course, to prevent my name from saturating the conference circuit as much as preserve my sanity. I feel I’m at risk of losing appeal as a speaker, as there’ve been so many opportunities to see me speak. Maybe by enforcing some scarcity, I’ll stay attractive for conference organisers while also making sure I can deliver top notch talks at the few events I choose.

But foremost I want to prevent burning out. I’ve felt quite stretched the last while, always running from one place to the next while trying to meet deadline after deadline. It’s time I slow down the Barry-train and focus primarily on my client work. Conferences are great fun but they also consume a lot of time and energy. Those are resources that I need to treat with more respect.

I’ll hope to see many of you at the 2019 events still to come, and I’ll do my best to stay in contact with my industry friends. Conferences are a great way to keep in touch, but definitely not the only way. Some of our best industry friends have visited us in Northern Ireland, and I want to make time to do the same and visit our friends where they live. Those are the trips that don’t cost energy, but recharge the batteries. I need to do more of those.

So, in short, I’m not going away, but I’ll become less ubiquitous. It’s win-win for everyone. :)