Updated: 20 December 2017
One thing the SEO industry isn’t lacking is tools. For every SEO task there appears to be at least one tool that claims to be able to do it all for you. From site analysis to on-page optimisation, from outreach to content planning, you’ll never be short on tools to aid in your work.
But tools can be a crutch, an inadequate replacement for real skill and experience. SEO tools are only as good as the SEO practitioner using them.
There are hundreds of tools to choose from. Brian Dean at Backlinko has compiled a whole list of them – you’ll find dozens of tools there for every conceivable task.
But here at Polemic Digital, I only use a handful of tools; a few tried and trusted platforms that, for me, deliver all the value and automation that I require.
Google Search Console is such an obvious source of data that I won't mention it here. Instead I'll focus on my favourite third-party SEO tools, the ones I use (almost) every day:
1. Screaming Frog
The Screaming Frog SEO Spider is the must-have tool for every serious SEO. The tool is basically a crawler that tries to find every page and asset on your site and extracts relevant data about it.
The tool has an almost limitless number of applications, from basic site analysis to load speed testing, XML sitemap checking, Google Analytics tracking code verification, and more. There are very few site analysis issues Screaming Frog can't help you with.
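To make concrete what a crawler extracts from each page, here's a minimal sketch in Python using only the standard library. The class and field names are my own invention for illustration, not Screaming Frog's internals; it pulls the title, meta description, canonical tag, and outgoing links from a page's HTML, which is the kind of raw data an SEO spider reports per URL:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class SEOAuditParser(HTMLParser):
    """Collects the data points an SEO spider typically extracts from a page:
    title, meta description, canonical URL, and outgoing links."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.title = ""
        self.meta_description = None
        self.canonical = None
        self.links = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.meta_description = attrs.get("content")
        elif tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")
        elif tag == "a" and "href" in attrs:
            # Resolve relative links against the page URL, as a crawler would
            self.links.append(urljoin(self.base_url, attrs["href"]))

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# A made-up sample page to run the parser against
html = """<html><head><title>Example Page</title>
<meta name="description" content="A short demo page.">
<link rel="canonical" href="https://example.com/page">
</head><body><a href="/about">About</a></body></html>"""

parser = SEOAuditParser("https://example.com/page")
parser.feed(html)
```

A real spider repeats this for every discovered link, building a queue of URLs to visit; the extracted fields are what end up in the tool's raw data export.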
The only downside is that the tool runs on your own computer, which can take up a lot of time and CPU power for large websites (unless you manage to get it running in the cloud). Additionally, while Screaming Frog has some useful built-in reports, generally the tool presents you with just the raw data and you’ll have to do much of the analysis yourself. Which brings me to…
2. DeepCrawl

I started using DeepCrawl at some stage in 2014, and I haven't looked back since. It's basically Screaming Frog on steroids, a SaaS crawler that allows you to analyse sites on a grand scale.
The major advantages DeepCrawl has over Screaming Frog are that it’s easier to crawl larger sites with it (and you don’t need to leave your computer running for it), and that a lot of SEO analysis is done for you through the tool’s in-depth crawl reports.
DeepCrawl allows you to quickly find SEO issues on a site, with very little manual analysis required. That is the tool’s greatest strength, but also its weakness. Manual analysis is sometimes crucial to finding the root cause of an issue – which is why I love using DeepCrawl in combination with Screaming Frog. It’s a match made in heaven.
3. Majestic

The previous two tools are all about on-site SEO. When it comes to off-site optimisation – i.e. links – you need a tool of a different sort. My go-to software for link analysis is Majestic.
Aside from having the largest accessible database of links on the web, Majestic offers a host of other features which are exceptionally useful for link analysis. I especially like their Trust Flow and Topical Trust Flow metrics; mightily handy for outreach and spam detection.
The additional capabilities for bulk link checking as well as comparing link profiles of competing websites, make Majestic my number one link analysis tool.
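As a toy illustration of what comparing link profiles boils down to (the domains below are made up), a "link gap" check is essentially set arithmetic over referring domains:

```python
# Hypothetical referring-domain sets; a bulk link tool would export these
my_links = {"siteA.com", "siteB.com", "siteC.com"}
competitor_links = {"siteB.com", "siteC.com", "siteD.com", "siteE.com"}

# Domains linking to the competitor but not to you: outreach opportunities
link_gap = sorted(competitor_links - my_links)

# Domains linking to both sites
shared = sorted(my_links & competitor_links)
```

The value a tool like Majestic adds is the data itself, at a scale and freshness you could never gather by hand; the comparison logic on top is simple.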
4. Sistrix

Speaking of competitor comparison, there's one tool that stands apart from everyone else when it comes to competitive analysis: Sistrix. With this tool you can get a very good snapshot of a site's performance in search results and compare it with that of its rivals.
The data is quite good and will give you a reliable impression of a site’s footprint in search results, and any shifts associated with algorithm updates. You can also compare multiple websites, allowing you to see exactly where one is gaining at the expense of another.
Sistrix also has a host of other features which can help with all kinds of other aspects of SEO, including site audits, keyword research, and rank tracking. I’ll be honest and admit I don’t use those, as I prefer specialised tools for those aspects of SEO.
5. Little Warden
When Dom Hodgson launched this tool in the middle of 2017 I was keen to give it a try. Little Warden is a monitoring tool that checks a domain and homepage for a huge range of technical aspects, such as:
- Domain name & SSL expiration
- Title tag & meta description changes
- Robots.txt changes
- Canonical tags
- 404 errors
- and many, many more.
Little Warden sends you a notification every time something changes, so that you’ll never let a domain name expire or have a robots.txt disallow rule change pass unnoticed.
You can also configure which checks to enable or disable. Little Warden has already been a lifesaver several times, notifying me of potential problems such as expired SSL certificates, title tag changes, wrong redirects, and meta robots tag problems. A hugely useful tool if you manage a varied client roster.
6. NewsDashboard

Because I work with several large news publishers, I need specialised tools to analyse a website's visibility in Google News. This is where NewsDashboard comes in.
Where Sistrix keeps track of regular search results, NewsDashboard monitors Google News. There are many different ways in which Google shows news results, both in the dedicated Google News vertical and as part of news boxes in regular results on desktop and mobile. NewsDashboard monitors all of these, and provides visibility graphs showing how different news sites perform over time.
NewsDashboard can also be used to see which trending news topics a website is covering, and which topics it isn't showing up for. The latter is very useful data to give to a newsroom.
7. GTmetrix

Since site speed is such a crucial aspect of (technical) SEO, it's important to use the right tools to measure a page's load speed. Specifically, you want a tool that gives you the right recommendations for improving load times.
I've found the standard PageSpeed Insights tool from Google to be somewhat lacking, so I started using GTmetrix instead. It has all the advantages of PageSpeed Insights and provides additional speed checks, showing a clear overview of the load speed issues that affect a page.
There’s also a waterfall view like you get in WebpageTest.org, and you can create a video showing a visual recording of a page’s load time. Best of all, it’s free to use. GTmetrix has quickly become my go-to load speed testing tool.
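To illustrate how a waterfall view translates into numbers, here's a small sketch with made-up resource timings. Each entry records when a resource started loading and how long it took (in milliseconds), and from that you can derive the figures a speed report highlights: total page load time and the slowest resource:

```python
# Hypothetical waterfall data: (resource, start offset ms, duration ms)
resources = [
    ("/", 0, 180),             # HTML document
    ("/style.css", 190, 120),
    ("/app.js", 195, 450),
    ("/hero.jpg", 210, 900),   # large unoptimised image
]

def total_load_time(waterfall):
    """Page load finishes when the last resource finishes."""
    return max(start + duration for _, start, duration in waterfall)

def slowest_resource(waterfall):
    """The resource with the longest duration is the first optimisation target."""
    return max(waterfall, key=lambda r: r[2])[0]

page_load = total_load_time(resources)
worst = slowest_resource(resources)
```

In this made-up example the hero image dominates the load time, which is the kind of finding a waterfall chart makes obvious at a glance.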
8. SERPwoo

Keyword rank tracking is a controversial topic. Many SEO professionals claim that keeping track of where your site ranks for a given keyword is useless. Those SEOs are wrong.
Rank tracking is incredibly useful and, in my opinion, a crucial element of SEO reporting. I've written about that before so I won't go into it again. Suffice to say, I keep track of keyword rankings, and it helps me a lot in my SEO efforts.
My rank tracker of choice has varied over time, but since I came across SERPwoo (recommended by Aleyda) it has become my default. I love how the tool keeps track of all the top 20 results for a keyword, allowing me to spot ranking flux and possible algorithm shifts.
SERPwoo also provides global SERP fluctuation graphs which are very useful when trying to keep track of major Google algorithm updates.
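One simple way to quantify SERP flux of the sort these graphs visualise (my own toy formula, not SERPwoo's) is to sum the absolute position changes across the tracked URLs between two snapshots:

```python
def serp_flux(ranks_before, ranks_after, absent_penalty=21):
    """Sum of absolute position changes between two SERP snapshots.
    ranks_* map URL -> position (1-based). A URL that drops out of the
    tracked top 20 is treated as falling to position `absent_penalty`."""
    urls = set(ranks_before) | set(ranks_after)
    return sum(
        abs(ranks_before.get(u, absent_penalty) - ranks_after.get(u, absent_penalty))
        for u in urls
    )

# Hypothetical top-3 snapshots on two consecutive days
monday = {"example.com/a": 1, "example.com/b": 2, "example.com/c": 3}
tuesday = {"example.com/b": 1, "example.com/a": 2, "example.com/c": 3}

flux = serp_flux(monday, tuesday)  # two URLs swapped places: |1-2| + |2-1| = 2
```

A small flux value means a stable SERP; a sudden spike across many tracked keywords is a hint that an algorithm update may be rolling out.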
The eight tools listed above are the ones I use most often, but they don't represent the full extent of the arsenal at my disposal. There are plenty of other tools I rely on for bits and pieces, such as Kerboo, SEMrush, Screaming Frog Log File Analyser, and BuzzSumo.
There's one tool I haven't yet mentioned that I prize above all others: critical thinking. When you become overly reliant on tools, you lose the ability to analyse SEO issues properly, and you'll start missing things that tools might not necessarily be able to spot.
In SEO there are no shortcuts. No tool in the world is going to turn you into an SEO expert. Tools can certainly make some aspects of SEO much easier, but in the end you'll still have to do the hard work yourself.