How To Find and Fix Common Technical SEO Issues
Time for another round-up. In January I did a webinar for Webpromo.expert in which I discussed how to find and fix common technical SEO issues. This is a hot topic at the moment as there seems to be a renewed interest in the technical elements of successful SEO.
The webinar was recorded and the video and slides are embedded below:
State of Digital: How a Hacked Website Led To a Wrongful Google Penalty
In my latest post for State of Digital, I show an example of how Google wrongfully de-indexed an entire website after it had suffered a brief security breach. Google acted quickly but without due diligence, heavily penalising a website for serving malware code that was live on the site for less than four hours.
Theoretically, thousands of hacked websites could have been penalised in this way, leaving site owners in the dark about what actually happened. If Google knows it’s hard to distinguish between hacked sites and deliberate cloaking attempts – as John Mueller admits – then why do they still hand out these penalties? Wouldn’t it be far preferable to err on the side of caution and send a Security Issue warning message instead?
Swipe Summit 2016: Why I’m Fed Up With Sh!t Websites
At the start of February I spoke at the 2nd annual Swipe Summit in Dublin, where I delivered a rather ranty talk about why I’m fed up with crappy websites. The slides are embedded below, but they come with a warning: some of the content is potentially offensive.
My talk apparently went down a storm, and featured heavily in conference roundup articles like this one from Emarkable.
Lastly, I contributed to another expert opinion roundup post, this one about Penguin factors. My answer is included below, and you can read the other 74 expert opinions in the article linked above.
“In my experience, sites that are Penguinated suffer from the following issues:
- Over-optimised anchor text targeting one or more keywords;
- A relatively large number of questionable links that aren’t ‘hard’ spam but are nonetheless low quality;
- Lack of quality signals such as well-written content, high quality links and citations, and a strong overall brand presence on the web.
So when websites suffer from all three of these factors, I consider them at an elevated risk of getting caught in a Penguin filter. Having said that, a Penguin refresh is low on Google’s list of priorities at the moment, and I don’t expect the highly anticipated refresh to make much of a difference to most websites.”