The problem with SEO is that it is often controlled by marketers.
Marketers aren’t inherently bad people, but when you get a bad one (and there are many) any information you receive about SEO will be filled with buzzwords and soft outcomes.
From a development point of view, SEO is about how well a robot can read and understand your content. As we will see, a robot being able to read your content easily is normally a good thing for humans too.
The following sections explain several topics that are clearly within the developer’s remit; a good understanding of their impact on both humans and robots will help in any project you work on.
Site Speed
How fast your site loads and is perceived to have loaded is a highly technical challenge.
Assets need to be as small as possible for transmission while maintaining high quality.
You should care about how many network requests are being made per page load.
You need to care about perceived page load, so getting content onto the screen as quickly as possible.
The order in which things come down the network is important.
A global internet means not everyone is accessing your site on a broadband connection.
Mobile internet means you can’t guarantee the transmission of data will even complete if it takes too long.
Why Site Speed is good for SEO
Site speed has been listed as one of Google’s ranking factors. Naturally, the faster the site, the higher the potential score for this one part of their algorithm. According to Moz’s breakdown of website speed and ranking, the key factor is the time it takes for the first byte of information to come across the pipes (the time to first byte, or TTFB).
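If you want a rough feel for your own TTFB, here is a minimal sketch in Node; the URL is a placeholder for a page on your own site, and dedicated tools such as WebPageTest or your browser’s network panel will give more reliable numbers.

```typescript
import * as http from "http";

// A minimal sketch of measuring time to first byte from Node;
// the URL is a placeholder for a page on your own site.
const url = "http://example.com/";
const start = process.hrtime.bigint();

http.get(url, (res) => {
  res.once("data", () => {
    const ttfbMs = Number(process.hrtime.bigint() - start) / 1e6;
    console.log(`status ${res.statusCode}, first byte after ${ttfbMs.toFixed(1)} ms`);
    res.destroy(); // we only needed the first chunk
  });
});
```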
If a search engine’s crawlers can download the contents of your page quickly, they are going to do so more often than if each request takes seconds.
When people are researching an article they are writing, they are more likely to stick around and read a page that responded quickly. This means your content is being absorbed by more people and has a greater chance of being linked to.
Why we should care about Site Speed anyway
Even if you don’t care about SEO, you can’t argue that slower is better; there are several studies showing that faster page loads are better for everyone. Take this KissMetrics writeup for example.
Slow speeds can be an indicator that a query is taking too long or that a memory leak is happening somewhere. If so, your site may not be using your server’s resources efficiently, and you may be spending money on a package you don’t actually need.
Redirects
Redirects are the hoops your server jumps through when a browser asks for a page at a particular URL but the server knows the content lives at a different location. There are several things that need to be considered:
Over the lifetime of your website, potentially thousands of other sites will link to pages you have long since forgotten about.
You can do redirects at various levels, and each one comes with its own maintainability issues.
If done wrong, redirects can have a negative effect on your site.
Redirects can be broken for months before anyone notices.
Each redirect adds latency.
Why Redirects are good for SEO
Search engines like there to be one canonical place for everything, so if you have two paths that lead to the same content this is confusing for them.
If instead you automatically redirect anyone who requests https://www.mysite.com/my-page to https://mysite.com/my-page, then the search engine doesn’t have to worry about several places.
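In practice this kind of canonical-host redirect usually lives in your web server or CDN configuration, but as a sketch of the mechanics, here is one way it might look in a small Node server (the hostnames and port are placeholders):

```typescript
import * as http from "http";

// Sketch of a canonical-host redirect; "mysite.com" and port 8080
// stand in for your own domain and setup.
const server = http.createServer((req, res) => {
  const host = req.headers.host ?? "";
  if (host.startsWith("www.")) {
    // A 301 (permanent) redirect tells crawlers to consolidate
    // everything onto the bare domain.
    res.writeHead(301, { Location: `https://${host.slice(4)}${req.url ?? "/"}` });
    res.end();
    return;
  }
  res.writeHead(200, { "Content-Type": "text/html" });
  res.end("<h1>My page</h1>");
});

server.listen(8080);
```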
This comes into play heavily when content moves completely, perhaps between domains. Doing redirection well ensures that any past page authority is transferred to the content’s new home.
Why we should care about Redirects anyway
Nobody likes dead links, and they can easily appear when something major about the structure of your site changes (domain name, internal structure).
If a user goes to your site and gets a 404, they are not going to try subtle variations of the URL to get to the content; they will move on to the next site.
Even if the link isn’t dead, people don’t like jumping between five different URLs before getting to the content. A redirect chain done poorly means multiple network requests before anything useful arrives, which is inefficient.
Status Codes
Status Codes are the codes returned from your server after a request has been made. As a developer, you need to make sure you are returning the correct code at any given moment.
If you return a status code of 500 but still return meaningful content, will a search engine index it? Will other services?
Search engines care a lot about the 3xx redirection status codes.
If you have used a CMS to build your site it sometimes isn’t apparent what codes are being used where.
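One low-tech way to find out is to request your own URLs and look at what comes back. A minimal sketch, assuming your site is running locally on port 8080 and the paths are placeholders:

```typescript
import * as http from "http";

// Sketch of a quick status-code audit: request each URL and print
// what comes back. The URLs are placeholders for your own pages.
const urls = [
  "http://localhost:8080/",
  "http://localhost:8080/old-page",
  "http://localhost:8080/does-not-exist",
];

for (const url of urls) {
  http.get(url, (res) => {
    const location = res.headers.location ? ` -> ${res.headers.location}` : "";
    console.log(`${res.statusCode}${location}  ${url}`);
    res.resume(); // drain the body so the connection can close
  });
}
```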
Why Status Codes are good for SEO
The status code returned is one of the primary signals a search engine uses to know what to do next. If it gets a 3xx redirect it knows it needs to follow that path; if it gets a 200 it knows the page has been returned fine; and so on.
Making sure all your content returns a 200 and all your redirects appropriately use a 301 means search engines will be able to efficiently spider and rank your content.
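As a sketch of what “the right code for the right case” means in practice, here is a toy server that serves existing pages with a 200, permanently moved pages with a 301, and everything else with a genuine 404 (the routes and content are made up):

```typescript
import * as http from "http";

// Toy server returning the right code for each case; the routes
// and content are made up for illustration.
const pages: Record<string, string> = { "/": "<h1>Home</h1>" };
const moved: Record<string, string> = { "/old-page": "/" };

http.createServer((req, res) => {
  const path = req.url ?? "/";
  if (path in pages) {
    res.writeHead(200, { "Content-Type": "text/html" });
    res.end(pages[path]);
  } else if (path in moved) {
    res.writeHead(301, { Location: moved[path] });
    res.end();
  } else {
    // A genuine 404, not a "pretty" error page served with a 200.
    res.writeHead(404, { "Content-Type": "text/html" });
    res.end("<h1>Not found</h1>");
  }
}).listen(8080);
```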
Why we should care about Status Codes anyway
We should care about status codes anyway because search engines are not the only things that might care about the content on your site; browsers, plugins, and other sites (if you have built an API) could all care about what code is returned.
They will behave in ways you might not expect if you return invalid or incorrect codes.
Semantic Markup
Semantic Markup is markup that has inherent meaning associated with it; a simple example is knowing that the <h1> element is the overarching heading for the section you are in.
There are some very subtle things that should be considered when choosing markup:
When content should use elements like <aside>, <nav>, <blockquote>, <figcaption>, etc. (see the sketch after this list).
When it makes sense to additionally use semantic attributes, for example those suggested by schema.org.
Be prepared to make CSS changes to accommodate the default styles; remember there is a difference between design and function.
Don’t just use elements like <section> in place of a <div> because you can. You have to realise that all elements come with an inherent semantic value (even if that is to state “I have no semantic value”).
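To make that concrete, here is a sketch of one way a page might be grouped semantically; the headings and content are placeholders:

```html
<!-- A sketch of semantic grouping; the headings and copy are placeholders. -->
<body>
  <header>
    <h1>My Site</h1>
    <nav>
      <a href="/">Home</a>
      <a href="/articles/">Articles</a>
    </nav>
  </header>
  <main>
    <article>
      <h1>Why purple beans matter</h1>
      <section>
        <h2>Growing conditions</h2>
        <p>…</p>
      </section>
      <aside>
        <p>Related: our field guide to green beans.</p>
      </aside>
    </article>
  </main>
</body>
```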
Why Semantic Markup is good for SEO
Semantic Markup is excellent for SEO because you are literally giving the content on your page meaning that a search engine can easily understand.
When you use the schema.org suggestions for a review, search engines will know that the 3/5 at the end means you have scored it 3 out of 5, and they will potentially show that many stars on their search results page.
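A minimal sketch of that review pattern using schema.org microdata (the product and score are invented; schema.org also supports the same vocabulary via RDFa and JSON-LD):

```html
<!-- Sketch of a schema.org review in microdata; product and score are made up. -->
<article itemscope itemtype="https://schema.org/Review">
  <h2 itemprop="name">Purple bean seeds: a review</h2>
  <p itemprop="reviewBody">Germinated quickly, but the beans were more mauve than purple.</p>
  <p>
    Score:
    <span itemprop="reviewRating" itemscope itemtype="https://schema.org/Rating">
      <span itemprop="ratingValue">3</span>/<span itemprop="bestRating">5</span>
    </span>
  </p>
</article>
```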
Semantic markup lets you group and link content. The old way of thinking was that a page could have only one <h1> element, normally reserved for the name of the site. Now, thanks to the likes of <section> and <header>, we can have groupings that make sense. This means search engines can have a much easier time parsing longer articles.
Why we should care about Semantic Markup anyway
We should care about this anyway because search engines are not the only things looking at our site. Assistive technologies such as screen readers can make much better use of semantically marked-up documents.
For example, when you mark up content with an <aside> element, some assistive technologies know to leave it out of the main content when reading aloud to a visually impaired user.
Maybe your user can’t concentrate on large articles with lots of information. By semantically breaking that information down, they can clip out just what they need and view it how they like.
Search engines aren’t the only robots out there looking at your site. Other services could hit your site looking for the likes of a CV; if you have used the correct markup and semantics, that becomes an easy task.
URL Structures
URL Structures are what you see when you look in the address bar, so they could be something like mysite.com/my-awesome-page/ or they could be mysite.com/?p=233432.
Getting these structures right requires some thought and some technical knowledge.
Do I want a deep structure like site.com/category/theme/page.html?
Are the structures consistent across my site?
Are the structures meaningful to anything but the site’s code?
Is there a logic to them that a new developer could follow and add to?
Why URL Structures are good for SEO
A good URL structure is good for SEO because it is used as part of the ranking algorithm by most search engines. If you want a page to rank for “purple beans” and your URL is mysite.com/purple-beans/, then search engines will see that as a good sign that the page is dedicated to the discussion of purple beans.
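If your pages are generated from titles, a small slug helper keeps those URLs readable. A minimal sketch; the rules here are deliberately naive and ignore accented and non-Latin characters:

```typescript
// Sketch of a slug generator for readable URLs. Real sites often
// need locale-aware transliteration on top of this.
function slugify(title: string): string {
  return title
    .toLowerCase()
    .trim()
    .replace(/[^a-z0-9\s-]/g, "") // drop punctuation
    .replace(/\s+/g, "-");        // collapse whitespace into hyphens
}

console.log(slugify("Purple Beans: A Field Guide")); // purple-beans-a-field-guide
```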
The URL will appear in search results; if it makes sense, people are more likely to click on it than if it is a jumble of IDs and keywords.
A good URL will serve as its own anchor text. When people share a link they will often just dump it out onto the page; if the structure makes sense, it will allow your page to rank for those terms even without anyone setting up proper anchor text.
Why we should care about URL Structures anyway
Outside of the context of search engines, we encounter URLs all the time, and as users of the web we appreciate it when things are kept simple.
Your users will appreciate being able to look at a URL coming off your site that just makes sense. If they can look at a URL and remember why they have it in a list without needing to click into it, that is a big win.
Over the lifetime of a website you will be surprised how much of your own admin involves looking at the structure of your URLs. If you have taken the time to do them right, it will make your life much easier.
A note about JavaScript and SEO
I wanted to end by mentioning JavaScript very briefly.
A lot of the websites you create are going to be JavaScript driven, or at least rely on it very heavily. There are various schools of thought on whether this is a good thing or not, but the fact is JavaScript has happened!
It used to be that search engines couldn’t even follow a link that used a JavaScript onClick function. They have come a long way since then and can do an excellent job of ranking sites that are completely made in JavaScript.
That being said, search engines are not perfect at this task yet, so the current advice still has to be that if you want something to be seen by search engines, you should try to make sure there are as few things blocking them from seeing it as possible.