Search Engine Optimisation (SEO) can be a rather confusing concept to understand and implement; however, once you are comfortable with the basics, you can become proficient in the field in no time.
SEO is broken down into two main categories, On-Site SEO and Off-Site SEO. These two categories have very different requirements and strategies. On-Site SEO is the process of optimising your website content so that search engines can deliver the best results possible. This includes well-written meta tags and titles, and quality content that is updated regularly without breaking any rules or standards set by Google.
1. Title Tags and Keywords
Your page's title tag is one of the most important elements of your site's code with regards to SEO. Title tags should be written so they include the most important keywords for your page. For example, a page titled "On-Site SEO Tips" is more likely to rank for the keyword phrase "On-Site SEO Tips" than one titled simply "SEO Tips".
It is known that web pages which contain targeted keywords within their title tags generally rank higher in search engines than those which don't. This is because search engines use title tags to understand the content and topic of a page. The correct keywords embedded in title tags help search engines to associate the page with the search query, which can result in your page ranking higher.
Keywords should be placed somewhere near the beginning of the title tag. Keywords that match a search query appear in bold on the search engine results page (SERP), and it is widely believed that this helps to improve click-through rate (CTR).
If you are using WordPress for your website there is a useful plugin called Yoast SEO which helps you to manage your page Title tags.
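As a minimal sketch, a keyword-led title tag sits in your page's HTML head like this (the site name here is just a placeholder):

```html
<head>
  <!-- Primary keyword phrase placed near the start of the title -->
  <title>On-Site SEO Tips | Example Agency</title>
</head>
```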
2. Title Tag Character Length
Getting the length of your title tag right can be crucial. If it is too long, the search engines will not show it in its entirety within the search results.
The first 50-60 characters of a title tag are usually displayed in full on most desktop and mobile devices. If you keep your titles below 60 characters, the large majority of them are likely to display correctly in the SERPs.
There is no exact character limit, because characters vary in actual pixel width. When Google crawls a page, it looks at the entire title tag when assessing human readability and comprehension, so even if the title is truncated in the SERPs, Google's spiders are thought to consider it in full.
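As a rough sketch, you can check your titles against the 60-character guideline with a few lines of Python (the limit here is the approximate figure from above, not an exact rule):

```python
MAX_TITLE_CHARS = 60  # rough display guideline; the real cutoff is pixel-based

def title_fits(title: str, limit: int = MAX_TITLE_CHARS) -> bool:
    """Return True if the title is within the rough display guideline."""
    return len(title) <= limit

print(title_fits("On-Site SEO Tips for Beginners"))  # True: well under 60 characters
```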
3. H1 and Keywords
Keywords within the H1 are a must for SEO. Many SEO consultants believe that placing keywords in the H1 can boost search engine rankings. This is certainly my viewpoint, and something I have been doing for many years.
Your H1 should include your most important keyword target and match what your title tag is targeting. For example, a page titled "On-Site SEO Tips" should include "On-Site SEO" in its H1.
4. Meta Description Tags and Keywords
Meta description tags are used by search engines to give searchers an idea of what your web page is about. The meta description should include your keyword phrases where possible, and be written for humans.
Meta descriptions are widely discussed in SEO circles, but they do not have a direct impact on your rankings in Google or other search engines. They help searchers understand what a page is about and encourage them to click through when it is relevant to their query, as search engines often use the meta description as the snippet displayed in search results.
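The meta description also lives in the page's head; as a simple illustration (the wording is just an example):

```html
<head>
  <meta name="description"
        content="Learn the basics of on-site SEO, from title tags to keyword placement.">
</head>
```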
5. URL and Keywords
One of the most fundamental on-site SEO tips is to use URL structures rich in keywords; URLs should be written in such a way as to correctly target your keywords. In URLs, spaces should be replaced with dashes, and the words should not run together into one long string of characters.
Good URL example: www.example-website.com/seo-tutorials/on-site-seo-for-beginners
Bad URL example: www.example-website.com/seo-tutorials/onsiteseoforbeginners
In some content management systems which dynamically generate page URLs you may also see a series of numbers and special characters within a URL. These are also not deemed SEO friendly.
In WordPress these URLs are determined by the "permalink" settings; WordPress also refers to the URLs as "slugs". The most useful permalink setting in WordPress is %postname%.
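The dash-separated "slug" style can be sketched in Python. This is an illustrative helper, not something WordPress itself exposes:

```python
import re

def slugify(title: str) -> str:
    """Convert a page title into a dash-separated, keyword-friendly URL slug."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # runs of non-alphanumerics become one dash
    return slug.strip("-")

print(slugify("On-Site SEO for Beginners"))  # on-site-seo-for-beginners
```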
6. Content and Keywords
Well-written content for your web pages is vital: it should be grammatically correct, have no spelling errors, and be of sufficient length to accurately describe your subject matter. Including keywords within the written content of your pages is vital if they are to rank; these should be your primary keywords along with any related terms and synonyms.
Be careful not to overuse your keywords within your content as it may start to read unnaturally and look spammy. When writing content you should always consider the reader rather than the search engines.
There are two things to consider when trying to include keywords within your written content: density and proximity.
7. Keyword Density
Keyword density is a measure of how often a keyword appears within the overall content of a page, usually expressed as a percentage. Many SEO consultants recommend a density of no more than 3%, meaning the remaining 97% of the words on the page shouldn't be keyword-targeted at all.
Let’s say that I want to rank for “web design”:
“Web design requires many skills. Web design comprises of many elements. Web design areas include SEO, graphic design and coding. Many people want to make their website better through better web design.”
This short paragraph has 32 words, and the term "web design" appears four times, a density of roughly 12.5% using a simple occurrences-over-total-words calculation. As you can see from this example, it doesn't read very well at all.
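That occurrences-over-total-words calculation can be sketched in Python. This is a rough illustration; SEO tools vary in exactly how they count phrases:

```python
def keyword_density(text: str, phrase: str) -> float:
    """Percentage of the text's word positions at which the keyword phrase occurs."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    target = phrase.lower().split()
    n = len(target)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    return 100 * hits / len(words)

sample = ("Web design requires many skills. Web design comprises of many elements. "
          "Web design areas include SEO, graphic design and coding. Many people want "
          "to make their website better through better web design.")
print(round(keyword_density(sample, "web design"), 1))  # 12.5
```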
A useful tool for checking your keyword density is available from seoreviewtools.com, where you can either specify a URL for the tool to check or copy and paste a block of text into it.
If you are using WordPress and Yoast SEO their plugin contains a density checker within it and will highlight any issues for you prior to publishing. Yoast have also written a great post about the subject here.
8. Keyword Proximity
Search queries are usually made up of multiple words forming a search phrase. The proximity of a keyword is determined by how many other words sit between the words of that phrase within the content of your pages.
For example, if a user were to search "web design derby" and my page contained the sentence "Andy Morley is a web design and development specialist based in Derby", the proximity between "web" and "design" would be 0, as they are placed side by side. The proximity of "design" and "Derby" is 5, as there are five words separating the two.
Higher proximities are thought to be less relevant to search queries: the closer your keywords are to each other in a sentence or paragraph, the more relevant they are considered to be to each other. Achieving this can be quite difficult while keeping the content grammatically correct and readable.
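Counting the words that separate two keywords can be sketched in Python, using the example sentence above:

```python
def proximity(sentence: str, word_a: str, word_b: str) -> int:
    """Number of words separating the first occurrences of two keywords (0 = adjacent)."""
    words = [w.strip(".,").lower() for w in sentence.split()]
    return abs(words.index(word_a.lower()) - words.index(word_b.lower())) - 1

sentence = "Andy Morley is a web design and development specialist based in Derby"
print(proximity(sentence, "web", "design"))    # 0: the words sit side by side
print(proximity(sentence, "design", "Derby"))  # 5: five words separate them
```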
9. Word Count
One of the most vital on-site SEO tips is word count. Search engines tend to favour longer-form content, and the majority of pages which rank for a given term usually have a substantial word count. There's no official guideline as to how many words should be on your pages, though; use as many words as necessary to get your message across without going overboard and droning on.
When writing articles I tend to aim for word counts of around 1,500 words, especially if the article is quite in-depth. Pages with very little wording are said to be "thin" in content, although thin can also refer to a lack of video and images on a page.
One of the first things I do when starting a new SEO project for a client is to audit their site content, making notes on any pages where the word count could be expanded. I have even managed to significantly improve clients' search rankings simply by adding more words to their pages. It can be that simple.
For blog writing, the easiest way to produce a post with a high word count is to use the "listicle" structure, of which this post is an example. The post you are currently reading contains roughly 3,700 words.
10. Image “Alt” and Keywords
An image's "alt" attribute is generally used for a brief description of the image, should it fail to load correctly. It is widely accepted, however, that alt text can help a search engine determine the subject of a web page, so inserting keywords into these short descriptions can help with the on-site SEO of your pages.
There is also an image "long description" ("longdesc") attribute, although these are less common online than they used to be and many CMSs don't use them. In the past, when hand-coded websites were the norm, the long description of an image could also be used to help target your keywords. The original purpose of the alt and longdesc attributes was much the same: to accurately describe the content of an image.
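As an illustration, a keyword-aware alt attribute might look like this (the filename and wording are hypothetical):

```html
<img src="/images/on-site-seo-diagram.png"
     alt="Diagram of on-site SEO elements including title tags and headings">
```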
11. Internal Linking
Internal linking is a key element of an SEO strategy; these are links from one page on your site to another. The anchor text of these links can be a huge influence on your rankings. Use keywords where possible when linking to another page; these keywords should ideally be the primary keyword target of the destination page, or closely related to its content.
Internal linking is a great signal to the search engines as to which pages should rank for a specific keyword. Google Search Console even provides an internal links report so you can see which of your pages have the highest number of internal links pointing towards them. Your most important pages should have the most internal links.
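For example, an internal link with keyword-rich anchor text, reusing the URL from the earlier example, might look like:

```html
<a href="/seo-tutorials/on-site-seo-for-beginners">on-site SEO for beginners</a>
```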
12. Unique Page Titles and Meta Descriptions
I mentioned earlier that your page title and meta description are used to tell the search engines what your page is about. If you have multiple pages with the same title or description, it may cause issues when the search engines index your pages: they won't be able to accurately determine which page on your site is intended to rank for which keyword. This can result in fluctuating rankings, where Google displays different pages in its search results each day.
This problem is referred to as keyword cannibalisation. It can also occur when multiple pages have similar content, a service page and a blog post on the same topic, for example. I wrote an in-depth article about cannibalisation in SEO on my agency website.
This is the reason you should always make sure your page titles and descriptions are unique on each page.
13. NoFollow your outbound links
Linking to other pages or websites from your page content is said to help your SEO, provided you link out to reputable sources. These links should support the subject you are writing about, giving your readers another source of information to validate what you are saying. Links between websites are fundamental to SEO; in particular, receiving inbound links from other sites is a key factor in improving your own rankings.
If you are linking out to another website it is recommended that you mark these outbound links as "nofollow". There are a few reasons to do this:
- Adding nofollow is said to indicate to search engine spiders that they should remain on your page and ignore the link to another site, retaining those spiders on your site.
- Nofollow should be applied to any link which points to a lower-quality website, one which you can't vouch for.
- Google introduced the nofollow attribute to combat link-building spam; if you don't want your content to appear spammy, you should consider linking out using nofollow.
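Marking an outbound link as nofollow is a single rel attribute (the destination here is a placeholder):

```html
<a href="https://example.com/some-resource" rel="nofollow">a useful external resource</a>
```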
Nofollow links are thought to have little SEO value, as Google has stated that "generally these links aren't followed" or crawled. So when you think about attracting links into your website, only followable links are likely to help your SEO in broad terms (although if all your inbound links were followable it would look unnatural and could cause problems with your SEO all on its own). It's always better to have a balance of both nofollow and followed links when looking at your own website's link-building activity.
14. Load Speeds
Website load speed has become more and more important in recent years; over 50% of all searches within Google are performed on a mobile device. Smartphones on 5G networks need to be able to load your website quickly in order to display your content to potential customers. If your website is slow to load, users will likely get fed up with waiting, head straight back to the Google search results, and find another website, quite possibly one of your competitors.
Load speed is determined by many different factors, all of which relate to your on-site SEO: how your site has been coded, which CMS it uses, even the web server it is hosted on. By far the most common issue with slow websites, however, is the size of the photos and images on the pages. By size I'm not necessarily referring to pixel dimensions, but rather the actual file size in kilobytes of the images.
The amount of data which a user has to download to display your website is vital to ensuring it is quick to load within their browser, especially on mobile devices.
Tackling any aspects of your site which could be slowing it down is vital. Thankfully there are two main tools available to site owners to help them understand what needs fixing: Google PageSpeed Insights and GTmetrix, both of which provide reports on what needs to be fixed.
Site speed optimisation is one of the most common tasks when it comes to on-site SEO. If you wish to read more about speeding your website up I have written another article here, and specifically about image sizes here.
15. 404s and 301s
404 errors can be detrimental to your search rankings. A 404 error is generated by your hosting server when a specific page URL cannot be found, and is commonly known as a "page not found" error.
If Google has indexed your website, it holds a list of the page URLs which feature on your site and will display these URLs for any searches matching their content. If these URLs were to change, anybody visiting your site from a search engine would get a "page not found" error. The likelihood in this instance is that they would go back to Google and find another website instead. This is obviously an issue, not just for your SEO but also for your users.
These 404 errors should be closely monitored and rectified once found; the most common way to do this is by implementing a redirect from the old URL to the new one. There are two types of redirect to consider: a 301 or permanent redirect (which is the most useful for SEO), and a 302 temporary redirect. The search engines handle these differently, so you should choose the correct method when fixing your 404 errors.
If you implement a 302 redirect, Google will keep the old URL in its index, as you are telling it that this is a temporary situation and the old URL will soon be back online. If you use a 301 redirect, however, Google will drop the old URL from its index and replace it with your new URL.
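On an Apache server, a single permanent redirect can be added to your .htaccess file like this (the paths here are hypothetical):

```apacheconf
# 301 (permanent) redirect from an old URL to its replacement
Redirect 301 /old-page/ https://www.example-website.com/new-page/
```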
Monitoring 404 errors is fairly easy: Google Search Console will tell you which pages it knows about that have resulted in a 404 error.
If you are a WordPress user, there is also a really useful plugin simply called "Redirection". This plugin helps you identify 404 errors within your own site and allows you to quickly set up 301 redirects to fix the issue. It is important to note that for any 404 errors on your own website, you should also try to update any internal links pointing at the old URL.
16. URL Canonicalisation
Setting up URL canonicalisation can be important for SEO as a way to handle duplicate content; it is done using a canonical attribute in your page code.
A canonical tag (sometimes known as "rel canonical") is a method of informing search engines that a specific URL is the primary version of a page. Using the canonical tag prevents duplicate or similar content from being indexed on multiple URLs, and determines the correct URL to index.
For example, the following four URLs can all represent the homepage of a website:
- http://example-website.com
- http://www.example-website.com
- https://example-website.com
- https://www.example-website.com
All four will show exactly the same content, so we need a way to tell Google which version of the homepage should be indexed; this is where the rel="canonical" element helps.
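The canonical element is placed in the head of each variant, pointing at the preferred URL; for example:

```html
<link rel="canonical" href="https://www.example-website.com/">
```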
17. Robots.txt
What is a robots.txt file?
The robots.txt file is a small text file that allows you to ask search engine crawlers not to crawl certain parts of your site. This can be useful when there are areas on your site that you do not want to appear in search results but still want visible to users, such as a shopping cart or administration panels.
The robots.txt file is placed on the web server and provides instructions to the search engines on how to handle specific areas of your website; any areas which shouldn't be crawled should be marked with "disallow". In the below example we are telling all "user agents" (crawlers such as Googlebot) not to crawl any pages within the administration area or uploads directory.
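A minimal robots.txt matching that description might look like this (the directory names are illustrative; WordPress, for example, uses /wp-admin/):

```text
User-agent: *
Disallow: /admin/
Disallow: /uploads/
```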
If you are using a CMS such as WordPress this file is automatically set up for you; however, if you are coding a website by hand you will have to write one yourself.
18. XML Sitemaps
Search engines need to be able to discover your website pages before they can be added to their index.
The search engines will crawl the content on your pages and take note of all the links to your other pages; this is one method by which they discover all of the pages on your site. However, there is a problem we touched on earlier: as soon as the crawler hits an outbound link it may head off to another site and start crawling that one instead of your own (unless the links are marked as nofollow). This can cause issues in getting your site fully indexed.
One solution to this is to set up an XML file containing all the URLs of your pages; this file is then submitted to the search engines so they can visit each URL and index it. With Google, this is done via Google Search Console. Not all CMSs will generate this file for you: Shopify and Magento do, but WordPress doesn't. An additional plugin such as Yoast SEO is required to generate an XML sitemap containing your page URLs, which can then be submitted to the search engines to encourage them to index your pages.
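An XML sitemap is a simple list of URL entries; a minimal single-page example, reusing the URL from earlier:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example-website.com/seo-tutorials/on-site-seo-for-beginners</loc>
  </url>
</urlset>
```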
19. SSL Certificates
An SSL (secure sockets layer) certificate is used to authenticate your website and create a secure connection with the web browser. This helps to prevent data leakage and keeps your users' information secure, which is extremely important for transactional websites.
In the past, all ecommerce websites were expected to use an SSL certificate to help keep users' card details safe and secure. However, in early 2018 Google began rolling out an update to the Chrome browser which flags up any website that isn't using an SSL. This effectively forced website owners to adopt SSL on all websites.
Redirecting old non-secure http:// URLs to their secure https:// versions is required; this is usually done on your hosting account when installing an SSL certificate. However, you should also set this up in your .htaccess file if your site runs on a PHP / Apache server (the most common setup).
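A common way to force https on Apache is a rewrite rule in .htaccess; a typical sketch (check your host's documentation before applying it):

```apacheconf
RewriteEngine On
# Send any request arriving over plain http to its https equivalent
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```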
20. Geo Targeting using Image EXIF data
If you are hoping to attract local customers from the search engines then setting up local SEO is a must. If your website is based in a city and your customers are also in that location, you would want Google to rank your site every time somebody uses your city or town name in a search query, or searches from within your geographical area.
There are a number of ways you can geographically target your website. The first is the hreflang attribute, which broadly tells the search engines which country (and language) your website serves. Your address and postcode should also feature on your website, commonly on the contact page.
Finally there is more advanced location targeting available within the meta data of your website, specifically the images on your site.
When a photo is taken, a digital camera will add metadata, or EXIF data, to the image, such as the date and time the photo was taken; if GPS is enabled, it will also add geographical co-ordinates.
This data can be useful to Google in determining where you are located. It is quite a technical aspect of SEO and relies on you using your own photographs. Another issue is that once you edit your photos in Photoshop or other editing tools to make them small enough for the web (see the site speed section above), this location EXIF data is often stripped out.
There are settings within Photoshop specifically to preserve the EXIF data of your images, or even to add EXIF data to them if you like.
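EXIF stores GPS co-ordinates as degrees, minutes and seconds plus a hemisphere reference; converting them to the decimal form used by mapping tools is straightforward. A small Python sketch (the sample values are roughly Derby's latitude):

```python
def dms_to_decimal(degrees: float, minutes: float, seconds: float, ref: str) -> float:
    """Convert EXIF-style degrees/minutes/seconds plus hemisphere ref to decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    # Southern and western hemispheres are negative in decimal notation
    return -value if ref in ("S", "W") else value

print(round(dms_to_decimal(52, 55, 18.0, "N"), 4))  # 52.9217
```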
This article has covered 20 on-site SEO tips which you can use when optimising your own website for the search engines. Some of these are fairly straightforward and you should be able to tackle them on your own; however, if there are any which you don't fully understand, or would like some help with, I would be happy to have a chat with you.