Attracting links to your website using good-quality content is crucial to search engine optimisation (SEO), but it's worth remembering that those links count for little if your site is not technically healthy.
Your website needs to be set up so it's easy for Googlebot to crawl, while also being optimised for users at the same time. Often, even the smallest on-site changes can have a significant impact on your search rankings, so they should be at the top of your to-do list. If you aren't doing so already, here are five steps you should take to optimise your on-page SEO:
1. Don’t over-optimise
Stuffing keywords into your page title, meta descriptions, menus and internal link titles, or in the body of your content, is considered Black Hat 101, but amazingly it still occurs. Google does not take kindly to websites that appear to be purposely over-optimising for specific search terms, and by doing this you run the risk of having a keyword rankings filter applied to your site. This will prevent your website from ranking to its true potential.
Hidden text is another common form of over-optimisation. This Black Hat practice has moved on from white text on a white background to hiding text with CSS, for example tucking it away behind drop-downs.
Doing this purely for SEO purposes should be avoided, as should hiding vast swathes of text behind drop-downs at the bottom of the page, as it’s unnatural and of no use to the user.
Individual page titles are probably the most significant on-page factor. Avoid repeating variations of your head terms in your title tags and consider how they will appear to the user. The page title usually appears in the search engine results pages (SERPs) as the hyperlink you click to visit the site, so it should be conversion focused, logical and interesting for a human to read.
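As an illustration, compare a stuffed title tag with one written for a human reader (the product and store names here are invented):

```html
<!-- Keyword-stuffed: repeats variations of the head term -->
<title>Cheap Shoes | Buy Cheap Shoes Online | Cheap Shoes UK | Discount Cheap Shoes</title>

<!-- Conversion-focused: readable, one clear topic, brand at the end -->
<title>Men's Running Shoes with Free Delivery | ExampleStore</title>
```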
2. Avoid duplicate content
There are scenarios where duplicate content is required for legitimate reasons: your site may have localised language pages (for example, UK and US English) or pages that are accessible by users and search engines via multiple complex product paths with different URLs.
Search engine bots attempting to index your site also have a problem with duplicate content that is caused by tracking parameters and session IDs appended to URLs.
Duplicate content will harm your search rankings, so you should identify all duplicate pages. If they are required for genuine reasons, use canonical tags to highlight this.
Canonical tagging gives you control over which URL you wish to be returned in search results. Essentially, a canonical tag tells search engines where the original or preferred version of your content can be found and to ignore any duplicated pages. It also ensures any link popularity is consolidated on your preferred page.
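In practice, a canonical tag is a single line in the `<head>` of each duplicate page. A minimal sketch, using an invented example URL:

```html
<!-- Placed in the <head> of a duplicate page, e.g. a URL carrying
     tracking parameters or a session ID -->
<link rel="canonical" href="https://www.example.com/products/widget" />
```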
3. Check your backlink profile
This is the section where those people who have indulged in bad link building practices previously begin to feel uncomfortable. There is no escaping your past SEO indiscretions, however. Identify the sites that are linking to your website, and if you ‘discover’ a large volume coming from low-quality sites, link farms or article directories, you need to get these removed.
If you’ve been mixed up in Black Hat practices previously, then the chances are these kinds of links will number in the hundreds, probably thousands – so it’s not practical to remove every single one. You should, however, document your removal attempts. And, if all else fails, you can submit a disavow file via the Google Disavow Tool.
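The Disavow Tool accepts a plain-text file listing individual URLs or whole domains to be ignored; a sketch with invented entries:

```text
# Lines beginning with # are comments
# Disavow an entire domain:
domain:spammy-directory.example
domain:low-quality-articles.example
# Disavow a single page:
http://bad-links.example/page-linking-to-us.html
```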
If your website is linked to these low-quality sites, then you run the risk of being judged guilty by association by Google. If your backlink profile is not overly unhealthy, don’t panic. There have been far too many people throwing the baby out with the bath water when it comes to link removal and asking for perfectly good links to be removed. This is only going to be detrimental to your rankings.
Avoid blanket emails to webmasters, and instead thoroughly investigate your backlink profile and accurately identify which links are bad and which are good – links are not bad, only links from bad websites are bad.
4. Redirects and footer links
Links at the bottom of the page will be devalued pretty sharpish by Google, so avoid putting too many in. The footer of a page – usually reserved for privacy statements, legal information, etc. – should only be for secondary navigation. If a link is required, it should be higher up in the page, as this will be crawled first and will be more beneficial in terms of SEO.
Another common problem is redirects. When new pages are being created to replace old ones, ensure that the old page redirects traffic with a 301 permanent redirect.
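On an Apache server, a 301 redirect can be as simple as one line in your `.htaccess` file; a sketch using mod_alias, with illustrative paths:

```apache
# Permanently redirect the retired page to its replacement
Redirect 301 /old-page.html https://www.example.com/new-page.html
```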
5. Speed up your load times
We all know how annoying it is waiting for a slow website to load up, watching either that blue circle chase itself or being tormented by an egg-timer on your screen – well, Google feels the same way.
Search engines use site speed as a ranking factor. Ultimately, search engines want the user to have the best experience possible, and site speed plays a big part in this.
To reduce your load times, assess the file size of any images on your site and compress these as far as possible. Also, strip out any irrelevant ‘junk code’, and don’t use too much display advertising, as both will slow down your site.
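A quick way to find the worst offenders before compressing anything is to list the heaviest image files in your site's assets. A minimal sketch in Python (the directory path and the 200 KB threshold are assumptions, not recommendations):

```python
import os

def find_heavy_images(root, max_kb=200, exts=(".jpg", ".jpeg", ".png", ".gif")):
    """Return (path, size_kb) pairs for images larger than max_kb, heaviest first."""
    heavy = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.lower().endswith(exts):
                path = os.path.join(dirpath, name)
                size_kb = os.path.getsize(path) / 1024
                if size_kb > max_kb:
                    heavy.append((path, round(size_kb, 1)))
    return sorted(heavy, key=lambda item: item[1], reverse=True)

if __name__ == "__main__":
    # "./static/images" is an illustrative path; point this at your own assets folder
    for path, size_kb in find_heavy_images("./static/images"):
        print(f"{size_kb:>8.1f} KB  {path}")
```

Anything this flags is a candidate for resizing or re-exporting at a lower quality setting.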
Jim Kirk, Senior Consultant, Goldmine Media