24 Fantastic SEO Techniques to Boost Search Engine Ranking
Search Engine Optimisation (SEO) is one of the most important aspects of online marketing and is essential in building a strong natural search engine listing.
SEO sharpens the theme of a website so that search engines can recognise its content more clearly, which in turn means searchers can find what they want more easily.
SEO also speeds up the ranking process, by using various methods to tell the search engines what the website is about.
Most businesses use a Search Engine Optimisation company to run their SEO strategy, as it is a specialised, full-time job.
We’ve compiled this article to help you in your SEO efforts; you can also view this SEO Tips article for further advice.
There are various techniques for Search Engine Optimisation, depending on the type of website or online business.
SEO strategy revolves around the type of business being marketed: a business may only want to cover a certain radius around its location, or it may want to target the whole of the UK or an international market.
There are two ways of running an SEO strategy, either the correct way, known as “White Hat” or the incorrect way, known as “Black Hat”.
White Hat SEO uses methods approved by the search engines, whereas Black Hat SEO uses techniques that try to trick the search engines into ranking a website higher.
Black Hat SEO runs the risk of your website being banned by the search engines, so it is not advised.
The following are the main White Hat techniques used by Search Engine Optimisation experts:
Link Building – Link building is the process of acquiring links to your website from other websites. The more quality links a website has, the more important search engines consider it to be, which improves its ranking. Quality back links come from content-related websites, and quality is always better than quantity;
Search Engine Submission – Websites will eventually be picked up naturally by search engines; however, it is better to submit the website to as many search engines as possible, the most important being Google, Bing and Yahoo;
Keyword Research – Having the correct keywords on a web page is the most important part of SEO. Without the right words, no one will find your website, so getting these right and unique per page is vital. Also, the density of these keywords must be taken into account;
Keyword Density – The density, or proportion, of the same keyword on a web page is important. Too many occurrences of the same keyword will be seen by the search engines as spam and as an attempt to trick them. Keeping text to around 5% density per keyword or phrase is therefore preferred; for example, a keyword used 10 times in a 200-word page gives a density of 5%;
Geo Targeting – Geographical targeting is a method of adding town / county names in order to target a specific location. Some businesses with local services only want visitors from the area around their location, and many people search by keyword + location, such as “Electricians In Southampton”;
Code Optimisation – The code within a website should be optimised by cleaning any errors found by the W3C Validator, any redundant code should be removed, links and tags should be correct and any code which could potentially be blocking search engine spiders should also be fixed;
Duplication Issues – Many websites have issues with duplication, where a home page can be reached at several different addresses. Taking the BBC website as an example, the home page could be http://www.bbc.co.uk/, http://bbc.co.uk/, http://bbc.co.uk/index.html, or http://www.bbc.co.uk/index.html. That is 4 versions of the same page, and in effect multiple versions of the same website. Search engines try to manage this by ignoring all but one of the versions, but websites can still end up with a mix of the different versions indexed, and any links coming into the website may be diluted across them. Other forms of duplication, such as duplicate text, also cause problems: search engines will see repeated text as spam, and if the content appears on another website it is seen as plagiarism.
H1, H2, H3 Tags – H tags, or heading tags, are used in the code of a website to highlight or embolden headings or specific words. The words contained within H tags carry extra weight and should contain keyword text related to the page;
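As an illustration, a page's heading hierarchy might look like this (the page topic and wording are hypothetical):

```html
<!-- One H1 per page, describing the page's main topic -->
<h1>Electrical Services in Southampton</h1>

<!-- H2 tags for the main sections, using related keywords -->
<h2>Domestic Rewiring</h2>

<!-- H3 tags for sub-sections within a section -->
<h3>Fuse Box Replacement</h3>
```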
Social Networking – Social networking and bookmarking websites have a high amount of traffic, so submitting a website to sites such as StumbleUpon, Delicious and Facebook can increase the number of visits to the website and therefore the amount of natural linking;
Alt Attribute – Every image on a website should have an “alt” attribute, which displays text when the image cannot be displayed. Each image should therefore carry keyword-rich alt text relating to the content of the image;
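For example (the file name and wording here are purely illustrative):

```html
<!-- Descriptive, keyword-related alt text tells search engines what the image shows -->
<img src="rewire-southampton.jpg" alt="Electrician rewiring a fuse box in Southampton">

<!-- Generic alt text like this tells search engines nothing -->
<img src="photo1.jpg" alt="image">
```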
Avoid Flash – Try to avoid using Flash for navigation, as search engines struggle to crawl Flash, meaning fewer pages of the website will be indexed;
Meta Data – Meta data is information in a page's source code that is read by the search engines. The meta description is the most important to populate; it should contain meaningful text about the page and also include highly searched keywords;
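A meta description sits in the page's head section; the wording below is a hypothetical example:

```html
<head>
  <!-- The meta description is often shown as the snippet under the page title in search results -->
  <meta name="description" content="Electrical services in Southampton, from domestic rewiring to commercial installations.">
</head>
```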
Canonical Tags – Canonical tags can be used to rectify duplication issues, as the tag states which version of a page should be indexed by the search engines. It should be used with caution, however, as the tag can drop many pages out of the search engines' index, so traffic can reduce quickly;
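Taking the BBC home page duplication example above, each duplicate version of the page could carry the following tag in its head section to tell the search engines which address to index:

```html
<!-- Placed in the <head> of every duplicate version of the page -->
<link rel="canonical" href="http://www.bbc.co.uk/">
```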
Page Indexing – The search engines record how many pages a website holds, known as indexing. Pages that are not in a search engine's index will not be displayed in the search results, so getting the website indexed as quickly as possible is very important. Techniques such as sitemaps and robots files help the search engines learn which pages of the website should be indexed;
Sitemap.xml – XML sitemaps contain a list of each page within a website and are used by the search engines when crawling the site. See www.sitemaps.org for more information;
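A minimal XML sitemap follows the sitemaps.org protocol; the domain and dates below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/contact/</loc>
  </url>
</urlset>
```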
Robots.txt – A robots.txt file is used to control what the search engines index and can be used to block certain pages, folders, or specific spiders from accessing parts of the website. Robots.txt files are also used to list sitemaps;
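A simple robots.txt, placed at the root of the site, might look like this (folder, spider name and domain are illustrative):

```text
# Allow all spiders, but keep them out of a private folder
User-agent: *
Disallow: /admin/

# Block one specific spider from the entire site
User-agent: BadBot
Disallow: /

# Tell the search engines where the XML sitemap lives
Sitemap: http://www.example.com/sitemap.xml
```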
HTML Sitemap – An HTML or website sitemap page lists each page within the website, with a link to each. It is used by the search engines when crawling the website to learn about its content;
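At its simplest, an HTML sitemap is just a page of links to every page on the site (the page names here are hypothetical):

```html
<ul>
  <li><a href="/">Home</a></li>
  <li><a href="/services/">Our Services</a></li>
  <li><a href="/contact/">Contact Us</a></li>
</ul>
```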
Web Directories – As part of link building, web directories are a good way of receiving inbound links and some directories such as www.dmoz.org supply the search engines with content;
Title Tags – The title tag is the title of a web page, as seen at the top of the browser. Title tags should contain keywords relating to the page's content, although no more than 8 words should be used;
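The title tag sits in the page's head section; the business name and wording below are hypothetical:

```html
<head>
  <!-- Keyword-led title, kept well under the 8-word guideline -->
  <title>Electricians in Southampton | Example Electrical Ltd</title>
</head>
```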
Internal Linking – The internal navigation of a website is important, as a well structured website is preferred by the search engines. This is achieved via the main navigation and also via keywords within the body of the content;
Anchor Text – Anchor text is the visible text of a link itself. Where you might see a link using words such as “click here”, it is better to use words which relate to the content of the page being linked to, so something like “contact us” is preferred;
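The difference can be seen in the markup itself (the URL is illustrative):

```html
<!-- Vague anchor text: tells search engines nothing about the target page -->
<a href="/contact/">click here</a>

<!-- Descriptive anchor text: relates to the content of the target page -->
<a href="/contact/">contact us</a>
```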
External Links – Any link pointing out of the website to another source will direct part of the website's “link juice” away from it. This can be prevented by adding a “nofollow” attribute to the link itself;
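The nofollow attribute is added to the link's rel attribute (the URL and link text are placeholders):

```html
<!-- A normal link passes part of the page's link value to the destination -->
<a href="http://www.example.com/">Example partner site</a>

<!-- rel="nofollow" asks search engines not to pass link value through this link -->
<a href="http://www.example.com/" rel="nofollow">Example partner site</a>
```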
Analysis – Analysis, using a statistics package, is the most important part of Search Engine Optimisation, as without stats the effectiveness of any SEO work cannot be accurately measured;
Repeat – Once the information is analysed, anything ineffective can be improved upon, and the SEO cycle starts again.
If you’ve any questions about this blog post or any others that you’ve read from the Fasthosts Blog, then feel free to tweet me @fasthosts and use the hashtag #fhblog.