How To Ensure The Search Engines Find Your Website
by: Craig Broadbent
One of the most fundamental aspects of search engine optimisation (SEO) is ensuring that the pages within your website are as accessible as possible to the search engines. It's not only the homepage of a website that can be indexed, but also the internal pages within a site's structure. The internal pages of a site often contain important content such as products, services or general information, and therefore can be uniquely optimised for related terms. As a result, easy access to these pages is vital.
There are many dos and don'ts involved in ensuring all of your pages can be found by search engines. However, it is important to first establish how the search engines find and index web pages.
Search engines use "robots" (also known as "bots" or "spiders") to find content on the web for inclusion in their index. A robot is a computer program that follows the hyperlinks on a web page, a process known as "crawling". When a robot finds a document, it adds that document's contents to the search engine's index, then follows the next links it can find, continuing the cycle of crawling and indexing. With this in mind, it becomes apparent that the navigational structure of a website is important in getting as many pages as possible indexed.
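The crawl-and-index loop described above can be sketched as a toy breadth-first crawler. This is only an illustration, not how any real search engine works: the site is a hypothetical in-memory map of pages to the links they contain, and all URLs are invented.

```python
from collections import deque

# A hypothetical in-memory "web": each page maps to the links it contains.
PAGES = {
    "/": ["/products", "/services"],
    "/products": ["/products/widgets", "/"],
    "/services": ["/"],
    "/products/widgets": [],
    "/orphan": [],  # linked from nowhere, so a robot never finds it
}

def crawl(start):
    """Breadth-first crawl: index a page, then follow the links it contains."""
    indexed = []
    seen = {start}
    queue = deque([start])
    while queue:
        url = queue.popleft()
        indexed.append(url)           # add the page's contents to the index
        for link in PAGES.get(url, []):
            if link not in seen:      # skip pages already discovered
                seen.add(link)
                queue.append(link)
    return indexed

print(crawl("/"))  # the unlinked '/orphan' page is never reached
```

Note that `/orphan` is never indexed: a page no other page links to is invisible to a robot, which is exactly why navigational structure matters.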
When planning the navigational structure of your site, consider the hierarchy of your content. Search engines judge which pages of a site they consider most important when deciding rankings, and a page's position in the site structure can influence this judgement. The homepage is generally considered the most important page of a site - it is the top-level document and usually attracts the most inbound links. From here, search engine robots can normally reach pages that are within three clicks of the homepage. Therefore, your most important pages should be one click away, the next most important two clicks away, and so on.
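The "clicks from the homepage" idea is just shortest-path depth in the link graph, so it can be checked with a small script. The site structure below is hypothetical, and the three-click cut-off is the rough guideline from the text rather than a hard rule.

```python
from collections import deque

# Hypothetical site structure: each page maps to the pages it links to.
SITE = {
    "/": ["/products", "/about"],
    "/products": ["/products/widgets"],
    "/about": [],
    "/products/widgets": ["/products/widgets/blue"],
    "/products/widgets/blue": ["/products/widgets/blue/special"],
    "/products/widgets/blue/special": [],
}

def click_depths(home):
    """Minimum number of clicks needed to reach each page from the homepage."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for link in SITE.get(page, []):
            if link not in depths:
                depths[link] = depths[page] + 1
                queue.append(link)
    return depths

depths = click_depths("/")
# Pages more than three clicks deep risk being missed by robots.
deep = [page for page, d in depths.items() if d > 3]
print(deep)
```

Here only the deepest page exceeds the three-click guideline; promoting it higher in the hierarchy (or linking to it from the homepage) would fix that.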
The next thing to consider is how to link the pages together. Search engine robots can only follow generic HTML href links, meaning Flash links, JavaScript links, dropdown menus and submit buttons will all be inaccessible to robots. Links with query strings that have two or more parameters are also typically ignored, so be aware of this if you run a dynamically generated website.
The best links to use from an SEO perspective are generic HTML text links, as not only can they be followed by robots but the text contained in the anchor can also be used to describe the destination page – an optimisation plus point. Image links are also acceptable but the ability to describe the destination page is diminished, as the alt attribute is not given as much ranking weight as anchor text.
The most natural way to organise content on a website is to categorise it. Break down your products, services or information into related categories and then structure this so that the most important aspects are linked to from the homepage. If you have a vast amount of information for each category then again you will want to narrow your content down further. This could involve having articles on a similar topic, different types of product for sale, or content that can be broken down geographically. Categorisation is natural optimisation – the further you break down your information the more content you can provide and the more niche key phrases there are that can be targeted.
If you are still concerned that your important pages may not get indexed, then you can consider adding a sitemap to your website. A sitemap can be best described as an index page – it is a list of links to all of the pages within a site contained on one page. If you link to a sitemap from your homepage then it gives a robot easy access to all of the pages within your site. Just remember – robots typically can't follow more than 100 links from one page, so if your site is larger than this you may want to consider spreading your sitemap across several pages.
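Splitting a large sitemap is a simple chunking job. The sketch below assumes the 100-links-per-page guideline from the text and an invented set of article URLs.

```python
def sitemap_pages(urls, per_page=100):
    """Split a list of site URLs into sitemap pages of at most per_page links."""
    return [urls[i:i + per_page] for i in range(0, len(urls), per_page)]

# A hypothetical 250-page site.
urls = [f"/article-{n}.html" for n in range(1, 251)]
pages = sitemap_pages(urls)
print(len(pages), [len(p) for p in pages])  # 3 pages of 100, 100 and 50 links
```

Each resulting list would become one sitemap page, with the pages linked to each other and from the homepage so the robot can reach them all.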
There are many considerations to make when optimising your site for search engines, and making your pages accessible to search engine robots should be the first step of your optimisation process. Following the advice above will help you make your entire site accessible and aid you in gaining multiple rankings and extra traffic.
About The Author
Craig Broadbent is Search Engine Optimisation Executive for UK-based internet marketing company, WebEvents Ltd. Clients of WebEvents benefit from a range of services designed to maximise ROI from internet marketing activities. To find out more, visit
http://www.webeventseurope.com.