Recently I have had a lot of exposure to SEO (Search Engine Optimisation), and it turns out that finding real information about it isn't as easy as you might expect. You would think that people with genuine insight into SEO would be online happily sharing their discoveries with other developers, but unfortunately the ones who do are generally uneducated, and the rest want you to pay for the information. I am not claiming to be a fully qualified expert in search engine optimisation and all of its real effects across search engines (not just Google). However, I do feel that good content on this subject is sorely lacking, so I would like to share my opinion and knowledge of SEO. It is at least worth what it costs (FREE), unlike most companies out there peddling claims to get you to the top of Google for frivolous, unused search terms at extortionate prices. Here I am going to explain how to structure content correctly, along with a few tips for search engine optimisation you may not have heard of.
The truth is there are so many factors involved in SEO that it is hard to pin down details about it, and Google, Yahoo and Bing (MSN) are constantly advancing and changing the algorithms they use to rank your web pages. So where do you start?
You do this by using robots.txt. This file lets you allow or deny bots access to specific areas of your website, and also lets you specify whether you want certain search engines to see your site at all. Below is the simplest of rules, which allows all search engine bots (and any other bots for that matter) access to everything they can find on your website.
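Using the standard robots.txt syntax, a wildcard user-agent combined with an empty Disallow permits everything:

```txt
# Applies to every bot; an empty Disallow means nothing is blocked
User-agent: *
Disallow:
```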
If you wish, you can look at my robots.txt at http://www.udjamaflip.com/robots.txt. As you can see, it stops robots from going to certain areas of the website that they don't need to look at, and it also mentions where my sitemap can be found (more on this later).
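For illustration, a file of that shape might look like the following — the blocked paths here are made up for the example, not copied from my actual file:

```txt
User-agent: *
# Keep bots out of areas they don't need to look at
Disallow: /admin/
Disallow: /cgi-bin/

# Tell crawlers where the sitemap lives
Sitemap: http://www.udjamaflip.com/sitemap.xml
```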
This can be done in two different ways. The first is easy but can take time: go to each of the search engines and submit your website to them directly (this can take up to 90 days to get approved). The better option, however, is to find existing websites that have similar content to yours and are regularly updated (which means Google will be checking them regularly), and then get a link from those websites to yours, whether it is in a forum signature or on a friend's blogroll.
The first part of helping a search engine bot crawl through your website is to tell it where everything is, and the best way of doing this is with a Sitemap. These are XML files which a search engine robot will look for as a matter of course as soon as it starts to look at your website. You can see my sitemap.xml and look at the syntax, but as a quick overview of the elements, the most important are the location (loc), the last-modified date (lastmod), and the priority. The location is obviously the address of the page, the last-modified date tells robots when the page last changed, and the priority is a value between 0.0 and 1.0 which lets the robots know which of your pages are the most prominent and important within your website. Fill these in correctly, as it helps further crawling of the website. The priority does not have any effect on search engine page ranking.
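A minimal sitemap following the sitemaps.org 0.9 schema might look like this (the URLs, dates and priorities here are illustrative, not taken from my actual file):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- The homepage: highest priority within the site -->
    <loc>http://www.udjamaflip.com/</loc>
    <lastmod>2009-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <!-- A less prominent page; 0.5 is the default priority -->
    <loc>http://www.udjamaflip.com/about</loc>
    <priority>0.5</priority>
  </url>
</urlset>
```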
The key thing to remember when writing the content for your website is that it should be accessible and that the HTML validates. Although these aren't directly related to SEO per se, some believe that a valid, accessible website is easier for robots to crawl, which does make sense in theory. However, the most important part of this article, and I cannot stress this enough, is to write your content for a human being, not a robot. There are a lot of search engine optimisation 'professionals' who believe that spamming their keywords throughout their content, even when irrelevant to it, will help their Google ranking. It won't, and in fact it could even lose you marks for that search term through overuse and being labelled as spam. Additionally, duplicating content is poor practice. Naturally most websites will interlink and sub-categorise content, which does mean there will be two or three different instances of one particular article or post; however, creating several different pages with the same content and linking them to each other is not acceptable, and Google will know whilst crawling your website that you are trying to trick it. Having covered these two no-nos, there are a few things to keep in mind whilst writing your content:
Unfortunately most people you know won't have a website of their own to link to yours and help traffic. However, using social media websites such as Facebook, MySpace, or Twitter, you can drive larger amounts of traffic to your website and generate external links to it, which in turn increases your search engine rankings. This is due to a few things, such as these websites having a good PageRank and a large amount of content which will match your website's content in some way. The best way to do this is to create group pages for your website, or parts of it, and then share them with others on the social media websites; providing your content is interesting, it will be passed on, others will link to it, and so on.
I have left these items until last because, although most people believe they are the most important parts of search engine optimisation, the truth is that search engine friendly URLs aren't proven to help SEO so much as to aid website crawling, and testing has shown that META tags are being used less and less by search engines to identify which search terms a page should rank highly for. However, neither item is completely ruled out from helping a website achieve a higher rank when used alongside the other techniques I have previously mentioned. The best way to ensure that your META tags aid your search engine page ranking is to correlate them strongly with your main content; when Google looks at these META items, if they aren't directly related to the page's content it will deem one of them either incorrect or unusable (normally the META tags).
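As an example of keeping META tags tied to the page, a head section might look like this (the title and description text here are invented for illustration — the point is that they summarise the actual page copy rather than listing wished-for terms):

```html
<head>
  <title>Search Engine Optimisation Tips</title>
  <!-- Description summarises what the page actually says -->
  <meta name="description" content="Practical advice on robots.txt, sitemaps, and writing content for humans rather than robots.">
  <!-- Keywords, if used at all, drawn only from terms that appear in the content -->
  <meta name="keywords" content="seo, search engine optimisation, sitemaps, robots.txt">
</head>
```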
Search Engine Friendly URLs are good for accessibility, and were originally made for bookmarking, keeping a clean URL structure and easy-to-remember addresses. These URLs are thought to be used by search engines such as Google to identify your website's structure, which helps them relate content together and in turn aids your page rank within search engine results.
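As a sketch of how this is commonly done (assuming an Apache server with mod_rewrite enabled; the script and parameter names here are hypothetical), a dynamic query-string URL can be presented in a search engine friendly form:

```apache
# Serve /article/seo-tips from index.php?article=seo-tips
# so visitors and crawlers only ever see the clean URL
RewriteEngine On
RewriteRule ^article/([a-z0-9-]+)/?$ index.php?article=$1 [L,QSA]
```

The visitor bookmarks and shares the readable address, while the server quietly maps it back to the underlying script.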