Wednesday 8 August 2012

How To Use Robots.txt: Essential For Good SEO!

Maybe you have never heard of it, maybe you have seen it but don't know what it is for! Either way, the robots.txt file that sits in the root folder of your website on your hosting account is a powerful file when it comes to directing the search engines around your site.

Robots.txt is primarily used to tell search engine crawlers which files you don't want them to crawl, and so keep out of their results!

If you are anything like me, you will have files or simply parts of your site that you don't want to be a landing page for a prospective visitor. A great example of this would be a thank-you page. The search engines don't know that you only want people to see your thank-you page after they have completed some sort of transaction. Creating a robots.txt file will give you the opportunity to tell the search engines not to index this page.

This is how you set it up:

Open a basic text editor.
On Windows, use Notepad.
On Mac, use TextEdit (Format > Make Plain Text, then save with a plain-text encoding such as Western or UTF-8).

Enter the following:

User-agent: *
Disallow: /directory/file.html
Disallow: /directory/other-file.html

(replace /directory/file.html and so on with the path of each file you want to block)
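If you would rather generate the file from a script than type it in an editor, here is a minimal Python sketch. The paths are placeholders, just like /directory/file.html above:

```python
# Write a robots.txt that blocks a couple of placeholder paths.
# The filename must be exactly "robots.txt", all lowercase.
blocked_paths = ["/directory/file.html", "/directory/other-file.html"]  # placeholders

lines = ["User-agent: *"]
lines += [f"Disallow: {path}" for path in blocked_paths]

with open("robots.txt", "w", encoding="ascii") as f:
    f.write("\n".join(lines) + "\n")
```

You would then upload the resulting robots.txt to your site's root folder as described below.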

Example:

website: dvd-to-ipad-software.com

I want to exclude privacy.html from the search results.
This file sits in the top-level folder of the site, so no directory info is required.
Its location is dvd-to-ipad-software.com/privacy.html
The file looks like this:

User-agent: *
Disallow: /privacy.html

I then save the file and call it robots.txt (all lowercase - this is important).
I then upload it to my site so it sits in the same folder as my homepage (index.html), which is also where my privacy.html file lives.
You can have as many Disallow lines as you wish.
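Before uploading, you can sanity-check your rules with Python's standard-library robots.txt parser. This sketch feeds it the example file above and checks which URLs a well-behaved crawler would skip (the domain is just this article's example site):

```python
from urllib.robotparser import RobotFileParser

# The same rules as the example robots.txt above, as a list of lines.
rules = [
    "User-agent: *",
    "Disallow: /privacy.html",
]

parser = RobotFileParser()
parser.parse(rules)

# privacy.html should be off-limits to every crawler...
print(parser.can_fetch("*", "http://dvd-to-ipad-software.com/privacy.html"))  # False
# ...while the homepage stays crawlable.
print(parser.can_fetch("*", "http://dvd-to-ipad-software.com/index.html"))  # True
```

If can_fetch returns False for the pages you listed and True for everything else, the file is doing what you intended.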

THAT'S IT!
