How to Add a Robots.txt File

2021-04-09

How to add a robots.txt file to your site

A robots text file, or robots.txt file (often mistakenly referred to as a robot.txt file), is a must-have for every website. Adding a robots.txt file to the root folder of your site is a simple process, and having this file is a ‘sign of quality’ to the search engines. Let’s look at the robots.txt options available to your site.

What is a robots text file?

A robots.txt is simply an ASCII or plain text file that tells the search engines where they are not allowed to go on a site, a convention known as the Standard for Robot Exclusion. Any files or folders listed in this document will not be crawled or indexed by the search engine spiders. Having a robots.txt, even a blank one, shows you acknowledge that search engines are allowed on your site and that they may have free access to it. We recommend adding a robots text file to your main domain and all sub-domains on your site.

Robots.txt options for formatting

Writing a robots.txt is an easy process. Follow these simple steps:

  • Open Notepad, Microsoft Word or any text editor and save the file as ‘robots’, all lowercase, making sure to choose .txt as the file type extension (in Word, choose ‘Plain Text’).
  • Next, add the following two lines of text to your file:

User-agent: *
Disallow:

‘User-agent’ is another word for robots or search engine spiders. The asterisk (*) denotes that this line applies to all of the spiders. Here, there is no file or folder listed in the Disallow line, implying that every directory on your site may be accessed. This is a basic robots text file.
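You can sanity-check how crawlers interpret these two lines with Python’s standard `urllib.robotparser` module; this is just an illustrative sketch (the `www.mydomain.com` URL is a placeholder, as in the examples above):

```python
from urllib.robotparser import RobotFileParser

# The basic, fully permissive robots.txt described above
rules = """User-agent: *
Disallow:
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# An empty Disallow line means no path is blocked,
# so any URL on the site is crawlable.
print(rp.can_fetch("*", "https://www.mydomain.com/any/page.html"))  # True
```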

  • Blocking the search engine spiders from your whole site is also one of the robots.txt options. To do this, add these two lines to the file:

User-agent: *
Disallow: /

  • If you’d like to block the spiders from certain areas of your site, your robots.txt might look something like this:

User-agent: *
Disallow: /database/
Disallow: /scripts/

The above three lines tell all robots that they are not allowed to access anything in the database and scripts directories or sub-directories. Keep in mind that only one file or folder can be used per Disallow line. You may add as many Disallow lines as you need.
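To confirm that these Disallow lines block only the listed directories and nothing else, you can again test them with `urllib.robotparser` (the paths and domain here are hypothetical, matching the example above):

```python
from urllib.robotparser import RobotFileParser

# The selective-blocking robots.txt from the example above
rules = """User-agent: *
Disallow: /database/
Disallow: /scripts/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Anything under /database/ or /scripts/ is blocked...
print(rp.can_fetch("*", "https://www.mydomain.com/database/users.db"))  # False
print(rp.can_fetch("*", "https://www.mydomain.com/scripts/app.js"))     # False

# ...while the rest of the site remains crawlable.
print(rp.can_fetch("*", "https://www.mydomain.com/about.html"))         # True
```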

  • Be sure to add your search engine friendly XML sitemap file to the robots text file. This will ensure that the spiders can find your sitemap and easily index all of your site’s pages. Use this syntax:

Sitemap: http://www.mydomain.com/sitemap.xml
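If you want to verify that the Sitemap line is picked up correctly, Python’s `urllib.robotparser` (3.8+) exposes any declared sitemaps via `site_maps()`; a brief sketch using the same placeholder domain:

```python
from urllib.robotparser import RobotFileParser

# A permissive robots.txt that also declares the XML sitemap
rules = """User-agent: *
Disallow:
Sitemap: http://www.mydomain.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# site_maps() returns the declared sitemap URLs (or None if there are none)
print(rp.site_maps())  # ['http://www.mydomain.com/sitemap.xml']
```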

  • Once complete, save and upload your robots.txt file to the root directory of your site. For example, if your domain is www.mydomain.com, you will place the file at www.mydomain.com/robots.txt.
  • Once the file is in place, check the robots.txt file for any errors, for example by loading www.mydomain.com/robots.txt in a browser to confirm it is reachable and reads as you intended.

The Search Guru can help implement this and other technical SEO elements. Contact us today to get started!
