Written by Editorial Team on December 8, 2020

How To Create Robots.txt For WordPress Websites? [Under 5 Minutes]

Creating a robots.txt file is an important step in WordPress SEO. If you are wondering how to create a robots.txt file for a WordPress website, this tutorial shows you how to get it done in under 5 minutes.

How to check if Robots.txt is available (or not)?

The first and most important step is to check whether robots.txt already exists on your site. To check, type this into your browser's address bar: my-domain-name.com/robots.txt (replace my-domain-name with your own domain name).
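If you'd rather script the check, the robots.txt URL can be derived from any page on the site. Here's a minimal Python sketch (my-domain-name.com is a placeholder, as above):

```python
from urllib.parse import urlsplit, urlunsplit

def robots_txt_url(page_url: str) -> str:
    """Return the robots.txt URL for the site serving page_url.

    robots.txt always lives at the top level of the host, so we
    keep only the scheme and host and swap in the /robots.txt path.
    """
    parts = urlsplit(page_url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

print(robots_txt_url("https://my-domain-name.com/blog/some-post/"))
# https://my-domain-name.com/robots.txt
```

Opening the printed URL in a browser tells you whether the file is live.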

[Image: checking for the robots.txt file in a browser]

What is Robots.txt file and why is it important?

Robots.txt is a text file webmasters create to instruct web robots (typically search engine robots) how to crawl pages on their website. The robots.txt file is part of the robots exclusion protocol (REP), a group of web standards that regulate how robots crawl the web, access and index content, and serve that content up to users. The REP also includes directives like meta robots, as well as page-, subdirectory-, or site-wide instructions for how search engines should treat links (such as “follow” or “nofollow”).

In practice, robots.txt files indicate whether certain user agents (web-crawling software) can or cannot crawl parts of a website. These crawl instructions are specified by “disallowing” or “allowing” the behaviour of certain (or all) user agents.
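You can see these allow/disallow rules in action with Python's standard-library robots.txt parser. This is a sketch using a made-up rule set that blocks everything under /private/:

```python
from urllib.robotparser import RobotFileParser

# Example rule set: block /private/ for all user agents,
# leave the rest of the site crawlable.
rules = """\
User-agent: *
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A path matching the Disallow rule is blocked;
# a path matching no rule is allowed by default.
print(parser.can_fetch("*", "https://example.com/private/page.html"))
print(parser.can_fetch("*", "https://example.com/blog/hello/"))
```

One caveat: Python's parser applies rules in file order (first match wins), while Google resolves conflicts by matching the most specific rule, so results can differ for overlapping Allow/Disallow pairs.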

How to create a Robots.txt file easily?

Don't have a robots.txt file? Don't worry. All you need is Notepad (or any plain-text editor) on your desktop or laptop and you can create one.

Here's how it is done.

Step 1: Open notepad

Step 2: Copy this

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Step 3: Add sitemap to your Robots.txt (optional)

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://domain-URL.com/sitemaps.xml

Remember to change domain-URL.com to your own domain URL, and check that the sitemap path matches the one your site actually serves (WordPress 5.5+ publishes one at /wp-sitemap.xml by default, and SEO plugins commonly use /sitemap.xml or /sitemap_index.xml).

Step 4: Save the file and name it robots.txt (all lowercase; the filename is case sensitive)
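If you prefer to script this step, here is a short Python sketch that writes the directives from Steps 2 and 3 to a correctly named file. Remember to swap in your own domain and sitemap URL before uploading:

```python
from pathlib import Path

# Directives from Steps 2 and 3; replace domain-URL.com with
# your own domain and verify the sitemap path.
DIRECTIVES = """\
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://domain-URL.com/sitemaps.xml
"""

# The filename must be exactly "robots.txt", all lowercase.
Path("robots.txt").write_text(DIRECTIVES, encoding="utf-8")
```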

Step 5: Upload the file to the root folder of your web hosting account (reach out to your host's support team if you are not sure how to do it)

Other quick robots.txt must-knows:

  • In order to be found, a robots.txt file must be placed in a website’s top-level directory.
  • Robots.txt is case sensitive: the file must be named “robots.txt” (not Robots.txt, robots.TXT, or otherwise).
  • Some user agents (robots) may choose to ignore your robots.txt file. This is especially common with more nefarious crawlers like malware robots or email address scrapers.
  • The /robots.txt file is publicly available: just add /robots.txt to the end of any root domain to see that website’s directives (if that site has a robots.txt file!). This means that anyone can see what pages you do or don’t want to be crawled, so don’t use them to hide private user information.
  • Each subdomain on a root domain uses separate robots.txt files. This means that both blog.example.com and example.com should have their own robots.txt files (at blog.example.com/robots.txt and example.com/robots.txt).
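The last two points can be checked quickly in code. For instance, this sketch prints the separate robots.txt locations for a root domain and one of its subdomains:

```python
from urllib.parse import urlsplit

# Every host -- including each subdomain -- serves its own
# robots.txt at the top level of that host.
pages = ["https://example.com/about/", "https://blog.example.com/post-1/"]
robots_files = [f"https://{urlsplit(p).netloc}/robots.txt" for p in pages]
print(robots_files)
# ['https://example.com/robots.txt', 'https://blog.example.com/robots.txt']
```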

Summary: Creating a Robots.txt file for WordPress

There you go, folks!

Creating a robots.txt file for a WordPress website is easy, and if you don't have one, we recommend getting it done right now!

It takes less than 5 minutes, to be honest.


