How to Create a Robots.txt File in WordPress and Submit It to Google


Robots.txt is a file that tells search engine crawlers, such as Googlebot, which parts of your site they should not crawl. You use a robots.txt file when you want to keep specific directories or URLs out of Google's crawl (and, in most cases, out of its search results). Google did recently announce changes to how it handles robots.txt, most notably that it would stop supporting unofficial rules such as noindex inside the file, but robots.txt itself remains fully supported. So here is a sample of how to create a robots.txt file on different platforms.

Create a Robots.txt File in WordPress Through the Yoast SEO Plugin

If you are using the WordPress platform for your website and wish to create a robots.txt file, this is the easiest method to block your URLs from being crawled by Google.

Follow the steps below to create a robots.txt file:

  • Open your admin panel and install a new plugin called Yoast SEO
  • Now go to the settings of the Yoast SEO plugin and select Tools
  • Click on File editor
  • A screen showing robots.txt will appear in front of you
Yoast SEO Plugin

You have to edit this file to tell Google which directories and URLs should not be crawled on your website.

Now block your URLs or directories in the format below.

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content
Disallow: /wp-includes
Allow: /wp-admin/admin-ajax.php
Sitemap: http://searchland.org/sitemap.xml

In the above format, each Disallow line names a URL or directory that you do not want Google to crawl, and the Sitemap line additionally tells Google where to find your sitemap.

When you want to disallow a specific URL, just type its path after Disallow. If you want to block a whole directory, type the part of the path that follows your domain (like /wp-admin). This will block every URL starting with /wp-admin.
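If you want to sanity-check these rules before relying on them, Python's standard-library urllib.robotparser can evaluate them locally. The following is a minimal sketch, assuming the sample rules and searchland.org URLs from above; note that the standard-library parser applies rules in file order, while Google uses the most specific (longest) matching rule, so Allow exceptions like admin-ajax.php are best confirmed in Google's own tester.

from urllib.robotparser import RobotFileParser

# The same rules we added in the Yoast SEO file editor above.
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content
Disallow: /wp-includes
Allow: /wp-admin/admin-ajax.php
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Any URL whose path starts with a disallowed prefix is blocked
# for all crawlers ("*").
for url in (
    "http://searchland.org/wp-admin/options.php",   # under /wp-admin/ -> blocked
    "http://searchland.org/wp-includes/js/app.js",  # under /wp-includes -> blocked
    "http://searchland.org/blog/some-post/",        # matches no rule -> allowed
):
    print(url, "->", "allowed" if parser.can_fetch("*", url) else "blocked")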

Robots.txt Checker

Now you have to check whether the URLs you have blocked are actually blocked or not.

So there are several methods to do this.

First, you can check it manually in your browser by entering your domain followed by /robots.txt (for example, http://searchland.org/robots.txt).

Manual Google Check for Robots.txt

This will let you see the same set of rules we created with the Yoast SEO plugin in WordPress.
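If you would rather script this check than open a browser, fetching the file takes a few lines of Python. A minimal sketch, with searchland.org standing in for your own domain:

import urllib.request

# Download the live robots.txt, exactly as a manual browser check would.
with urllib.request.urlopen("http://searchland.org/robots.txt") as response:
    print(response.read().decode("utf-8"))

If the rules you saved in the Yoast SEO file editor print out here, the file is live.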

The second method is to check blocked URLs through Google Webmasters (Search Console).

  • Log in to the Google Webmasters account connected to your website.
  • From the side menu, select Go to old version.
Old Version Google Webmasters
  • Select Crawl and go to the robots.txt Tester.
  • Now you can see the same file you uploaded in WordPress.
GSC Old Robots.txt File

On this page, below the rules, you will see an input box after your domain name where you can type in a page path or URL.

This allows you to test whether that URL is blocked by Google or not.

Block File Checker for Robots.txt

Type in the path and click the test button; it will tell you whether the URL is blocked or allowed.
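The same check can be scripted. Below is a minimal sketch using Python's standard-library urllib.robotparser, which loads the live robots.txt and tests individual paths much like the tester does; the domain and paths are placeholders for your own.

from urllib.robotparser import RobotFileParser

# Point the parser at the live robots.txt file and download it.
parser = RobotFileParser("http://searchland.org/robots.txt")
parser.read()

# Test individual paths the way the robots.txt Tester does.
for path in ("/wp-admin/options.php", "/blog/some-post/"):
    url = "http://searchland.org" + path
    print(url, "->", "blocked" if not parser.can_fetch("Googlebot", url) else "allowed")

Keep in mind this mirrors Google's behaviour only approximately; the tester in Search Console remains the authoritative check.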

That is it. You have correctly created your robots.txt file and verified it with Google. A clean robots.txt helps Google crawl your site more efficiently, which can indirectly support your rankings.

Rishabh Sharma

Hello, I am Rishabh Sharma, an SEO expert sharing my knowledge through this website. I am motivated and capable of taking any business to the top level in the online world.
