Robots.txt: What Is It and How Is It Used in SEO?


Upload: jeebananda-nayak

Post on 12-Apr-2017




Page 1: Robots.txt: What Is It and How Is It Used in SEO?

Robots.txt is a plain-text file that tells search engine robots, crawlers, and spiders which areas of your domain you want them to visit and index. It does not prevent access by itself and should never be used to "block" content that contains sensitive information.

Improper usage of the robots.txt file can hurt your ranking. The robots.txt file controls how search engine spiders see and interact with your webpages. This file is mentioned in several of Google's guidelines. This file, and the bots that interact with it, are fundamental to how search engines work.

How Robots.txt Works

Search engines send out small programs called "spiders" or "robots" to crawl your site and bring information back so that your pages can be indexed in the search results and found by web users. Your robots.txt file instructs these programs not to crawl the pages on your site that you designate with a "Disallow" directive. For example, consider the following robots.txt directives:

To allow crawling of everything

User-agent: *

Disallow:
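By contrast, the same directives can be used to block crawling. The folder name below is a placeholder for illustration; substitute your own paths.

To block crawling of everything

User-agent: *

Disallow: /

To block crawling of a single folder

User-agent: *

Disallow: /example-folder/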

So it is up to you which folders or pages a search engine indexes: you simply instruct the bots through the robots.txt file. This works because the first thing a search bot does when visiting a website is fetch the robots.txt file, where it collects the instructions for how to crawl that site.
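The lookup described above can be sketched with Python's built-in urllib.robotparser module, which interprets robots.txt rules the same way a well-behaved bot would. The rules and URLs here are hypothetical examples, not from the original document.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules: allow everything except /private/
rules = """User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A compliant bot checks these rules before fetching any page
print(parser.can_fetch("*", "https://example.com/index.html"))  # True
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
```

Note that this only models what a polite crawler will do voluntarily; as stated above, robots.txt does not actually prevent access to the disallowed paths.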

Saving the file: Save it as robots.txt (not robot.txt) and upload it to the root directory of your web server, for example with an FTP client such as FileZilla.

Page 2: Robots.txt: What Is It and How Is It Used in SEO?

Best Service Agency