How to Create Robots, by Ayansh Digital Marketing Consultancy
TRANSCRIPT
How to Create a robots.txt file
What are robots?
The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. The standard specifies how to inform a web robot which areas of the website should not be processed or scanned. (Crawlers are programs that transfer data from websites into a search engine's database, e.g. Google's.)

In short, robots.txt restricts access to the pages and folders on the website that we do not want search engines to show.
For example, on the Ayansh Digital Marketing website we want to block the about.html page. It will then not appear in search results, although it remains reachable by anyone who visits its URL directly.
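A minimal robots.txt for this case might look like the following (the exact path of about.html is an assumption; adjust it to where the page actually lives):

```
User-Agent: *
Disallow: /about.html
```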
Visit a robots.txt generator site
Add the page name in the field, then click Add and observe the generated robots.txt file below
Save the result as a file named robots.txt in the site's folder, then upload it from the local site to the remote site
Have a final check before sending it to the designer. Verify your work by visiting the URL below:

funmoviemasti.com/digitalmarketingconsultancy/robots.txt

(domain name / folder name / default file name)
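The same check can be done programmatically with Python's standard-library robots.txt parser. This is a sketch that parses the rules inline rather than fetching them over the network; the disallowed path follows the about.html example from earlier and is an assumption:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules matching the slides: block about.html for all crawlers.
rules = [
    "User-Agent: *",
    "Disallow: /digitalmarketingconsultancy/about.html",
]

parser = RobotFileParser()
parser.parse(rules)

# Any crawler may fetch the home page...
print(parser.can_fetch("*", "/digitalmarketingconsultancy/index.html"))  # True
# ...but not the page we disallowed.
print(parser.can_fetch("*", "/digitalmarketingconsultancy/about.html"))  # False
```

To check a live site instead, use `parser.set_url("https://example.com/robots.txt")` followed by `parser.read()`.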
It should appear like this when checked on the website. The * means the rules apply to all crawlers: every link stays visible in search except the one we disallowed.
The following points matter when it comes to robots.txt:

To exclude all robots from the entire server:

```
User-Agent: *
Disallow: /
```

To allow all robots complete access:

```
User-Agent: *
Disallow:
```

(or just create an empty "/robots.txt" file, or don't use one at all)

To exclude all robots from part of the server:

```
User-Agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/
```

To exclude a single robot:

```
User-Agent: Badbot
Disallow: /
```
To allow a single robot:

```
User-Agent: Google
Disallow:

User-Agent: *
Disallow: /
```

To exclude all files except one: this is currently a bit awkward, as there is no "Allow" field in the original standard. The easy way is to put all files to be disallowed into a separate directory, say "stuff", and leave the one file in the level above this directory. Alternatively, explicitly disallow each unwanted page:

```
User-Agent: *
Disallow: /~joe/junk.html
Disallow: /~joe/foo.html
Disallow: /~joe/bar.html
```
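The "allow a single robot" rules above can be verified with Python's standard-library parser. A sketch; the user-agent names are the illustrative ones from the listing, not real crawler tokens:

```python
from urllib.robotparser import RobotFileParser

# The "allow a single robot" example: only Google may crawl, everyone else is blocked.
rules = [
    "User-Agent: Google",
    "Disallow:",
    "",
    "User-Agent: *",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("Google", "/about.html"))  # True  (empty Disallow = full access)
print(parser.can_fetch("Badbot", "/about.html"))  # False (falls through to the * block)
```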
Thank You