How to create Robots by Ayansh Digital Marketing Consultancy
How to Create a robots.txt file
What are robots?
The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. The standard specifies how to inform a web robot about which areas of the website should not be processed or scanned. (A crawler is a program that fetches pages from websites and transfers their data into a search engine's database, e.g. Google's.)
In short, robots.txt restricts the pages/folders on the website that we do not want search engines to show.
For example, on the Ayansh Digital Marketing website we want to block the about.html page. It will then no longer appear publicly in search results, but will still be reachable by anyone with the direct link.
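As a minimal sketch, assuming the site lives in a /digitalmarketingconsultancy/ folder on the server (the folder name used in the check URL later in this document), the robots.txt rule blocking about.html would look like this:

```
User-agent: *
Disallow: /digitalmarketingconsultancy/about.html
```

The * means the rule applies to every crawler; the exact Disallow path depends on where about.html actually sits on the server.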
Visit a robots.txt generator site.
Add the page name in the files field, then click Add and observe the robots.txt output shown below it.
Save the result in a file named robots.txt and upload it from the local site to the remote site.
Have a final check before sending it to the designer. Check your work by opening the URL below:
funmoviemasti.com/digitalmarketingconsultancy/robots.txt
(domain name / folder name / the default file name robots.txt)
When checked on the website, it should appear as above, where * means the rules apply to all crawlers: every link remains visible/uploaded except the link we disallowed.
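The same check can be sketched in code: Python's standard urllib.robotparser module parses robots.txt rules and reports what a crawler may fetch. The rule and URLs below are assumptions mirroring the about.html example in this tutorial, not the live file.

```python
import urllib.robotparser

# Hypothetical rules matching this tutorial's example:
# block about.html for all crawlers.
rules = """\
User-agent: *
Disallow: /digitalmarketingconsultancy/about.html
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The disallowed page is blocked; everything else stays visible.
print(rp.can_fetch("*", "https://funmoviemasti.com/digitalmarketingconsultancy/about.html"))  # False
print(rp.can_fetch("*", "https://funmoviemasti.com/digitalmarketingconsultancy/index.html"))  # True
```

In practice you could point RobotFileParser at the live URL with set_url() and read() instead of parsing an inline string.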
The following points matter when it comes to robots.txt:
To exclude all robots from the entire server:
User-agent: *
Disallow: /

To allow all robots complete access:
User-agent: *
Disallow:
(or just create an empty "/robots.txt" file, or don't use one at all)

To exclude all robots from part of the server:
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/

To exclude a single robot:
User-agent: Badbot
Disallow: /

To allow a single robot:
User-agent: Google
Disallow:

User-agent: *
Disallow: /

To exclude all files except one:
This is currently a bit awkward, as there is no "Allow" field in the original standard. The easy way is to put all files to be disallowed into a separate directory, say "stuff", and leave the one file in the level above this directory:
User-agent: *
Disallow: /~joe/stuff/
Alternatively, you can explicitly disallow each unwanted page:
User-agent: *
Disallow: /~joe/junk.html
Disallow: /~joe/foo.html
Disallow: /~joe/bar.html
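The per-robot rules above can also be verified with urllib.robotparser. The robot names and URLs below are illustrative assumptions: a "Badbot" entry blocks that crawler everywhere, while every other crawler falls through to the * entry.

```python
import urllib.robotparser

# Hypothetical rules combining two patterns from the list above:
# ban one robot entirely, and block /tmp/ for everyone else.
rules = """\
User-agent: Badbot
Disallow: /

User-agent: *
Disallow: /tmp/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Badbot", "https://example.com/page.html"))    # False: banned everywhere
print(rp.can_fetch("Otherbot", "https://example.com/page.html"))  # True: only /tmp/ is blocked
print(rp.can_fetch("Otherbot", "https://example.com/tmp/x"))      # False
```

Note the blank line between the two records: each User-agent block must be separated so crawlers match the most specific entry for their name.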
Thank You