Google robots.txt defends itself on Halloween
Google has added three lines to its robots.txt this Halloween, in a fun attempt to play with those of us nosy enough to read source code.
The question is, will they be safe? Or will someone manage to egg or TP the Google homepage?
Halloween, or Hallowe’en (a contraction of All Hallows’ Evening), also known as Allhalloween, All Hallows’ Eve, or All Saints’ Eve, is a celebration observed in a number of countries on 31 October, the eve of the Western Christian feast of All Hallows’ Day. It begins the three-day observance of Allhallowtide, the time in the liturgical year dedicated to remembering the dead, including saints (hallows), martyrs, and all the faithful departed.
The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. The standard specifies how to inform the web robot about which areas of the website should not be processed or scanned. Robots are often used by search engines to categorize websites. Not all robots cooperate with the standard; email harvesters, spambots, and malware robots that scan for security vulnerabilities may even start with the portions of the website where they have been told to stay out. The standard is different from, but can be used in conjunction with, Sitemaps, a robot inclusion standard for websites.
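The exclusion rules are simple enough to try out yourself with Python's standard-library parser. Here's a minimal sketch using a made-up rule set; the `example.com` URLs and the `/private/` path are illustrative and have nothing to do with Google's actual Halloween lines:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt, supplied as a string instead of fetched over HTTP.
rules = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A well-behaved crawler checks can_fetch() before requesting a URL.
print(parser.can_fetch("*", "https://example.com/private/secret.html"))  # False
print(parser.can_fetch("*", "https://example.com/index.html"))           # True
```

Note that, as the quoted passage says, this is purely advisory: `can_fetch()` only tells a cooperating crawler what the site owner asked for; nothing stops a misbehaving bot from ignoring it.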