Why does Google use “Allow” in robots.txt, when the standard seems to be “Disallow?”
4th February 2012
My answer to Why does Google use “Allow” in robots.txt, when the standard seems to be “Disallow?” on Quora
The Disallow directive tells search engines not to crawl the specified paths on your site.
The Allow directive explicitly permits them to crawl paths; Google supports it as a way to carve out exceptions within a broader Disallow rule.
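For example (the paths here are illustrative), a robots.txt file can block a whole directory while still allowing one file inside it:

```
User-agent: *
Disallow: /private/
Allow: /private/public-report.html
```

Without the Allow line, everything under /private/ would be off-limits to crawlers that honour the file.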
If you’re using the Google Webmaster tools, you probably want Google to crawl your site.
Am I misunderstanding your question?