Robots.txt: who is looking for the files you want to keep hidden

When hackers first probe a site for vulnerabilities, one item is almost always on their list: robots.txt. This is a special file for search engine crawlers that tells them which pages or files on your site they should or shouldn't crawl and index. Pay attention to the second part: it tells search engines which files …
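
For illustration, here is a minimal sketch of what such a file might look like. The User-agent and Disallow directives are standard Robots Exclusion Protocol syntax; the paths themselves are hypothetical examples of the kind of "hidden" locations a robots.txt can inadvertently advertise:

    # Applies to all crawlers
    User-agent: *
    # Each Disallow line asks crawlers to skip a path --
    # and, in doing so, tells any reader that the path exists
    Disallow: /admin/
    Disallow: /staging/
    Disallow: /backups/site-backup.zip

Anyone, not just well-behaved crawlers, can fetch this file and read the list of paths you asked to be kept out of search results.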