Googlebot won't follow the link because it's disallowed in robots.txt.
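As a rough illustration, here is a minimal sketch of standard robots.txt semantics using Python's urllib.robotparser; the rules and URLs are made up, and Googlebot's actual parser is of course more involved:

    from urllib import robotparser

    # Hypothetical robots.txt with a single Disallow rule.
    rules = "User-agent: *\nDisallow: /private/\n"

    rp = robotparser.RobotFileParser()
    rp.parse(rules.splitlines())

    # A compliant crawler checks can_fetch() before following a link.
    print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
    print(rp.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True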



Googlebot caches robots.txt for a very, very long time. If you disallow a directory, it may take months before the entire Googlebot fleet stops crawling it. Google's official stance is that you should manage disallow directives through Webmaster Tools.
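The delay comes from the cache itself: until a crawler's cached copy expires, new Disallow rules are simply never seen. A hypothetical sketch (not Google's implementation; the TTL is a made-up placeholder):

    import time

    CACHE_TTL = 24 * 3600   # hypothetical per-crawler cache lifetime, in seconds
    _robots_cache = {}      # host -> (fetched_at, parsed_rules)

    def get_robots(host, fetch_robots, parse_robots):
        """Return parsed robots.txt rules for host, refetching only after the TTL."""
        entry = _robots_cache.get(host)
        if entry and time.time() - entry[0] < CACHE_TTL:
            return entry[1]                        # stale rules keep being applied
        rules = parse_robots(fetch_robots(host))   # a new Disallow only takes effect here
        _robots_cache[host] = (time.time(), rules)
        return rules

Multiply that by a large fleet of crawlers, each with its own cache, and a robots.txt change rolls out gradually rather than instantly.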


Yes, but Google can still index the URL: robots.txt only blocks crawling, so a disallowed page can still appear in results if other pages link to it.



