Googlebot caches robots.txt, so a new Disallow rule doesn't take effect immediately: Google's documentation says the file is generally cached for up to 24 hours, and in practice it can take longer for every crawler in the fleet to pick up the change. If you need a URL blocked or removed urgently, Google's official recommendation is not to wait on robots.txt propagation but to use the removal tools in Webmaster Tools (now Search Console).
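To see why the cached copy matters, here is a minimal sketch using Python's standard-library `urllib.robotparser`: a crawler evaluates every fetch against whatever version of robots.txt it last loaded, so until the cache refreshes, a newly added Disallow simply doesn't exist from the crawler's point of view. The domain and paths are made up for illustration.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that disallows a directory. A crawler honoring
# a stale cached copy (without this rule) would keep fetching /private/
# until its cache expires and it re-reads the file.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Requests are checked against the parsed rules, not the live file.
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))
print(parser.can_fetch("Googlebot", "https://example.com/public/page.html"))
```

The first check returns `False` (blocked) and the second `True` (allowed); swap in the pre-change robots.txt and both return `True`, which is exactly the window a cached copy creates.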