author | Raymond Hettinger <python@rcn.com> | 2014-05-12 21:56:33 -0700
committer | Raymond Hettinger <python@rcn.com> | 2014-05-12 21:56:33 -0700
commit | 122541beceeccce4ef8a9bf739c727ccdcbf2f28 (patch)
tree | 108f7af0fdfa681b11fb8685f6bf1a7741582c30 /Python/errors.c
parent | 73308d6869911d353cb58b353f64797cdba9bf0a (diff)
download | cpython-git-122541beceeccce4ef8a9bf739c727ccdcbf2f28.tar.gz
Issue 21469: Mitigate risk of false positives with robotparser.
* Repair the broken link to norobots-rfc.txt.
* HTTP response codes >= 500 are now treated as a failed read rather than as a
"not found". A "not found" (404) means we can assume the entire site is allowed;
a 5xx server error tells us nothing.
* A successful read() or parse() updates the mtime (which is defined to be "the
time the robots.txt file was last fetched").
* The can_fetch() method returns False unless we've had a read() with a 2xx or
4xx response. This avoids false positives in the case where a user calls
can_fetch() before calling read(). (A sketch of this logic follows these notes.)
* I don't see any easy way to test this patch without hitting internet
resources that might change, or without using mock objects that wouldn't
provide much reassurance.
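
For illustration, here is a minimal sketch of the behavior described above. It is
not the patch itself: the real class is urllib.robotparser.RobotFileParser, and the
RobotsDecision name, the stubbed-out rule matching, and the exact placement of the
mtime update are assumptions made to keep the sketch short.

```python
import time
import urllib.error
import urllib.request


class RobotsDecision:
    """Illustrative sketch only -- the real logic lives in
    urllib.robotparser.RobotFileParser."""

    def __init__(self, url):
        self.url = url
        self.allow_all = False      # set when robots.txt is "not found" (4xx)
        self.disallow_all = False   # set when access is refused (401/403)
        self.last_checked = 0       # mtime(): when robots.txt was last fetched

    def mtime(self):
        return self.last_checked

    def modified(self):
        self.last_checked = time.time()

    def read(self):
        try:
            f = urllib.request.urlopen(self.url)
        except urllib.error.HTTPError as err:
            if err.code in (401, 403):
                self.disallow_all = True    # explicit refusal: nothing is allowed
            elif 400 <= err.code < 500:
                self.allow_all = True       # not found: the whole site is allowed
            # A 5xx falls through: the read failed, no flag is set,
            # and last_checked stays 0.
        else:
            self.modified()                 # a successful read updates the mtime
            rules = f.read().decode("utf-8")
            # ... a real parse(rules.splitlines()) would build the rule set here ...

    def can_fetch(self, useragent, url):
        if self.disallow_all:
            return False
        if self.allow_all:
            return True
        # Until robots.txt has been read (2xx) or found not to exist (4xx),
        # assume no URL is allowed -- this is the false-positive fix.
        if not self.last_checked:
            return False
        return True                         # the real code consults the parsed rules
```

With this ordering, calling can_fetch() before read() returns False rather than
True, because neither allow_all nor last_checked has been set yet.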
Diffstat (limited to 'Python/errors.c')
0 files changed, 0 insertions, 0 deletions