Re: [syndication] Re: robots.txt and rss
> sometimes the authors notice if you start handing them 403 responses,
> sometimes not. from yesterday, these hosts got 403 responses from my
> feeds:
>
> 48 cassium.procopia.com
> 47 csociety.ecn.purdue.edu
> 72 usersweb1.go-concepts.com
> 20 wwwcache2-ext.lancs.ac.uk
> 40 wwwcache3-ext.lancs.ac.uk
Then how about setting up a sticky trap: a connection that opens and simply never closes?
Or sending them a godzilla-gram of megabytes of pseudo-random XML data? Having
them run out of disk space, or crash their script, is certainly one way to get
their attention.
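Just to illustrate the godzilla-gram idea: a minimal Python sketch of a generator
that streams pseudo-random XML-shaped junk. The chunk count is bounded here for
demonstration; an actual tarpit would stream slowly and indefinitely. All names
here are my own invention, not from any real tool.

```python
import random
import string

def xml_bomb_chunks(num_chunks=4, chunk_size=1024, seed=0):
    """Yield chunks of pseudo-random, superficially valid RSS.

    A real tarpit would make num_chunks effectively unbounded and
    trickle the chunks out; this bounded version just shows the shape.
    """
    rng = random.Random(seed)
    yield '<?xml version="1.0"?>\n<rss version="2.0"><channel>\n'
    for _ in range(num_chunks):
        # Fill each item with random letters so the payload compresses poorly.
        junk = "".join(rng.choices(string.ascii_letters, k=chunk_size))
        yield f"<item><description>{junk}</description></item>\n"
    yield "</channel></rss>\n"

payload = "".join(xml_bomb_chunks())
```

Streaming it chunk by chunk (rather than building the whole string, as the demo
does) is what keeps the server's own memory use flat while the client drowns.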
This is precisely the sort of situation that's led me to suggest not relying on HTTP
errors alone when a feed goes offline or moves to a new location. Something in the
feed itself that indicates it's moving to a new URL would give anything reading the
feed a way to know where to go instead.
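As a sketch of what that might look like: an extension element in its own namespace
carrying the new feed URL. The namespace, element name, and URLs below are all
hypothetical, made up for illustration; nothing here is a published standard.

```xml
<?xml version="1.0"?>
<rss version="2.0" xmlns:relocate="http://example.com/ns/relocate">
  <channel>
    <title>Example Feed</title>
    <link>http://example.com/</link>
    <description>This feed is moving</description>
    <!-- hypothetical extension element: where the feed will live next -->
    <relocate:newLocation>http://example.com/feeds/new.xml</relocate:newLocation>
  </channel>
</rss>
```

A well-behaved aggregator that understood such an element could update its
subscription automatically, while anything else would simply ignore the
unknown namespace.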
But yes, supporting genuine HTTP error codes IS important.
-Bill Kearney