
Re: [syndication] Re: robots.txt and rss



Bravo.

It's about time.  The spec has had skipHours for well over two years.

Although I suppose, were it not for Radio relentlessly hammering feeds every hour
on the hour, there wouldn't have been a bandwidth issue to raise.

So while you get "Jeers" for failing to support the spec for lo these many
months, you get a "Cheer" for implementing it.

Are you calculating the skipHours based on local time or UTC offsets?
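(For anyone following along: skipHours is a channel-level element listing hours
during which aggregators shouldn't poll. A minimal fragment might look like the
following; note that, as I read the spec, the hour values are defined in GMT
rather than the publisher's local time, which is why the question matters.)

```xml
<skipHours>
  <!-- hours (0-23) during which readers should not poll; spec defines these in GMT -->
  <hour>0</hour>
  <hour>1</hour>
  <hour>2</hour>
</skipHours>
```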

Are you supporting the RSS-1.0 syndication module?
http://web.resource.org/rss/1.0/modules/syndication/
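(The syndication module expresses an update schedule rather than blackout hours.
A rough sketch of the three elements, with placeholder values and a made-up
example.org feed URL, would be:)

```xml
<channel rdf:about="http://example.org/rss"
    xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
  <!-- update once per hour, with the schedule anchored at updateBase -->
  <sy:updatePeriod>hourly</sy:updatePeriod>
  <sy:updateFrequency>1</sy:updateFrequency>
  <sy:updateBase>2002-01-01T00:00+00:00</sy:updateBase>
</channel>
```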

There have been some concerns raised about the calculations used for backing
off.  The concern is that the update interval might "fall out of sync" if
Radio isn't running during that particular interval.  What, if anything, are you
doing to aid users who aren't running the program 24x7 (like laptop or office
workers)?
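(One way a client could avoid drifting out of sync: instead of scheduling "last
poll + interval", snap the next poll back onto the fixed grid anchored at a base
time, so a laptop that wakes up mid-interval realigns itself. A minimal sketch,
assuming an hourly interval and a known base time; the function name is mine,
not anything Radio actually does.)

```python
import math
from datetime import datetime, timedelta, timezone

def next_aligned_poll(update_base: datetime, interval: timedelta,
                      now: datetime) -> datetime:
    """Return the next poll time on the fixed grid anchored at update_base.

    If `now` falls between grid points, round up to the next one, so a
    client that was asleep rejoins the schedule instead of drifting.
    """
    elapsed = (now - update_base) / interval  # intervals elapsed, as a float
    return update_base + interval * math.ceil(elapsed)

# Example: hourly schedule anchored at midnight; laptop wakes at 17:40,
# so the next aligned poll is at 18:00, not 18:40.
base = datetime(2002, 1, 1, tzinfo=timezone.utc)
now = datetime(2002, 11, 9, 17, 40, tzinfo=timezone.utc)
nxt = next_aligned_poll(base, timedelta(hours=1), now)
```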

-Bill Kearney

----- Original Message -----
From: "Dave Winer" <dave@userland.com>
To: <syndication@yahoogroups.com>
Sent: Saturday, November 09, 2002 12:12 PM
Subject: [syndication] Re: robots.txt and rss


> Well Bill, you may have to say for once that UserLand did good, I'm
> just about finished with the implementation of support for skipHours
> in Radio. You can see it in action in my two feeds:
>
> http://www.scripting.com/rss.xml
> http://radio.weblogs.com/0001015/rss.xml
> And the aggregator has been updated to be skipHours-aware.
> There's more to the story on today's Scripting News.
> http://scriptingnews.userland.com/backissues/2002/11/09#When:8:29:13AM