Re: [syndication] Re: Robot Discovery
In article <9pffi0+jhr1@eGroups.com>, doug@rds.com writes
>One thing to consider is large/deep sites, such as multi-user sites
>using URLs of the form www.site.com/user. If there are thousands of
>users, a global discovery file could be huge. Perhaps what's needed
>is one per user, such as www.site.com/user/discovery.xml.
The way I constructed this (and you did too), it wouldn't be a problem:
/userlist.xml        // list of the users' discovery files
/user/discovery.xml  // XML for one user
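A minimal sketch of how a robot might walk the two-level layout above. Note the userlist.xml schema here (a `<userlist>` of `<user href="...">` entries) is purely hypothetical, since no format has been defined in this thread:

```python
import xml.etree.ElementTree as ET

# Hypothetical userlist.xml -- element names and structure are an
# assumption, not part of any agreed format.
USERLIST_XML = """\
<userlist>
  <user href="/alice/discovery.xml"/>
  <user href="/bob/discovery.xml"/>
</userlist>
"""

def discovery_urls(base, userlist_xml):
    """Return the full URL of each per-user discovery file."""
    root = ET.fromstring(userlist_xml)
    return [base + u.attrib["href"] for u in root.findall("user")]

# A robot fetches /userlist.xml once, then only the per-user files
# it cares about -- so the top-level file stays small.
urls = discovery_urls("http://www.site.com", USERLIST_XML)
print(urls)
```

The point of the indirection is that the site-wide file never grows beyond one line per user, and a robot interested in a single user only ever fetches two small files.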
I'm not sure whether robots.txt has a similar idea. Is it supposed to
work like .htaccess, so that sub-directories override parent
directories?
--
Julian Bond email: julian_bond@voidstar.com
CV/Resume: http://www.voidstar.com/cv/
WebLog: http://www.voidstar.com/
HomeURL: http://www.shockwav.demon.co.uk/
M: +44 (0)77 5907 2173 T: +44 (0)192 0412 433
ICQ:33679568 tag:So many words, so little time