
Re: Robot Discovery



One thing to consider is large/deep sites, such as multi-user sites 
using URLs of the form www.site.com/user. If there are thousands of 
users, a global discovery file could be huge. Perhaps what's needed 
is one per user, such as www.site.com/user/discovery.xml.
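To illustrate the per-user idea, here is a minimal sketch of how a robot might derive the discovery-file URL from any page under a user's subtree. The function name and the discovery.xml convention are assumptions taken from the example above, not an agreed standard:

```python
from urllib.parse import urlsplit, urlunsplit

def per_user_discovery_url(page_url):
    """Map a page under www.site.com/user/... to that user's
    hypothetical discovery file, www.site.com/user/discovery.xml.
    Assumes the first path segment names the user."""
    parts = urlsplit(page_url)
    segments = [s for s in parts.path.split("/") if s]
    if not segments:
        # No user segment: fall back to a site-wide file.
        return urlunsplit((parts.scheme, parts.netloc,
                           "/discovery.xml", "", ""))
    return urlunsplit((parts.scheme, parts.netloc,
                       "/%s/discovery.xml" % segments[0], "", ""))
```

With this scheme a robot fetches one small file per user it actually visits, instead of one enormous site-wide file.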

Robots.txt works globally because (a) it's exclusionary, and (b) its 
Disallow lines are matched as path prefixes, so one rule can cover an 
entire subtree. I think discovery is a little different: a file that 
has to enumerate content grows with the site.
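For contrast, a minimal robots.txt shows why one global file stays small even on a huge multi-user site. The paths here are illustrative:

```
# One prefix rule excludes every URL under /tmp/ for all robots,
# no matter how many users or pages exist beneath it.
User-agent: *
Disallow: /tmp/
```

A discovery file has no equivalent shorthand; it must name what exists, so its size tracks the number of users.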

    ...doug

Doug Kaye
doug@Rds.com