httpd-users mailing list archives

From Mark Montague <>
Subject Re: [users@httpd] Allowing Robots.txt
Date Mon, 10 Oct 2011 19:25:40 GMT
On October 10, 2011 15:04, Matt <> wrote:
> Is there a way to do something like this?
> <Files / !=/robots.txt !=/downloads/>
> AuthName "Restricted Area"
> AuthType Basic
> AuthUserFile /var/www/.htpasswd
> AuthGroupFile /dev/null
> require valid-user
> </Files>

Yes, but not using that syntax.

> I basically want to require a password everywhere but on robots.txt
> and the downloads folder.  I want to still require encryption and
> limit access by subnet on the downloads though just no password.
> robots.txt needs to be open to the world.

# Require user authentication for everything:
AuthName "Restricted Area"
AuthType Basic
AuthUserFile /var/www/.htpasswd
AuthGroupFile /dev/null
require valid-user

# Except, do not require user authentication for robots.txt
# Allow robots.txt to be accessed from everywhere:
<Files robots.txt>
     Order allow,deny
     Allow from all
     Satisfy any
</Files>
# Note that the above actually allows access to ANY file named
# robots.txt anywhere beneath the document root directory.
# If you really need to limit the exception to /robots.txt alone,
# that's possible, but it requires a more sophisticated solution.
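One such approach, sketched below on the assumption that this lives in the main server configuration (not .htaccess), is to match on the request URL instead of the file name. <LocationMatch> compares against the URL path, so only the top-level /robots.txt is exempted and files named robots.txt in subdirectories stay password-protected:

```
# Sketch: exempt only the URL /robots.txt itself.
# <LocationMatch> tests the request URL path, not the filesystem,
# so /downloads/robots.txt etc. remain behind authentication.
<LocationMatch "^/robots\.txt$">
    Order allow,deny
    Allow from all
    Satisfy any
</LocationMatch>
```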

# Also, do not require user authentication for the /downloads
# directory, but limit access to it by subnet:
<Directory /downloads>
     Order allow,deny
     Allow from
     Satisfy any
</Directory>
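To also keep the "require encryption" part of the request, mod_ssl's SSLRequireSSL directive can be added inside the same section. A sketch follows; the filesystem path and the subnet are placeholders, not values from the original message (note that <Directory> takes a filesystem path, so /downloads would normally be spelled out relative to the document root):

```
# Hypothetical example -- substitute your own path and subnet.
<Directory /var/www/html/downloads>
    # Refuse any request that did not arrive over HTTPS:
    SSLRequireSSL
    # No password required, but only this subnet may connect:
    Order allow,deny
    Allow from 192.0.2.0/24
    Satisfy any
</Directory>
```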

   Mark Montague

The official User-To-User support forum of the Apache HTTP Server Project.