httpd-users mailing list archives

From Matt <>
Subject Re: [users@httpd] Allowing Robots.txt
Date Mon, 10 Oct 2011 19:04:48 GMT
Is there a way to do something like this?

<Files / !=/robots.txt !=/downloads/>
AuthName "Restricted Area"
AuthType Basic
AuthUserFile /var/www/.htpasswd
AuthGroupFile /dev/null
Require valid-user
</Files>

I basically want to require a password everywhere except robots.txt
and the downloads folder.  For downloads I still want to require
encryption and limit access by subnet, just without a password.
robots.txt needs to be open to the world.  I'm confusing even myself,
but learning as I go.
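For what it's worth, here is a sketch of how that policy might look using the `Satisfy any` approach from the earlier reply, assuming Apache 2.2 syntax, a DocumentRoot of /var/www, and a placeholder subnet (192.168.1.0/24) that would need to be replaced with the real one:

```
# Password-protect everything under the document root (assumed /var/www)
<Directory /var/www>
    AuthName "Restricted Area"
    AuthType Basic
    AuthUserFile /var/www/.htpasswd
    AuthGroupFile /dev/null
    Require valid-user
</Directory>

# robots.txt: open to the world, no password
<Files robots.txt>
    Order allow,deny
    Allow from all
    Satisfy any
</Files>

# downloads: no password, but still require SSL and restrict by subnet
# (192.168.1.0/24 is a placeholder subnet)
<Directory /var/www/downloads>
    SSLRequireSSL
    Order deny,allow
    Deny from all
    Allow from 192.168.1.0/24
    Satisfy any
</Directory>
```

With `Satisfy any`, a request is allowed if it passes either the host-based Allow/Deny rules or the password check, so the Allow rules above effectively waive the password for those paths while the rest of the tree still requires one.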

>> One other thing though.  Suppose I want to exempt certain directories
>> from requiring a password but still leave all remaining restrictions.
>> I have this there:
>> AuthName "Restricted Area"
>> AuthType Basic
>> AuthUserFile /var/www/.htpasswd
>> AuthGroupFile /dev/null
>> require valid-user
>> Is there a way to exempt say /downloads/ directory?
> <Directory /path/to/downloads>
>    Allow from all
>    Satisfy any
> </Directory>
> This will make a username and password optional rather than required for the
> downloads directory and all files and subdirectories within it.  Assuming
> that you have not specified "Satisfy any" at any higher level, all other
> directories at and below the scope of the "AuthType Basic" directive will
> require a username and password.
> Since this is the same answer as in my previous response, I suspect I am not
> understanding your question correctly... can you elaborate?

The official User-To-User support forum of the Apache HTTP Server Project.