httpd-users mailing list archives

From Matt <>
Subject Re: [users@httpd] Allowing Robots.txt
Date Mon, 10 Oct 2011 17:35:09 GMT
> On October 10, 2011 12:45 , Matt <> wrote:
>> I want to restrict http access to the server to certain subnets,
>> require SSL and a username and password.  The exception is the
>> robots.txt file.  I want to allow anyone access to that.  How do I
>> tell it not to enforce a password or SSL only on robots.txt?
> Use the "Satisfy any" directive so that httpd will accept EITHER the
> host-based access control ("Allow from all") OR the user authentication
> ("Require valid-user") instead of requiring both of them as it does by
> default ("Satisfy all").  See the example at

That worked, thanks.  I also had to add "RewriteCond %{REQUEST_URI}
!=/robots.txt" to exempt it from SSL.
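For the archives, the whole arrangement ended up roughly like the sketch below. The Files block uses the 2.2-style Order/Allow syntax from the earlier reply; the HTTPS redirect rule is illustrative, since the original RewriteRule wasn't quoted in this thread:

    # Let anyone fetch robots.txt, skipping the usual auth requirement
    <Files "robots.txt">
        Order allow,deny
        Allow from all
        Satisfy any
    </Files>

    # Force HTTPS for everything except robots.txt
    RewriteEngine On
    RewriteCond %{HTTPS} !=on
    RewriteCond %{REQUEST_URI} !=/robots.txt
    RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R,L]

With "Satisfy any", the robots.txt request passes as soon as the host-based check ("Allow from all") succeeds, so no password is ever prompted for it.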

One other thing, though.  Suppose I want to exempt certain directories
from requiring a password while keeping all the other restrictions in
place.  I currently have this:

AuthName "Restricted Area"
AuthType Basic
AuthUserFile /var/www/.htpasswd
AuthGroupFile /dev/null
Require valid-user

Is there a way to exempt, say, the /downloads/ directory?
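Based on the earlier answer, I'm guessing a per-directory "Satisfy any" block would do it, something like the following (the path and subnet are just examples for my setup):

    <Directory "/var/www/html/downloads">
        # Keep the subnet restriction, but with "Satisfy any" clients
        # from the allowed subnet get in without a password, while
        # everyone else can still authenticate with valid-user.
        Order deny,allow
        Deny from all
        Allow from 192.168.1.0/24
        Satisfy any
    </Directory>

Is that the right approach, or is there a cleaner way?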

The official User-To-User support forum of the Apache HTTP Server Project.