nutch-dev mailing list archives

From "Sebastian Nagel (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (NUTCH-1995) Add support for wildcard to http.robot.rules.whitelist
Date Sun, 17 May 2015 20:47:00 GMT

    [ https://issues.apache.org/jira/browse/NUTCH-1995?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14547337#comment-14547337 ]

Sebastian Nagel commented on NUTCH-1995:
----------------------------------------

Hi Giuseppe, wildcards are ok if it is about all hosts of a domain or the last byte of an
IP address. But it should not be possible to bypass the robots.txt by setting the whitelist
to {{*}}, cf. the initial discussion about NUTCH-1927 [[1|http://mail-archives.apache.org/mod_mbox/nutch-dev/201501.mbox/%3CD0EF8FF6.1DF31A%25chris.a.mattmann@jpl.nasa.gov%3E]].
Users should be forced to include a domain name in the pattern; even patterns such as {{*.com}}
and {{*.co.uk}} are too unspecific. But comments are welcome regarding this point!
Instead of iterating over a list of regular expressions, [o.a.n.util.SuffixStringMatcher|https://nutch.apache.org/apidocs/apidocs-1.10/org/apache/nutch/util/SuffixStringMatcher.html]
could be an efficient alternative.
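Matching a host against an entry like {{*.sample.com}} reduces to a suffix check, which is why a suffix matcher is attractive compared with a list of regexes. A minimal sketch of that idea in plain Java (the class and method names are illustrative, not the Nutch API; the real SuffixStringMatcher uses a trie rather than a linear scan):

```java
import java.util.ArrayList;
import java.util.List;

public class WhitelistMatcher {
    private final List<String> suffixes = new ArrayList<>();

    // A pattern like "*.sample.com" becomes the suffix ".sample.com";
    // an exact entry like "www.sample.com" or "127.0.0.1" is kept as-is.
    public WhitelistMatcher(List<String> patterns) {
        for (String p : patterns) {
            suffixes.add(p.startsWith("*") ? p.substring(1) : p);
        }
    }

    public boolean isWhitelisted(String host) {
        for (String s : suffixes) {
            if (s.startsWith(".")) {
                // wildcard entry: match any host under the domain
                if (host.endsWith(s)) return true;
            } else if (host.equals(s)) {
                // exact hostname or IP address entry
                return true;
            }
        }
        return false;
    }
}
```

Note that with these semantics {{*.sample.com}} matches {{www.sample.com}} but not the bare {{sample.com}}, so an operator who wants both would list both patterns.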

> Add support for wildcard to http.robot.rules.whitelist
> ------------------------------------------------------
>
>                 Key: NUTCH-1995
>                 URL: https://issues.apache.org/jira/browse/NUTCH-1995
>             Project: Nutch
>          Issue Type: Improvement
>          Components: robots
>    Affects Versions: 1.10
>            Reporter: Giuseppe Totaro
>            Assignee: Chris A. Mattmann
>              Labels: memex
>             Fix For: 1.11
>
>         Attachments: NUTCH-1995.patch
>
>
> The {{http.robot.rules.whitelist}} ([NUTCH-1927|https://issues.apache.org/jira/browse/NUTCH-1927])
configuration parameter allows specifying a comma-separated list of hostnames or IP addresses
for which robots.txt parsing is ignored.
> Adding support for wildcards in {{http.robot.rules.whitelist}} could be very useful and
would simplify the configuration, for example when many hostnames/addresses need to be
whitelisted. Here is an example:
> {noformat}
> <property>
>   <name>http.robot.rules.whitelist</name>
>   <value>*.sample.com</value>
>   <description>Comma separated list of hostnames or IP addresses to ignore 
>   robot rules parsing for. Use with care and only if you are explicitly
>   allowed by the site owner to ignore the site's robots.txt!
>   </description>
> </property>
> {noformat}
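Following the concern raised in the comment above, overly broad patterns could be rejected when the property is parsed. A hypothetical validation sketch (the class name is illustrative, and the hard-coded suffix set stands in for a real public-suffix list check):

```java
import java.util.Set;

public class WhitelistPatternValidator {

    // Illustrative, incomplete set of public suffixes that are too
    // unspecific to whitelist on their own; a real implementation
    // would consult a full public-suffix list.
    private static final Set<String> TOO_BROAD =
        Set.of("com", "org", "net", "co.uk");

    // A pattern must contain a concrete domain name: a bare "*" would
    // bypass robots.txt everywhere, and "*.com"/"*.co.uk" are too broad.
    public static boolean isValid(String pattern) {
        if (pattern.equals("*")) return false;
        String rest = pattern.startsWith("*.") ? pattern.substring(2) : pattern;
        // only a single leading wildcard is allowed
        if (rest.isEmpty() || rest.contains("*")) return false;
        return !TOO_BROAD.contains(rest);
    }
}
```

With this check, {{*.sample.com}} and exact entries pass, while {{*}}, {{*.com}}, and {{*.co.uk}} are refused.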



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
