nutch-dev mailing list archives

From "Sebastian Nagel (JIRA)" <>
Subject [jira] [Commented] (NUTCH-1927) Create a whitelist of IPs/hostnames to allow skipping of RobotRules parsing
Date Mon, 13 Apr 2015 14:03:12 GMT


Sebastian Nagel commented on NUTCH-1927:

* http.robot.rules.whitelist should be empty by default :)
* the description says "hostnames or IP addresses" - is IP address whitelisting actually supported?
* instead of repeatedly splitting the whitelisted hosts at ',', use conf.getStrings(...) to fill the whitelist once when the configuration is set
* also, the whitelist is a set and should be stored as one, so that the membership test in isWhiteListed() is a constant-time lookup instead of an iteration over a list
* Why is it necessary for the Fetcher to create a new WhiteListRobotRules object for every URL? Wouldn't it be simpler (and more efficient) to use the existing cache in RobotRulesParser and just put in a reference to a singleton whitelist rules object if the host is an element of the whitelist? (see the sketch after this list)
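
To make the last three points concrete, here is a minimal sketch of what I have in mind. It assumes Hadoop's Configuration and crawler-commons' SimpleRobotRules; the class name, the CACHE field, and WHITELISTED_RULES are illustrative, not identifiers from the patch:

{code:java}
import java.net.URL;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Hashtable;
import java.util.Set;

import org.apache.hadoop.conf.Configuration;

import crawlercommons.robots.BaseRobotRules;
import crawlercommons.robots.SimpleRobotRules;

public class WhitelistSketch {

  /** Shared cache, analogous to the existing CACHE in RobotRulesParser. */
  private static final Hashtable<String, BaseRobotRules> CACHE =
      new Hashtable<String, BaseRobotRules>();

  /** One singleton "allow all" rules object shared by every whitelisted host. */
  private static final SimpleRobotRules WHITELISTED_RULES =
      new SimpleRobotRules(SimpleRobotRules.RobotRulesMode.ALLOW_ALL);

  private Set<String> whitelist = new HashSet<String>();

  public void setConf(Configuration conf) {
    // getStrings() splits the comma-separated property once, up front;
    // a HashSet makes isWhiteListed() an O(1) lookup.
    whitelist = new HashSet<String>(Arrays.asList(
        conf.getStrings("http.robot.rules.whitelist", new String[0])));
  }

  public boolean isWhiteListed(URL url) {
    return whitelist.contains(url.getHost());
  }

  public BaseRobotRules getRobotRulesSet(URL url) {
    String cacheKey = url.getProtocol() + ":" + url.getHost();
    BaseRobotRules rules = CACHE.get(cacheKey);
    if (rules == null && isWhiteListed(url)) {
      // Put a reference to the singleton into the existing cache instead of
      // constructing a new whitelist-rules object for every fetched URL.
      CACHE.put(cacheKey, WHITELISTED_RULES);
      rules = WHITELISTED_RULES;
    }
    // (non-whitelisted hosts would fall through to normal robots.txt parsing)
    return rules;
  }
}
{code}

This way a whitelisted host costs a single cache entry, and all fetches against it share one rules object.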

> Create a whitelist of IPs/hostnames to allow skipping of RobotRules parsing
> ---------------------------------------------------------------------------
>                 Key: NUTCH-1927
>                 URL:
>             Project: Nutch
>          Issue Type: New Feature
>          Components: fetcher
>            Reporter: Chris A. Mattmann
>            Assignee: Chris A. Mattmann
>              Labels: available, patch
>             Fix For: 1.10
>         Attachments: NUTCH-1927.Mattmann.041115.patch.txt, NUTCH-1927.Mattmann.041215.patch.txt
> Based on discussion on the dev list, to support some valid security research use cases for Nutch (DDoS, DNS, and other testing), I am going to create a patch that allows a whitelist:
> {code:xml}
> <property>
>   <name>robot.rules.whitelist</name>
>   <value>,,</value>
>   <description>Comma separated list of hostnames or IP addresses to ignore robot rules parsing for.
>   </description>
> </property>
> {code}
