nutch-dev mailing list archives

From "Markus Jelsma (JIRA)" <>
Subject [jira] [Updated] (NUTCH-1052) Multiple deletes of the same URL using SolrClean
Date Tue, 06 Sep 2011 10:34:09 GMT


Markus Jelsma updated NUTCH-1052:

    Attachment: NUTCH-1052-1.4-1.patch

Here's a patch adding a -delete switch to the solrindex command. It changes IndexerMapReduce
to output <key, null> for records with a DB_GONE status.

The null NutchDocument is caught in IndexerOutputFormat, which then calls writer.delete(key).

I am not sure whether this is the correct approach, but it works nicely. I also added the delete
method's signature to the NutchIndexWriter interface.
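The flow described above could be sketched roughly as follows. These are hypothetical, heavily simplified stand-ins for the real Nutch types (the actual IndexerOutputFormat works with Hadoop Text keys and NutchDocument values), meant only to illustrate the null-means-delete convention:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative stand-in for the NutchIndexWriter interface with the new delete method.
interface IndexWriterSketch {
    void write(String key, String doc);
    void delete(String key); // the method added to the interface by the patch
}

// Records calls so the behavior can be observed; a real writer would talk to Solr.
class RecordingWriter implements IndexWriterSketch {
    final List<String> added = new ArrayList<>();
    final List<String> deleted = new ArrayList<>();
    public void write(String key, String doc) { added.add(key); }
    public void delete(String key) { deleted.add(key); }
}

// Mimics the output format's record writer: a null document signals a delete.
class OutputFormatSketch {
    static void emit(IndexWriterSketch writer, String key, String doc) {
        if (doc == null) {
            writer.delete(key); // DB_GONE record: the reducer emitted <key, null>
        } else {
            writer.write(key, doc);
        }
    }
}
```

A gone URL then ends up on the writer's delete path while normal records are indexed as before.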

Please comment. If this approach is deemed appropriate, I'll change the issue's title to reflect
it.

> Multiple deletes of the same URL using SolrClean
> ------------------------------------------------
>                 Key: NUTCH-1052
>                 URL:
>             Project: Nutch
>          Issue Type: Improvement
>          Components: indexer
>    Affects Versions: 1.3, 1.4
>            Reporter: Tim Pease
>            Priority: Minor
>             Fix For: 1.4, 2.0
>         Attachments: NUTCH-1052-1.4-1.patch
> The SolrClean class does not keep track of purged URLs; it only checks whether the URL status
> is "db_gone". When run multiple times, the same list of URLs will be deleted from Solr. For
> small, stable crawl databases this is not a problem, but for larger crawls it could be an
> issue: SolrClean will become an expensive operation.
> One solution is to add a "purged" flag in the CrawlDatum metadata. SolrClean would then
> check this flag in addition to the "db_gone" status before adding the URL to the delete list.
> Another solution is to add a new state to the status field "db_gone_and_purged".
> Either way, the crawl DB will need to be updated after the Solr delete has successfully
> completed.
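The first proposal in the quoted description (a "purged" flag in CrawlDatum metadata) could look roughly like the sketch below. All names here are illustrative assumptions, not the actual Nutch CrawlDatum or SolrClean API:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical, simplified stand-in for a CrawlDatum record.
class DatumSketch {
    static final byte STATUS_DB_GONE = 3; // illustrative constant
    byte status;
    Map<String, String> metaData = new HashMap<>();
    DatumSketch(byte status) { this.status = status; }
}

class SolrCleanSketch {
    // Delete only records that are gone AND not yet purged, then mark them
    // purged; the updated metadata would be written back to the crawl DB.
    static boolean shouldDelete(DatumSketch datum) {
        if (datum.status != DatumSketch.STATUS_DB_GONE) return false;
        if ("true".equals(datum.metaData.get("_purged_"))) return false;
        datum.metaData.put("_purged_", "true");
        return true;
    }
}
```

On a second run over the same record the flag is already set, so the URL is not sent to Solr for deletion again, which is the behavior the issue asks for.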

This message is automatically generated by JIRA.
