lucene-solr-user mailing list archives

From Brandon Waterloo <>
Subject Problems indexing very large set of documents
Date Mon, 04 Apr 2011 18:00:53 GMT
Hey everybody,

I've been running into some issues indexing a very large set of documents.  There are about
4,000 PDF files, ranging in size from 10KB to 160MB, so this is obviously a big task for Solr.
 I have a PHP script that iterates over the directory and uses PHP cURL to send each file to
Solr for indexing.  For now, commit is set to false to speed up the indexing, and I'm assuming
that Solr should be auto-committing as necessary.  I'm using the default solrconfig.xml file
included in apache-solr-1.4.1\example\solr\conf.  Once all the documents have been processed,
the PHP script sends Solr a commit.
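In case it helps, here's roughly what each iteration of my loop is doing, as a minimal Python sketch rather than my actual PHP (the core URL, the document ID, and the helper name are simplified placeholders, not my real code):

```python
from urllib.parse import urlencode

def extract_url(base, doc_id, commit=False):
    # Build an ExtractingRequestHandler URL; literal.id supplies the unique key,
    # and commit=false defers committing until the batch is done.
    params = urlencode({"literal.id": doc_id, "commit": str(commit).lower()})
    return f"{base}/update/extract?{params}"

# Each PDF is then POSTed to that URL with Content-Type: application/pdf
# (in my case via PHP cURL against a running Solr instance).
print(extract_url("http://localhost:8983/solr", "doc1"))
```

After the last file, the script issues one final request with commit=true.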

The main problem is that after a few thousand documents (around 2000 last time I tried), nearly
every document begins causing Java exceptions in Solr:

Apr 4, 2011 1:18:01 PM org.apache.solr.common.SolrException log
SEVERE: org.apache.solr.common.SolrException: org.apache.tika.exception.TikaException: TIKA-198:
Illegal IOException from org.apache.tika.parser.pdf.PDFParser@11d329d
        at org.apache.solr.handler.extraction.ExtractingDocumentLoader.load(
        at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(
        at org.apache.solr.handler.RequestHandlerBase.handleRequest(
        at org.apache.solr.core.RequestHandlers$LazyRequestHandlerWrapper.handleRequest(
        at org.apache.solr.core.SolrCore.execute(
        at org.apache.solr.servlet.SolrDispatchFilter.execute(
        at org.apache.solr.servlet.SolrDispatchFilter.doFilter(
        at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(
        at org.mortbay.jetty.servlet.ServletHandler.handle(
        at org.mortbay.jetty.servlet.SessionHandler.handle(
        at org.mortbay.jetty.handler.ContextHandler.handle(
        at org.mortbay.jetty.webapp.WebAppContext.handle(
        at org.mortbay.jetty.handler.ContextHandlerCollection.handle(
        at org.mortbay.jetty.handler.HandlerCollection.handle(
        at org.mortbay.jetty.handler.HandlerWrapper.handle(
        at org.mortbay.jetty.Server.handle(
        at org.mortbay.jetty.HttpConnection.handleRequest(
        at org.mortbay.jetty.HttpConnection$RequestHandler.content(
        at org.mortbay.jetty.HttpParser.parseNext(
        at org.mortbay.jetty.HttpParser.parseAvailable(
        at org.mortbay.jetty.HttpConnection.handle(
        at org.mortbay.thread.BoundedThreadPool$
Caused by: org.apache.tika.exception.TikaException: TIKA-198: Illegal IOException from org.apache.tika.parser.pdf.PDFParser@11d329d
        at org.apache.tika.parser.CompositeParser.parse(
        at org.apache.tika.parser.AutoDetectParser.parse(
        at org.apache.solr.handler.extraction.ExtractingDocumentLoader.load(
        ... 23 more
Caused by: expected='endobj' firstReadAttempt='' secondReadAttempt=''
        at org.pdfbox.pdfparser.PDFParser.parseObject(
        at org.pdfbox.pdfparser.PDFParser.parse(
        at org.pdfbox.pdmodel.PDDocument.load(
        at org.pdfbox.pdmodel.PDDocument.load(
        at org.apache.tika.parser.pdf.PDFParser.parse(
        at org.apache.tika.parser.CompositeParser.parse(
        ... 25 more

As far as I know there's nothing special about these documents, so I'm wondering if Solr isn't
auto-committing properly.  What would be appropriate settings in solrconfig.xml for this particular
application?  I'd like it to auto-commit as soon as it needs to, but no more often than that,
for the sake of efficiency.  Indexing 4,000 documents takes long enough as it is, and there's
no reason to make it take longer.  Thanks for your help!
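For reference, the section I assume I'd need to tune is the autoCommit block inside the
updateHandler in solrconfig.xml, something like the following (the thresholds here are just
guesses on my part, not values I've tested):

```xml
<updateHandler class="solr.DirectUpdateHandler2">
  <!-- commit automatically after this many pending docs, or after this many ms -->
  <autoCommit>
    <maxDocs>1000</maxDocs>
    <maxTime>60000</maxTime>
  </autoCommit>
</updateHandler>
```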

~Brandon Waterloo
