lucene-solr-user mailing list archives

From Tim Mahy <>
Subject RE: Delete's increase while adding new documents
Date Wed, 07 May 2008 09:15:39 GMT
Hi all,

it seems that we just post too much, too fast, to Solr.

When we post 100 documents (separate calls) and perform a commit, everything goes well, but
as soon as we start sending thousands of documents and then use autocommit or send the commit
message, a lot of documents are missing from the index although they were sent to Solr ...
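For reference, the autocommit thresholds live in solrconfig.xml; a sketch with illustrative values (these numbers are assumptions, not the poster's configuration):

```xml
<!-- solrconfig.xml (illustrative values, not the poster's settings) -->
<updateHandler class="solr.DirectUpdateHandler2">
  <autoCommit>
    <maxDocs>10000</maxDocs>  <!-- commit after this many pending documents -->
    <maxTime>60000</maxTime>  <!-- or after this many milliseconds -->
  </autoCommit>
</updateHandler>
```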

Has anyone experience with how many documents you can import, and at what speed, so that Solr
stays stable?

We use Tomcat 5.5 and our Java memory limit is 2 GB.
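On the indexing-rate question: the usual remedy is to group documents into fixed-size batches and send one commit at the end, rather than one HTTP call and commit per document. A minimal sketch of the batching side, assuming Solr's standard XML update format; the field names, batch size, and URL in the comments are illustrative, not from this thread:

```python
from xml.sax.saxutils import escape

def add_xml(docs):
    """Build one Solr <add> message for a batch of documents.

    Each doc is a dict of field name -> value; values are XML-escaped.
    """
    parts = ["<add>"]
    for doc in docs:
        parts.append("<doc>")
        for field, value in doc.items():
            parts.append('<field name="%s">%s</field>' % (field, escape(str(value))))
        parts.append("</doc>")
    parts.append("</add>")
    return "".join(parts)

def batches(docs, size):
    """Yield fixed-size slices so each POST stays bounded."""
    for i in range(0, len(docs), size):
        yield docs[i:i + size]

docs = [{"id": str(n), "title": "doc %d" % n} for n in range(250)]
payloads = [add_xml(batch) for batch in batches(docs, 100)]
# POST each payload to e.g. http://localhost:8983/solr/update, then send
# a single "<commit/>" message at the end instead of committing per batch.
print(len(payloads))  # 3
```

Bounding each POST to a fixed number of documents also makes the feed rate easy to throttle if Solr falls behind.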

From: Mike Klaas []
Sent: Tuesday, 6 May 2008 20:17
Subject: Re: Delete's increase while adding new documents

On 6-May-08, at 4:56 AM, Tim Mahy wrote:

> Hi all,
> it seems that we get errors during the auto-commit:
> java.io.FileNotFoundException: /opt/solr/upload/nl/archive/data/
> index/_4x.fnm (No such file or directory)
>        at java.io.RandomAccessFile.open(Native Method)
>        at java.io.RandomAccessFile.<init>(RandomAccessFile.java)
>        at org.apache.lucene.store.FSDirectory$FSIndexInput
> $Descriptor.<init>(FSDirectory.java)
>        at org.apache.lucene.store.FSDirectory
> $FSIndexInput.<init>(FSDirectory.java)
> The _4x.fnm file is not on the file system. When we switch from
> autocommit to manual commits through xml messages we get the same
> kind of errors.
> Any idea what could be wrong in our configuration to cause these
> exceptions?

I have only heard of that error appearing in two cases.  Either the
index is corrupt, or something else deleted the file.  Are you sure
that there is only one Solr instance that accesses the directory, and
that nothing else ever touches it?

Can you reproduce the deletion issue with a small number of documents
(something that could be tested by one of us)?
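One way to set up the reproduction Mike asks for is to index a known set of IDs, commit, and then diff what the index returns against what was sent. A sketch of the comparison step only; `sent` and `indexed` stand in for the two sides, and the query URL in the comment is illustrative, not from this thread:

```python
def missing_ids(sent_ids, indexed_ids):
    """Return the IDs that were posted but never showed up in the index."""
    return sorted(set(sent_ids) - set(indexed_ids))

# After committing, fetch the indexed IDs with a query along the lines of
#   http://localhost:8983/solr/select?q=*:*&fl=id&rows=100000
sent = [str(n) for n in range(1000)]
indexed = [str(n) for n in range(1000) if n % 97 != 0]  # simulated loss
print(len(missing_ids(sent, indexed)))  # 11
```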


