lucene-java-user mailing list archives

From "Daniel Taurat" <>
Subject Out of memory in Lucene 1.4.1 when re-indexing a large number of documents
Date Thu, 09 Sep 2004 17:47:36 GMT
I am facing an out-of-memory problem using Lucene 1.4.1.
I am re-indexing a pretty large number (about 30,000) of documents.
I identify old instances by checking for a unique ID field, delete those
with indexReader.delete(), and add the new document version.
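
For reference, here is roughly the cycle I run per document. The field
names "uid" and "contents" are placeholders for our actual fields:

    import java.io.IOException;
    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;
    import org.apache.lucene.index.IndexReader;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.index.Term;

    // Delete any old instance by its unique ID, then add the new version.
    // "uid" and "contents" are placeholder field names, not our real ones.
    void reindex(String indexPath, String uid, String text) throws IOException {
        IndexReader reader = IndexReader.open(indexPath);
        reader.delete(new Term("uid", uid));   // mark old instance(s) as deleted
        reader.close();                        // closed directly after delete

        Document doc = new Document();
        doc.add(Field.Keyword("uid", uid));    // unique ID: stored, not tokenized
        doc.add(Field.Text("contents", text)); // document body

        // false = append to the existing index rather than create a new one
        IndexWriter writer = new IndexWriter(indexPath, new StandardAnalyzer(), false);
        writer.addDocument(doc);
        writer.close();
    }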

A heap dump shows a huge number of HashMaps holding
SegmentTermEnum objects (256,891 of them).

The IndexReader is closed directly after delete(term)...

It seems to me that this did not happen with version 1.2 (same number of
documents and all...).
Does anyone have an idea how I end up with these "hanging" objects, or
what to do in order to avoid them?

