lucene-java-user mailing list archives

From: Sadaf <Sadaf.As...@tpsgc-pwgsc.gc.ca>
Subject: OOM error
Date: Fri, 23 Dec 2016 17:14:50 GMT
Hi, 

This is the index we are using:
- Number of fields: 355
- Number of documents: 225,000
- Number of terms: 5,522,000
- Index size: around 800 MB
- TermInfos index divisor: 1
- Index format: Lucene 4.0
We are getting a Java OutOfMemoryError during searches. We are using a heap
size of 1 GB and are not able to increase it.
Looking at the heap dump, the two main suspects are FieldCacheImpl and
FieldCacheImpl$BinaryDocValuesImpl.
Our searches involve a lot of sorting, and the sorting is done on string
fields. I don't have much experience with OutOfMemory errors or with
Lucene. What should I try?
- Should I try adjusting the swappiness on the server (currently set to 60)?
- We are creating a new IndexSearcher for each search. Should I just have
  one? If I do that, will my results be updated as new documents are added
  to the index? (See the SearcherManager sketch after this list.)
- If you think that sorting on StringFields is the problem, can you give me
  some pointers on what the usual suspects are? (A doc-values sketch also
  follows the list.)
- Should I insist that we need more heap?
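
On the second point, is something like SearcherManager what we should be
using instead of one IndexSearcher per request? A rough sketch of what I
have in mind, assuming our existing IndexWriter (called "writer" here) and
that "query" and "sort" stand for whatever we already build per request:

    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.search.IndexSearcher;
    import org.apache.lucene.search.SearcherManager;
    import org.apache.lucene.search.TopDocs;

    // One SearcherManager shared by all search threads, built over our
    // IndexWriter so that maybeRefresh() can see newly added documents.
    SearcherManager manager = new SearcherManager(writer, true, null);

    // Per search request: borrow a searcher, use it, then give it back.
    IndexSearcher searcher = manager.acquire();
    try {
        TopDocs hits = searcher.search(query, 10, sort);
        // ... process hits ...
    } finally {
        manager.release(searcher);
    }

    // After indexing/commits (or periodically), pick up new documents.
    manager.maybeRefresh();

My understanding is that acquire() returns a point-in-time searcher and
maybeRefresh() swaps in a new one when the index has changed, so results
would include newly added documents after a refresh. Is that right?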
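
On the third point, would indexing the sort fields with doc values avoid
the FieldCache un-inversion that seems to be filling the heap? Something
like this at index and search time (the field name "category" and the
variable "categoryValue" are just examples standing in for one of our
actual sort fields):

    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;
    import org.apache.lucene.document.SortedDocValuesField;
    import org.apache.lucene.document.StringField;
    import org.apache.lucene.search.Sort;
    import org.apache.lucene.search.SortField;
    import org.apache.lucene.util.BytesRef;

    // Index time: keep the searchable StringField, and add a
    // SortedDocValuesField under the same name so sorting can use
    // doc values instead of un-inverting the terms onto the heap.
    Document doc = new Document();
    doc.add(new StringField("category", categoryValue, Field.Store.NO));
    doc.add(new SortedDocValuesField("category", new BytesRef(categoryValue)));

    // Search time: sort on the field as before.
    Sort sort = new Sort(new SortField("category", SortField.Type.STRING));

I realize this would mean reindexing; is that the usual fix, or is there
something that works on the existing index?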
Thanks. (I will not be checking my email until the new year; wishing
everyone here happy holidays.)
Sadaf





