lucene-solr-user mailing list archives

From "Rohit" <ro...@in-rev.com>
Subject RE: Out of memory
Date Wed, 14 Sep 2011 06:10:03 GMT
Thanks Jaeger.

Actually I am storing Twitter streaming data into the core, so the indexing
rate is about 12 tweets (docs)/second. The same Solr instance contains 3 other
cores, but those cores are not very heavy. The Twitter core has now become
very large (77,516,851 docs) and is taking a long time to query (mostly facet
queries on date and string fields).

After about 18-20 hours Solr goes out of memory, and the thread dump doesn't
show anything. How can I improve this besides adding more RAM to the system?
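For a sense of scale, here is a rough back-of-envelope sketch of why faceting on string fields over an index this size is memory-hungry. The per-document and per-term costs below are assumptions (Lucene's FieldCache keeps roughly one 32-bit ordinal per document for a single-valued field, plus the unique term bytes); the unique-term count and average term length are hypothetical:

```python
# Rough estimate of FieldCache memory for faceting on ONE single-valued
# string field across the whole index. ORD_BYTES, UNIQUE_TERMS and
# AVG_TERM_BYTES are assumptions, not measured values.
NUM_DOCS = 77_516_851        # numDocs reported by the searcher stats
ORD_BYTES = 4                # assumed 4-byte ordinal per document
UNIQUE_TERMS = 1_000_000     # hypothetical unique values in the field
AVG_TERM_BYTES = 20          # hypothetical average term length

ord_array = NUM_DOCS * ORD_BYTES          # one ordinal per doc
term_bytes = UNIQUE_TERMS * AVG_TERM_BYTES
total_mib = (ord_array + term_bytes) / (1024 * 1024)
print(f"~{total_mib:.0f} MiB per faceted string field")  # ~315 MiB
```

Multiply that by several faceted fields, plus filter and document caches, and a heap sized for a much smaller index will eventually run out.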



Regards,
Rohit
Mobile: +91-9901768202
About Me: http://about.me/rohitg

-----Original Message-----
From: Jaeger, Jay - DOT [mailto:Jay.Jaeger@dot.wi.gov] 
Sent: 13 September 2011 21:06
To: solr-user@lucene.apache.org
Subject: RE: Out of memory

numDocs is not the number of documents in memory.  It is the number of
documents currently in the index (which is kept on disk).  Same goes for
maxDocs, except that it is a count of all of the documents that have ever
been in the index since it was created or optimized (including deleted
documents).
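Concretely, the difference between the two counters is the number of deleted documents still physically present in the index (they disappear after a merge or optimize). Using the searcher stats quoted later in this thread:

```python
# maxDoc - numDocs = documents flagged as deleted but not yet merged away
max_doc = 77_518_729   # maxDoc from the searcher stats
num_docs = 77_516_851  # numDocs from the searcher stats
deleted = max_doc - num_docs
print(deleted)  # 1878
```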

Your subject indicates that something is giving you some kind of Out of
memory error.  We might better be able to help you if you provide more
information about your exact problem.

JRJ


-----Original Message-----
From: Rohit [mailto:rohit@in-rev.com] 
Sent: Tuesday, September 13, 2011 2:29 PM
To: solr-user@lucene.apache.org
Subject: Out of memory

I have Solr running on a machine with 18 GB RAM, with 4 cores. One of the
cores is very big, containing 77,516,851 docs; the stats for its searcher are
given below.

 

searcherName : Searcher@5a578998 main 
caching : true 
numDocs : 77516851 
maxDoc : 77518729 
lockFactory=org.apache.lucene.store.NativeFSLockFactory@5a9c5842 
indexVersion : 1308817281798 
openedAt : Tue Sep 13 18:59:52 GMT 2011 
registeredAt : Tue Sep 13 19:00:55 GMT 2011 
warmupTime : 63139

 

- Is there a way to reduce the number of docs loaded into memory for this
core?

- At any given time I don't need data older than the past 15 days, unless
someone queries for it explicitly. How can this be achieved?

- Will it be better to go for Solr replication or distribution if there is
little option left?
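On the 15-day point: Solr supports date math in range queries, so recent-data queries can be constrained with a filter query (`fq`), which is also cached independently of the main query. A minimal sketch, assuming a core named `twitter`, a date field named `created_at`, and a string field named `user` (all hypothetical names):

```python
# Sketch of a Solr request limited to the last 15 days via an fq filter.
# The core name and field names are assumptions for illustration.
from urllib.parse import urlencode

params = {
    "q": "*:*",
    "fq": "created_at:[NOW-15DAYS TO NOW]",  # Solr date-math range filter
    "facet": "true",
    "facet.field": "user",                   # facet only within the window
    "rows": "10",
}
url = "http://localhost:8983/solr/twitter/select?" + urlencode(params)
print(url)
```

Filtering this way reduces the result set a facet has to walk, but note that FieldCache-based faceting still loads values for the whole index; truly bounding memory usually means splitting the data (e.g. time-partitioned cores or distributed search).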

 

 

Regards,

Rohit

Mobile: +91-9901768202

About Me: http://about.me/rohitg

 

