lucene-solr-user mailing list archives

From "Fuad Efendi" <f...@efendi.ca>
Subject RE: JVM Heap utilization & Memory leaks with Solr
Date Thu, 13 Aug 2009 17:28:59 GMT
Most OutOfMemoryExceptions (if not 100% of them) that happen with SOLR are caused by
http://lucene.apache.org/java/2_4_0/api/org/apache/lucene/search/FieldCache.html
- it is used internally by Lucene to cache field values keyed by document ID.

My very long-term observation: SOLR can run without any problems for days or
months, and then an unpredictable OOM happens just because someone ran a sorted
search, which populates an array with an entry for ALL documents in the index.
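
For illustration, the first sorted search on a numeric field ends up doing the
equivalent of the following inside Lucene (a minimal sketch against the Lucene
2.4 FieldCache API; the index path and the "price" field are made-up examples):

import java.io.IOException;
import org.apache.lucene.index.IndexReader;
import org.apache.lucene.search.FieldCache;

public class FieldCacheSizeDemo {
    public static void main(String[] args) throws IOException {
        IndexReader reader = IndexReader.open("/path/to/index"); // example path
        // Lucene allocates an int[] with one slot for EVERY document in the
        // index, no matter how few documents actually match the query:
        int[] values = FieldCache.DEFAULT.getInts(reader, "price"); // example field
        System.out.println("entries: " + values.length + " == reader.maxDoc()");
        reader.close();
    }
}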

The only solution: calculate exactly how much RAM the FieldCache will need...
For instance, for 100,000,000 documents a single FieldCache instance may
require 8*100,000,000 bytes (8 bytes per document entry?), which is almost 1Gb
(at least!)
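
Spelled out as a back-of-the-envelope calculation (a rough estimate only,
assuming 8 bytes per entry):

// Rough FieldCache sizing for ONE sort field:
long numDocs = 100000000L;                  // documents in the index
long bytesPerEntry = 8L;                    // assumption: 8 bytes per entry
long cacheBytes = numDocs * bytesPerEntry;  // = 800,000,000 bytes
System.out.println(cacheBytes / (1024.0 * 1024 * 1024) + " GB"); // ~0.75 GB

And that is per sort field and per open IndexReader, so sorting on several
fields (or having an old and a new searcher open at the same time during
warming) multiplies it.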


I haven't noticed any memory leaks since I started to use 16Gb of RAM for the
SOLR instance (almost a year without any restart!)
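
For reference, that just means giving the JVM a big enough heap at startup,
e.g. with the bundled Jetty (the flags and sizes below are only an
illustration - tune them to your own hardware):

java -Xms16g -Xmx16g -jar start.jar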




-----Original Message-----
From: Rahul R [mailto:rahul.solr@gmail.com] 
Sent: August-13-09 1:25 AM
To: solr-user@lucene.apache.org
Subject: Re: JVM Heap utilization & Memory leaks with Solr

*You should try to generate heap dumps and analyze the heap using a tool
like the Eclipse Memory Analyzer. Maybe it helps spotting a group of
objects holding a large amount of memory*

The tool that I used also allows capturing heap snapshots. Eclipse had a lot of
prerequisites - you need to apply some three or five patches before you can
start using it... My observation with this tool was that some HashMaps were
taking up a lot of space, although I could not pin down the exact HashMap; it
would be either WebLogic's or Solr's... I will anyway give Eclipse's tool a try
and see how it goes. Thanks for your input.
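
Two common ways to capture such a heap dump, by the way (illustrative commands
only; <pid> and the dump path are placeholders):

jmap -dump:live,format=b,file=/tmp/solr-heap.hprof <pid>

or start the JVM with -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp so
a dump gets written automatically when the OutOfMemoryError actually occurs.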

Rahul

On Wed, Aug 12, 2009 at 2:15 PM, Gunnar Wagenknecht
<gunnar@wagenknecht.org> wrote:

> Rahul R wrote:
> > I tried using a profiling tool - Yourkit. The trial version was free for
> > 15 days. But I couldn't find anything of significance.
>
> You should try to generate heap dumps and analyze the heap using a tool
> like the Eclipse Memory Analyzer. Maybe it helps spotting a group of
> objects holding a large amount of memory.
>
> -Gunnar
>
> --
> Gunnar Wagenknecht
> gunnar@wagenknecht.org
> http://wagenknecht.org/
>
>


