lucene-solr-user mailing list archives

From Norberto Meijome <free...@meijome.net>
Subject Re: Indexing very large files.
Date Thu, 06 Sep 2007 05:02:01 GMT
On Wed, 05 Sep 2007 17:18:09 +0200
Brian Carmalt <bca@contact.de> wrote:

> I've been trying to index a 300MB file to Solr 1.2. I keep getting out of
> memory heap errors.
> Even on an empty index with one gig of VM memory it still won't work.

Hi Brian,

VM != heap memory.

VM = OS memory.
Heap memory = memory the JVM makes available to the Java process. Heap memory errors
are hardly ever an issue with the app itself (other than, of course, with bad
programming... but that doesn't seem to be the issue here so far).
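To see the distinction in practice, here is a small sketch (not from the original thread; the class name is my own) that prints the heap ceiling the JVM was actually granted. This is independent of how much OS memory the machine has, which is why adding OS/VM memory alone won't fix a heap error:

```java
// HeapCheck.java -- prints the heap limits the running JVM was granted.
// Run with e.g. `java -Xmx512m HeapCheck` to see the -Xmx flag take effect.
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024 * 1024;
        System.out.println("Max heap:   " + rt.maxMemory() / mb + " MB");
        System.out.println("Total heap: " + rt.totalMemory() / mb + " MB");
        System.out.println("Free heap:  " + rt.freeMemory() / mb + " MB");
    }
}
```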


[betom@ayiin] [Thu Sep  6 14:59:21 2007]
/usr/home/betom
$ java -X
[...]
    -Xms<size>        set initial Java heap size
    -Xmx<size>        set maximum Java heap size
    -Xss<size>        set java thread stack size
[...]

For example, start Solr as:
java -Xms64m -Xmx512m -jar start.jar

YMMV with respect to the actual values you use.
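For a single document in the 300 MB range, a heap comfortably larger than the file itself is a reasonable starting point, since the text and analysis structures are buffered in memory during indexing. The numbers below are a guess, not a recommendation from this thread:

```shell
# Hypothetical sizing for indexing a ~300 MB document; tune to your hardware.
java -Xms512m -Xmx2g -jar start.jar
```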

Good luck,
B
_________________________
{Beto|Norberto|Numard} Meijome

Windows caters to everyone as though they are idiots. UNIX makes no such assumption. 
It assumes you know what you are doing, and presents the challenge of figuring it out for
yourself if you don't.

I speak for myself, not my employer. Contents may be hot. Slippery when wet. Reading disclaimers
makes you go blind. Writing them is worse. You have been Warned.
