lucene-java-user mailing list archives

From Matthew Hall <>
Subject Re: Parsing large xml files
Date Fri, 22 May 2009 12:58:42 GMT
2GB... should not be a hard maximum for any JVM that I know of.

Assuming you are running a 32-bit JVM, you can actually address a
bit under 4GB of memory; I've typically used around 3.6GB when trying to max
out a 32-bit JVM. Technically speaking, a process should be able to address 4GB
under a 32-bit OS, but a certain portion of that address space is set
aside for overhead, so you can only really use a bit less than the max.

If you have a 64-bit OS and JVM (which you likely do), you can pass the
-d64 flag to run in 64-bit mode and set your maximum heap
much.. MUCH higher; for example, we regularly use 6GB of memory on our
application servers here at the lab.
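As a concrete sketch, a launch command along these lines selects 64-bit mode and raises the heap ceiling (the jar name and the 6GB figure are placeholders, not from the original message):

```shell
# -d64 selects the 64-bit JVM data model; -Xmx sets the maximum heap.
# MyIndexer.jar is a hypothetical application jar.
java -d64 -Xmx6g -jar MyIndexer.jar
```

Note that -Xmx is what actually raises the heap limit; -d64 just ensures the 64-bit JVM is used so values above the 32-bit ceiling are accepted.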

Hope this helps you a bit,

Matt wrote:
> ----- Original Message ----- 
> From: "Sithu D. Sudarsan" <> 
> To: 
> Sent: Thursday, May 21, 2009 7:42:59 AM GMT -08:00 US/Canada Pacific 
> Subject: Parsing large xml files 
> Hi, 
> While trying to parse xml documents of about 50MB in size, we run into 
> OutOfMemoryError due to java heap space. Increasing the JVM heap to close to 2GB 
> (that is the max) does not help. Is there any API that could be used to 
> handle such large single xml files? 
> If Lucene is not the right place, please let me know alternate places to 
> look for, 
> Thanks in advance, 
> Sithu D Sudarsan 
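On the question itself: a streaming parser such as SAX (built into the JDK under javax.xml.parsers) keeps memory use roughly constant regardless of file size, because it fires callbacks per element instead of building the whole document tree the way DOM does. A minimal sketch, assuming each record lives in a hypothetical `<doc>` element; the inline sample string stands in for what would really be a FileInputStream over the 50MB file:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class SaxCount {
    public static void main(String[] args) throws Exception {
        // Tiny stand-in document; a real run would open a FileInputStream
        // over the large file instead of holding it in memory.
        String xml = "<docs><doc>one</doc><doc>two</doc><doc>three</doc></docs>";
        InputStream in = new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8));

        final int[] count = {0};
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        parser.parse(in, new DefaultHandler() {
            @Override
            public void startElement(String uri, String localName,
                                     String qName, Attributes attrs) {
                if ("doc".equals(qName)) {
                    count[0]++; // each record is processed as it streams past
                }
            }
        });
        System.out.println("doc elements: " + count[0]);
    }
}
```

Each `<doc>` could be turned into a Lucene Document inside startElement/endElement and indexed immediately, so only one record is ever in memory at a time.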
