xmlgraphics-fop-users mailing list archives

From Jeremias Maerki <...@jeremias-maerki.ch>
Subject Re: Memory issue (was: Re: Major issue with image loading in FOP 0.95beta)
Date Thu, 08 May 2008 07:16:30 GMT
I've done extensive tests on memory allocation with FOP while implementing
the new image loading framework, and in my case the memory was always
released. So some results from a profiler would be helpful.

Anyway, what I meant by my hint was that the iText step might not be
necessary anymore and that you should be able to safely reduce -Xmx on
the JVM (provided your document doesn't contain too much non-image
content). But if you release the FopFactory and the memory is still not
reclaimed, something's wrong somewhere, and I have a hard time believing
that FOP itself is still holding on to it. Please make sure you don't
hold on to the FOUserAgent and other FOP-related objects, because they
might have a reference to the FopFactory.
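
To illustrate what I mean, here's a minimal sketch along the lines of the
standard 0.95 embedding pattern (the ChapterRenderer/renderChapter names
are purely illustrative): keep one FopFactory for the whole application,
but create the FOUserAgent and Fop per document inside a narrow scope so
that nothing keeps referencing them once the run is over.

import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.OutputStream;

import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.sax.SAXResult;
import javax.xml.transform.stream.StreamSource;

import org.apache.fop.apps.FOUserAgent;
import org.apache.fop.apps.Fop;
import org.apache.fop.apps.FopFactory;
import org.apache.fop.apps.MimeConstants;

public class ChapterRenderer {

    // One FopFactory for the whole application; it holds the reusable caches.
    private final FopFactory fopFactory = FopFactory.newInstance();

    /** Renders one chapter and keeps no FOP references afterwards. */
    public void renderChapter(File xml, File xslt, File pdf) throws Exception {
        OutputStream out = new BufferedOutputStream(new FileOutputStream(pdf));
        try {
            // FOUserAgent and Fop are created per document and go out of
            // scope when this method returns, so they become garbage then.
            FOUserAgent userAgent = fopFactory.newFOUserAgent();
            Fop fop = fopFactory.newFop(MimeConstants.MIME_PDF, userAgent, out);

            Transformer transformer = TransformerFactory.newInstance()
                    .newTransformer(new StreamSource(xslt));
            transformer.transform(new StreamSource(xml),
                    new SAXResult(fop.getDefaultHandler()));
        } finally {
            out.close();
        }
    }
}

If you cache the FOUserAgent (or the Fop) somewhere, e.g. in a field or a
session, the FopFactory stays reachable through it, which would match the
behaviour you're describing.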

On 08.05.2008 08:40:55 Jean-François El Fouly wrote:
> Jeremias Maerki wrote:
> >> And my next problem is to find a way to force memory recycling after
> >> this long and hefty FOP processing, but until I've investigated further
> >> this is OT ;-)
> >>     
> >
> > You probably didn't get my hint earlier, but with the new image loading
> > framework you should actually get away with lower memory settings. In my
> > tests I have been able to produce a PDF with little text and many images
> > with 40 MB of VM memory, which wasn't possible with 0.94 and earlier.
> >   
> 
> Well, I got the hint, but it seems to contradict what I'm seeing.
> So, to look at the picture from a bit higher up:
> - the whole XSL-FO transformation + FOP generation now works OK.
> - it generates 20-30 documents (chapters) for a grand total of about
> 150 MB, to be bound together by iText.
> - the source XML is 1.5 MB
> - 1011 PNG images for a total of 151 MB; the largest image is 715 KB.
> 
> Now the figures:
> - the XML -> XSL-FO transformation + FOP generation takes 15 minutes on a
> pretty decent Dell server (running Debian 4.0) with all the physical
> RAM possible (it's a staging server for several customers)
> - the JVM has 2000 MB (which is, BTW, the absolute maximum on this
> processor/server/OS/JVM architecture)
> - only one instance of FOP is launched (one document generation)
> - the next step in the publication process (opening the 150 MB of chapters
> with iText to add the bookmarks) fails immediately (at file open), saying
> it cannot allocate memory
> 
> If I try to investigate memory usage using
> Runtime.getRuntime().freeMemory() and log the figures with log4j,
> these are the numbers I get:
> - before XSLT + FOP: 1900 MB free / 2000 MB total
> - end of XSLT + FOP: 241 MB free
> - I set the FopFactory instance to null as a desperate hint to the GC that
> FOP objects could/should be recycled
> - I force garbage collection using System.gc()    [OK, in an application
> server this is a brute-force approach, but I could not see a cleverer
> maneuver ATM]
> - 350 MB free / 2000 MB total
> - bind all chapters with iText
> - 250 MB free
> - force another GC
> - 350 MB free again (so the binding operation has no effect on the
> available memory)
> - the next iText step still fails.
> 
> Now I don't take the Runtime memory figures as gospel, but at least it
> "looks like" the Xalan + FOP subsystem hogs 1500 MB of RAM which I
> cannot recover.
> So I've booked the team member who's competent with profilers for next
> week, but I must say that at the moment I'm still stuck :-(
> 
> Of course I've done my homework and read the f...riendly manual before
> daring to ask.
> Did I miss any important pointer?
> 
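
For what it's worth, a small helper along these lines makes those Runtime
figures a bit easier to interpret (just a sketch; the MemoryLog/logMemory
names are only illustrative, and java.util.logging stands in for log4j
here). "Used" memory is totalMemory() - freeMemory(), and the real ceiling
is maxMemory(), i.e. your -Xmx:

import java.util.logging.Logger;

public final class MemoryLog {

    private static final Logger LOG = Logger.getLogger(MemoryLog.class.getName());
    private static final long MB = 1024 * 1024;

    /** Logs heap usage at a named point, e.g. "end of XSLT + FOP". */
    public static void logMemory(String label) {
        Runtime rt = Runtime.getRuntime();
        long total = rt.totalMemory(); // heap currently allocated by the JVM
        long free = rt.freeMemory();   // unused part of that allocated heap
        long max = rt.maxMemory();     // the -Xmx ceiling
        LOG.info(label + ": used=" + ((total - free) / MB) + " MB"
                + ", total=" + (total / MB) + " MB"
                + ", max=" + (max / MB) + " MB");
    }
}

Calling something like MemoryLog.logMemory("end of XSLT + FOP") before and
after the System.gc() call would show whether the used figure actually
drops, rather than just how the free/total numbers shift.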




Jeremias Maerki


