xmlgraphics-fop-users mailing list archives

From thierry.pe...@snecma.fr
Subject Réf. : AW: Problem OutOfMemory
Date Mon, 02 Sep 2002 13:32:47 GMT
Hi Elsa, Hi Frank
Another way: we use FOP 0.20.2 and convert XML files (sometimes very big
files) into PDF files. We modify the last command in the script fop.sh:

$JAVACMD -Xmx512m -classpath "$LOCALCLASSPATH" $FOP_OPTS org.apache.fop.apps.Fop "$@"

We add the option -Xmx (the value you give after this option sets the
maximum heap size of the JVM).

We also run a FOP daemon. It checks the size of every XML file:
> If the size is below a threshold, FOP converts all these files together.
> If the size is above the threshold, FOP converts only that one file; the
  next file is converted only when the previous one has finished.
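The size-check logic of such a daemon could be sketched in the same shell
as fop.sh. This is only an illustration: the threshold value, the function
name, and the echo lines standing in for the actual FOP invocation are
assumptions, not details from the original daemon.

```shell
#!/bin/sh
# Sketch of the batching rule described above (assumed threshold: 5 MB).
THRESHOLD=5000000

convert_batch() {
    small=""
    for f in "$@"; do
        size=$(wc -c < "$f")
        if [ "$size" -gt "$THRESHOLD" ]; then
            # Large file: convert it alone, so only one big conversion
            # occupies the JVM heap at a time.
            echo "large: $f (converted alone)"
        else
            # Small files are collected and converted in one batch.
            small="$small $f"
        fi
    done
    [ -n "$small" ] && echo "batch:$small"
}
```

In the real daemon the echo lines would be replaced by calls to fop.sh,
and the large-file branch would wait for the previous conversion to finish
before starting the next one.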

Please let me know if there are any further details you require.

"Przybilla, Frank" <frank.przybilla@maxess.de> on 02/09/2002 14:43:45

Please reply to fop-user@xml.apache.org

To:   "'fop-user@xml.apache.org'" <fop-user@xml.apache.org>
bcc:  Thierry PETIT/SN06/SNECMA

Subject:  AW: Problem OutOfMemory

Hi Elsa,
this problem occurs when you try to generate very large tables. In this
case FOP has to hold the whole table in memory. I had the same problem.

The only way (as far as I know) is to generate more than one page-sequence,
ideally one for every single page. This means that you have to compute the
number of rows you can place on each page.

With this strategy you should be able to generate indefinitely big tables
with constant memory usage.
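The row bookkeeping this strategy needs can be sketched as a small Java
helper, one [start, end) range per page-sequence. The rows-per-page value
is an assumption; in practice it depends on your row height and page
geometry, and the class and method names are illustrative only.

```java
import java.util.ArrayList;
import java.util.List;

public class PageChunker {
    /**
     * Splits totalRows into consecutive [start, end) ranges of at most
     * rowsPerPage rows each; one fo:page-sequence would be emitted per range.
     */
    static List<int[]> chunk(int totalRows, int rowsPerPage) {
        List<int[]> ranges = new ArrayList<>();
        for (int start = 0; start < totalRows; start += rowsPerPage) {
            int end = Math.min(start + rowsPerPage, totalRows);
            ranges.add(new int[] {start, end});
        }
        return ranges;
    }
}
```

Because each page-sequence is laid out and rendered independently, FOP can
release the memory for one chunk before starting the next.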

"Elsa LARREUR" <elsalarreur@ifrance.com> on 02/09/2002 12:21:44

Please reply to fop-user@xml.apache.org

To:   fop-user@xml.apache.org
bcc:  Thierry PETIT/SN06/SNECMA

Subject:  Problem OutOfMemory

I have trouble with FOP when I try to produce a PDF file (with many tables)
from a big file: I get an OutOfMemory error from FOP (FOP is started from
a Java application, with Driver.run()).

Is someone doing such a job on large files using FOP? Is there a way to
avoid this problem?




