lucene-java-user mailing list archives

From moraleslos <>
Subject Re: issues with optimizer
Date Wed, 06 Jun 2007 18:26:45 GMT


Actually, it's more of a "general" question rather than a Compass-specific
one.  Here's my complete process:
I have incoming data being indexed every hour.  The batches vary from 100 to
10,000 documents.  I'm also having the index optimized via Compass (using its
Adaptive or Aggressive optimizer, which in turn uses Lucene).  The optimizer
is currently scheduled to run every 10 seconds.  In the beginning, when my
index had just started to build, optimization was instantaneous.  Now, with
the index at 20 GB, each optimization takes a while, and while it runs it
holds a lock on the index.  If the index is locked too long, the
application that is trying to index new data will stall and...

So, for the "general" question: what would be an ideal setup for
optimizing an index that may reach 100 GB in size?  Currently I don't split
anything up (it's just one big index).  Any suggestions?  Thanks.
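One general approach (a sketch, not Compass's actual behavior): instead of running an optimize pass on a fixed 10-second timer regardless of index state, only take the index lock when the index has actually fragmented past some threshold, and skip the pass otherwise. The `OptimizeThrottle` class and its threshold below are hypothetical illustrations of that scheduling decision; the actual merge on a 2.x-era Lucene index would then be done with `IndexWriter.optimize()` (or a partial optimize down to a few segments), which this sketch deliberately leaves out.

```java
// Hypothetical scheduling helper: decide whether an optimize pass is worth
// taking the index lock for, based on how fragmented the index has become.
// The segment threshold is an assumption to tune, not a Lucene constant.
public class OptimizeThrottle {
    private final int segmentThreshold;

    public OptimizeThrottle(int segmentThreshold) {
        this.segmentThreshold = segmentThreshold;
    }

    // Return true only when the index has fragmented past the threshold;
    // otherwise the scheduled task should return immediately without
    // touching the index lock.
    public boolean shouldOptimize(int currentSegmentCount) {
        return currentSegmentCount > segmentThreshold;
    }

    public static void main(String[] args) {
        OptimizeThrottle throttle = new OptimizeThrottle(10);
        // 3 segments: index is still compact, skip and avoid the lock.
        System.out.println(throttle.shouldOptimize(3));
        // 25 segments: fragmented enough that a merge pays for the lock time.
        System.out.println(throttle.shouldOptimize(25));
    }
}
```

The point of gating on index state rather than wall-clock time is that lock hold time grows with index size, so a fixed 10-second schedule that was harmless at startup will eventually overlap itself on a 20 GB index and starve the writer.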


hossman_lucene wrote:
> I have no idea what it is you are asking ... it seems to be very Compass
> specific .. perhaps you should be consulting a Compass user group?
> : I'm running into the "WARN | Compass Scheduled Optimizer |
> AdaptiveOptimizer
> : | ne.optimizer.AdaptiveOptimizer 104 | Failed to obtain lock on
> sub-index
> -Hoss
> ---------------------------------------------------------------------
> To unsubscribe, e-mail:
> For additional commands, e-mail:

