xml-xindice-users mailing list archives

From Vadim Gritsenko <va...@reverycodes.com>
Subject Re: xml getting corrupted
Date Mon, 26 Apr 2004 19:19:05 GMT
Sankalp Jain wrote:
...

>   There can be any number of jobs between 1 - 200 that
>concurrently update a single document. Also the document
>size can vary from a few kbs to a few Mbs (between 4-5 Mbs)
>
>   We have noticed that whenever there is a large amount of
>concurrent activity the document is corrupted.
>
>There are a few questions that I would like to ask :
>
>1. Is there a limit on document size that Xindice can
>handle?
>  
>

Xindice is not suitable for large documents; it was designed to work 
with smaller ones. There is no hard limit on document size, but with 
larger documents performance degrades and memory consumption grows. 
At 4-5 MB per document, the files are too large, IMHO, for Xindice.


>2. Also How many concurrent writes to the same document are
>OK? Are there any known problems regarding this issue?
>  
>

What do you mean by "concurrent writes"? Are you using XUpdate, or 
overwriting the whole document each time? For XUpdate, see:
    http://nagoya.apache.org/bugzilla/show_bug.cgi?id=13745
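
In case it helps to make the distinction concrete: an XUpdate request 
touches only the selected nodes, so a small patch like the one below is 
sent instead of the whole multi-megabyte document. The element names 
and namespace come from the XUpdate working draft; the /jobs/job path 
and the status attribute are invented for illustration, not taken from 
your schema.

```xml
<xupdate:modifications version="1.0"
    xmlns:xupdate="http://www.xmldb.org/xupdate">
  <!-- Update a single attribute in place instead of
       replacing the entire document -->
  <xupdate:update select="/jobs/job[@id='42']/@status">done</xupdate:update>
</xupdate:modifications>
```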


If you can come up with a unit test exposing your issue, it will be 
easier to find and fix the problem. See
  
http://cvs.apache.org/viewcvs.cgi/xml-xindice/java/tests/src/org/apache/xindice/core/filer/FilerTestBase.java?rev=1.3&view=auto
method testConcurrentInsert() for an example of a concurrency test.
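
For what it's worth, the general shape of such a test looks like this. 
This is only a sketch against a plain synchronized Set; a real Xindice 
test would insert into a Filer or Collection instead, and the thread 
count and record format here are placeholders.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Sketch of a concurrent-insert test: N threads each insert M records
// into a shared store, then the main thread verifies that all N*M
// records arrived intact. A store that corrupts under concurrency
// would lose (or mangle) records and fail the final check.
public class ConcurrentInsertSketch {
    static final int THREADS = 8;
    static final int RECORDS_PER_THREAD = 100;

    public static void main(String[] args) throws InterruptedException {
        final Set<String> store =
                Collections.synchronizedSet(new HashSet<String>());
        List<Thread> workers = new ArrayList<Thread>();
        for (int t = 0; t < THREADS; t++) {
            final int id = t;
            Thread w = new Thread(new Runnable() {
                public void run() {
                    for (int i = 0; i < RECORDS_PER_THREAD; i++) {
                        // Each record key is unique across all threads.
                        store.add("thread-" + id + "-record-" + i);
                    }
                }
            });
            workers.add(w);
            w.start();
        }
        for (Thread w : workers) {
            w.join();
        }
        // Every insert must be present after all writers finish.
        if (store.size() != THREADS * RECORDS_PER_THREAD) {
            throw new AssertionError("expected "
                    + (THREADS * RECORDS_PER_THREAD)
                    + " records, found " + store.size());
        }
        System.out.println("OK: " + store.size() + " records");
        // prints "OK: 800 records"
    }
}
```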

Vadim

