manifoldcf-user mailing list archives

From Karl Wright <daddy...@gmail.com>
Subject Re: [JCIFS Connector] crawl job stop on access error
Date Fri, 09 Dec 2016 16:14:31 GMT
Hi Julien,

There's already code in place to treat this error as a ServiceInterruption,
which means the document will be retried.  However, after it has been
retried a certain number of times, the code gives up and terminates the
job.  We could change this behavior so that it skips the document at that
point instead.
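
For illustration, here's a minimal, self-contained sketch of the difference
between the two behaviors (retry then abort the job vs. retry then skip the
document). This is not the actual connector code; every class and method name
below is hypothetical.

public class RetryThenSkipSketch {

  static final int MAX_RETRIES = 3;

  // Hypothetical stand-in for the framework's possible outcomes.
  enum Outcome { INDEXED, SKIPPED }

  // Hypothetical stand-in for a transient "file is locked" error (e.g. SmbException).
  static class TransientAccessException extends Exception {
    TransientAccessException(String msg) { super(msg); }
  }

  // Simulates fetching a document that stays locked by another process.
  static String fetch(String docId) throws TransientAccessException {
    throw new TransientAccessException(
        "The process cannot access the file because it is being used by another process.");
  }

  static Outcome process(String docId, boolean skipAfterRetries) throws Exception {
    for (int attempt = 1; attempt <= MAX_RETRIES; attempt++) {
      try {
        fetch(docId);
        return Outcome.INDEXED;
      } catch (TransientAccessException e) {
        if (attempt == MAX_RETRIES) {
          if (skipAfterRetries) {
            // Proposed behavior: give up on this one document only.
            return Outcome.SKIPPED;
          }
          // Current behavior: give up and terminate the whole job.
          throw new Exception("Aborting job: " + e.getMessage(), e);
        }
        // Otherwise a real connector would schedule a retry and wait.
      }
    }
    return Outcome.SKIPPED; // not reached with MAX_RETRIES >= 1
  }

  public static void main(String[] args) throws Exception {
    System.out.println(process("locked-file.docx", true));  // prints SKIPPED
    try {
      process("locked-file.docx", false);
    } catch (Exception e) {
      System.out.println(e.getMessage());                   // job-abort path
    }
  }
}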

Please open a Jira CONNECTORS ticket so that we can deal with this properly.

Thanks!
Karl


On Fri, Dec 9, 2016 at 4:08 AM, Julien Massiera <julien.massiera@francelabs.com> wrote:

> Hi MCF community,
>
> During a crawl with the JCIFS connector I ran into a common error that
> happens when trying to access a lock file: "SmbException thrown: The
> process cannot access the file because it is being used by another process."
> The problem is that the job stopped; I would like to avoid this behavior
> and instead ignore the exception (and the file) so the crawl keeps going.
> What would you recommend? Have I missed something in the job
> configuration?
> I know the simplest workaround would be to add a filter that matches the
> lock-file name pattern, but I really want to prevent the job from stopping
> when a lock file is encountered and not filtered.
>
> Thanks
>
> --
> Julien MASSIERA
> Search technology expert
> France Labs – The Search experts
> Winner of the EY Internal Search challenge at Viva Technologies 2016
> www.francelabs.com
> Tel : +33 (0) 663778847
>
>
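
As a side note on the filter workaround mentioned in the quoted message, below
is an illustration-only sketch of the kind of lock-file name patterns such a
filter would have to cover (Office "~$..." owner files, LibreOffice
".~lock....#" files). These patterns are just common examples, and the real
connector filters are configured in the job specification rather than with
Java regexes.

import java.util.regex.Pattern;

// Illustration only: matches some common lock-file names such as
// "~$Report.docx" (Office owner file) or ".~lock.Report.odt#" (LibreOffice).
public class LockFilePatternSketch {
  static final Pattern LOCK_FILE = Pattern.compile("^(~\\$.*|\\.~lock\\..*#)$");

  public static void main(String[] args) {
    System.out.println(LOCK_FILE.matcher("~$Report.docx").matches());      // true
    System.out.println(LOCK_FILE.matcher(".~lock.Report.odt#").matches()); // true
    System.out.println(LOCK_FILE.matcher("Report.docx").matches());        // false
  }
}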
