jackrabbit-oak-issues mailing list archives

From "Tommaso Teofili (JIRA)" <j...@apache.org>
Subject [jira] [Assigned] (OAK-3827) TarMK JCR data -> Solr == Exception
Date Thu, 07 Jan 2016 13:50:39 GMT

     [ https://issues.apache.org/jira/browse/OAK-3827?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Tommaso Teofili reassigned OAK-3827:
------------------------------------

    Assignee: Tommaso Teofili

> TarMK JCR data -> Solr == Exception
> -----------------------------------
>
>                 Key: OAK-3827
>                 URL: https://issues.apache.org/jira/browse/OAK-3827
>             Project: Jackrabbit Oak
>          Issue Type: Bug
>          Components: solr
>            Reporter: PuzanovsP
>            Assignee: Tommaso Teofili
>              Labels: resilience
>
> One node could not be found in Solr when searching for it.
> After looking into the logs, I found the following:
> ERROR - 2015-12-21 17:05:29.598; org.apache.solr.common.SolrException; null:org.apache.solr.common.SolrException: Exception writing document id /content/dam/my/example.pdf/jcr:content/metadata/wn_previews:previews/wn_previews:spreads/1 to the index; possible analysis error.
>     at org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:168)
>     at org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:69)
>     at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:51)
>     at org.apache.solr.update.processor.DistributedUpdateProcessor.doLocalAdd(DistributedUpdateProcessor.java:870)
>     at org.apache.solr.update.processor.DistributedUpdateProcessor.versionAdd(DistributedUpdateProcessor.java:1024)
>     at org.apache.solr.update.processor.DistributedUpdateProcessor.processAdd(DistributedUpdateProcessor.java:693)
>     at org.apache.solr.update.processor.LogUpdateProcessor.processAdd(LogUpdateProcessorFactory.java:100)
>     at org.apache.solr.handler.loader.JavabinLoader$1.update(JavabinLoader.java:96)
>     at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$1.readOuterMostDocIterator(JavaBinUpdateRequestCodec.java:166)
>     at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$1.readIterator(JavaBinUpdateRequestCodec.java:136)
>     at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:225)
>     at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$1.readNamedList(JavaBinUpdateRequestCodec.java:121)
>     at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:190)
>     at org.apache.solr.common.util.JavaBinCodec.unmarshal(JavaBinCodec.java:116)
>     at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.unmarshal(JavaBinUpdateRequestCodec.java:173)
>     at org.apache.solr.handler.loader.JavabinLoader.parseAndLoadDocs(JavabinLoader.java:106)
>     at org.apache.solr.handler.loader.JavabinLoader.load(JavabinLoader.java:58)
>     at org.apache.solr.handler.UpdateRequestHandler$1.load(UpdateRequestHandler.java:92)
>     at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:74)
>     at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:135)
>     at org.apache.solr.core.SolrCore.execute(SolrCore.java:1962)
>     at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:777)
>     at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:418)
>     at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:207)
>     at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1419)
>     at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:455)
>     at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:137)
>     at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:557)
>     at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:231)
>     at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1075)
>     at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:384)
>     at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:193)
>     at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1009)
>     at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
>     at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:255)
>     at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:154)
>     at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
>     at org.eclipse.jetty.server.Server.handle(Server.java:368)
>     at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:489)
>     at org.eclipse.jetty.server.BlockingHttpConnection.handleRequest(BlockingHttpConnection.java:53)
>     at org.eclipse.jetty.server.AbstractHttpConnection.content(AbstractHttpConnection.java:953)
>     at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.content(AbstractHttpConnection.java:1014)
>     at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:953)
>     at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
>     at org.eclipse.jetty.server.BlockingHttpConnection.handle(BlockingHttpConnection.java:72)
>     at org.eclipse.jetty.server.bio.SocketConnector$ConnectorEndPoint.run(SocketConnector.java:264)
>     at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
>     at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
>     at java.lang.Thread.run(Unknown Source)
> Caused by: java.lang.IllegalArgumentException: Document contains at least one immense term in field="wn_previews:base64_data" (whose UTF8 encoding is longer than the max length 32766), all of which were skipped.  Please correct the analyzer to not produce such terms.  The prefix of the first immense term is: '[121, 113, 55, 102, 120, 121, 113, 55, 102, 120, 121, 113, 55, 102, 120, 121, 113, 55, 102, 120, 121, 113, 55, 102, 120, 121, 113, 55, 102, 120]...', original message: bytes can be at most 32766 in length; got 64627
>     at org.apache.lucene.index.DefaultIndexingChain$PerField.invert(DefaultIndexingChain.java:671)
>     at org.apache.lucene.index.DefaultIndexingChain.processField(DefaultIndexingChain.java:342)
>     at org.apache.lucene.index.DefaultIndexingChain.processDocument(DefaultIndexingChain.java:301)
>     at org.apache.lucene.index.DocumentsWriterPerThread.updateDocument(DocumentsWriterPerThread.java:241)
>     at org.apache.lucene.index.DocumentsWriter.updateDocument(DocumentsWriter.java:451)
>     at org.apache.lucene.index.IndexWriter.updateDocument(IndexWriter.java:1539)
>     at org.apache.solr.update.DirectUpdateHandler2.addDoc0(DirectUpdateHandler2.java:240)
>     at org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:164)
>     ... 48 more
> Caused by: org.apache.lucene.util.BytesRefHash$MaxBytesLengthExceededException: bytes can be at most 32766 in length; got 64627
>     at org.apache.lucene.util.BytesRefHash.add(BytesRefHash.java:284)
>     at org.apache.lucene.index.TermsHashPerField.add(TermsHashPerField.java:151)
>     at org.apache.lucene.index.DefaultIndexingChain$PerField.invert(DefaultIndexingChain.java:645)
>     ... 55 more
> This can be seen in the default solr.log file.
> As a result of this exception, the data transfer to Solr is blocked, and I have to manually fix the affected data by removing that node.
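
For context: the root cause is Lucene's hard limit of 32766 bytes on the UTF-8 encoding of a single indexed term. The wn_previews:base64_data value here is 64627 bytes, so the whole update request fails and the Oak -> Solr transfer stalls on that document. Below is a minimal, hypothetical sketch (not the actual Oak fix) of one client-side guard, assuming SolrJ's SolrInputDocument and an illustrative addFieldSafely helper: values whose UTF-8 encoding would exceed the limit are skipped so the rest of the document can still be indexed.

    import java.nio.charset.StandardCharsets;

    import org.apache.solr.common.SolrInputDocument;

    public class ImmenseTermGuard {

        // Lucene rejects any single term whose UTF-8 encoding exceeds 32766 bytes.
        private static final int MAX_TERM_BYTES = 32766;

        // Illustrative helper (an assumption, not Oak API): add the field only when
        // its UTF-8 length fits within the limit; oversized values (e.g. the
        // wn_previews:base64_data property in this issue) are dropped so the
        // rest of the document can still be indexed instead of failing the update.
        public static void addFieldSafely(SolrInputDocument doc, String name, String value) {
            if (value != null
                    && value.getBytes(StandardCharsets.UTF_8).length > MAX_TERM_BYTES) {
                return;
            }
            doc.addField(name, value);
        }
    }

Another way to avoid the problem altogether is to keep binary-like properties such as base64 previews out of the index, for example by marking the field as indexed="false" in the Solr schema or, if the Oak Solr index configuration allows it, by excluding the property from indexing, so the immense-term check never triggers.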



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
