kafka-users mailing list archives

From 陈小军 <chenxiao...@nhn.com>
Subject Re: send snappy log got java.lang.NoClassDefFoundError:
Date Wed, 09 Apr 2014 02:01:34 GMT
Thanks, it was a libstdc++ compatibility problem; I recompiled the library and it works now. 
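
For anyone hitting the same thing, a minimal sanity check (just a sketch, assuming snappy-java is on the broker's classpath) that forces the native library to load, so the underlying native-load failure is reported directly instead of the later "Could not initialize class" error:

import org.xerial.snappy.Snappy;

public class SnappyCheck {
    public static void main(String[] args) throws Exception {
        // Snappy.compress() triggers org.xerial.snappy.Snappy's static initializer,
        // which extracts and loads the bundled native library; an incompatible
        // libstdc++ shows up here as the root cause rather than as the later
        // "Could not initialize class" error seen on the broker.
        byte[] compressed = Snappy.compress("hello snappy".getBytes("UTF-8"));
        byte[] restored = Snappy.uncompress(compressed);
        System.out.println(new String(restored, "UTF-8"));
    }
}

Running this with the same JVM and snappy-java jar as the broker should reproduce whatever the native loader complains about.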
 


Best Regards
Jerry
-----Original Message-----
From: "Bae, Jae Hyeon"&lt;metacret@gmail.com&gt; 
To: "users@kafka.apache.org"&lt;users@kafka.apache.org&gt;; "陈小军"&lt;chenxiaojun@nhn.com&gt;;

Cc: 
Sent: 2014-04-08 (Tuesday) 23:24:16
Subject: Re: send snappy log got java.lang.NoClassDefFoundError:

check your libstdc++ version

On Tuesday, April 8, 2014, 陈小军 <chenxiaojun@nhn.com> wrote: 



Hi all,



    I am trying to add compression support to the kafka-node JS driver. For snappy I use the node-snappy library
(https://github.com/kesla/node-snappy). When I test my code, the server always outputs the following
error, and I can't tell whether it is a bug in my code or a compatibility problem on the server side.


     I implemented gzip compression the same way and it works fine, so I think my logic for
compressing the messageSet is correct.
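
For context, here is a minimal sketch (using the snappy-java stream classes that appear in the trace below, with a stand-in payload) of the round trip the broker attempts when it decompresses a snappy message set:

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import org.xerial.snappy.SnappyInputStream;
import org.xerial.snappy.SnappyOutputStream;

public class SnappyStreamRoundTrip {
    public static void main(String[] args) throws Exception {
        // Write a payload in the framed format that snappy-java's stream classes use.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        SnappyOutputStream out = new SnappyOutputStream(buf);
        out.write("messageSet payload".getBytes("UTF-8"));
        out.close();

        // Broker side: constructing SnappyInputStream reads the stream header,
        // which is where org.xerial.snappy.Snappy is initialized (and where the
        // NoClassDefFoundError below is thrown when the native library cannot load).
        SnappyInputStream in = new SnappyInputStream(new ByteArrayInputStream(buf.toByteArray()));
        byte[] chunk = new byte[1024];
        int n = in.read(chunk);
        in.close();
        System.out.println(new String(chunk, 0, n, "UTF-8"));
    }
}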



[2014-04-08 17:22:45,556] ERROR [KafkaApi-0] Error processing ProducerRequest with correlation id 1 from client kafka-node-client on partition [nelo2-crash-logs,0] (kafka.server.KafkaApis)
java.lang.NoClassDefFoundError: Could not initialize class org.xerial.snappy.Snappy
        at org.xerial.snappy.SnappyInputStream.readFully(SnappyInputStream.java:125)
        at org.xerial.snappy.SnappyInputStream.readHeader(SnappyInputStream.java:88)
        at org.xerial.snappy.SnappyInputStream.<init>(SnappyInputStream.java:58)
        at kafka.message.CompressionFactory$.apply(CompressionFactory.scala:45)
        at kafka.message.ByteBufferMessageSet$.decompress(ByteBufferMessageSet.scala:66)
        at kafka.message.ByteBufferMessageSet$$anon$1.makeNextOuter(ByteBufferMessageSet.scala:178)
        at kafka.message.ByteBufferMessageSet$$anon$1.makeNext(ByteBufferMessageSet.scala:191)
        at kafka.message.ByteBufferMessageSet$$anon$1.makeNext(ByteBufferMessageSet.scala:145)
        at kafka.utils.IteratorTemplate.maybeComputeNext(IteratorTemplate.scala:66)
        at kafka.utils.IteratorTemplate.hasNext(IteratorTemplate.scala:58)
        at scala.collection.Iterator$$anon$19.hasNext(Iterator.scala:400)
        at scala.collection.Iterator$class.foreach(Iterator.scala:772)
        at scala.collection.Iterator$$anon$19.foreach(Iterator.scala:399)
        at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
        at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:102)
        at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:250)
        at scala.collection.Iterator$$anon$19.toBuffer(Iterator.scala:399)
        at kafka.message.ByteBufferMessageSet.assignOffsets(ByteBufferMessageSet.scala:219)
        at kafka.log.Log.liftedTree1$1(Log.scala:249)
        at kafka.log.Log.append(Log.scala:248)
        at kafka.cluster.Partition.appendMessagesToLeader(Partition.scala:354)
        at kafka.server.KafkaApis$$anonfun$appendToLocalLog$2.apply(KafkaApis.scala:285)
        at kafka.server.KafkaApis$$anonfun$appendToLocalLog$2.apply(KafkaApis.scala:275)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:233)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:95)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:95)
        at scala.collection.Iterator$class.foreach(Iterator.scala:772)
        at scala.collection.mutable.HashTable$$anon$1.foreach(HashTable.scala:157)
        at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:190)
        at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:45)
        at scala.collection.mutable.HashMap.foreach(HashMap.scala:95)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:233)
        at scala.collection.mutable.HashMap.map(HashMap.scala:45)
        at kafka.server.KafkaApis.appendToLocalLog(KafkaApis.scala:275)
        at kafka.server.KafkaApis.handleProducerRequest(KafkaApis.scala:201)
        at kafka.server.KafkaApis.handle(KafkaApis.scala:68)
        at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:42)
        at java.lang.Thread.run(Thread.java:722)


Best Regards

Jerry

