spark-issues mailing list archives

From "Tathagata Das (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (SPARK-2103) Java + Kafka + Spark Streaming NoSuchMethodError in java.lang.Object.<init>
Date Fri, 01 Aug 2014 11:34:39 GMT

     [ https://issues.apache.org/jira/browse/SPARK-2103?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Tathagata Das updated SPARK-2103:
---------------------------------

    Fix Version/s: 1.1.0

> Java + Kafka + Spark Streaming NoSuchMethodError in java.lang.Object.<init>
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-2103
>                 URL: https://issues.apache.org/jira/browse/SPARK-2103
>             Project: Spark
>          Issue Type: Bug
>          Components: Streaming
>    Affects Versions: 1.0.0
>            Reporter: Sean Owen
>             Fix For: 1.1.0
>
>
> This has come up a few times, from user venki-kratos:
> http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-in-KafkaReciever-td2209.html
> and I ran into it a few weeks ago:
> http://mail-archives.apache.org/mod_mbox/spark-dev/201405.mbox/%3CCAMAsSdLzS6ihcTxepUsphRyXxA-wp26ZGBxx83sM6niRo0q4Rg@mail.gmail.com%3E
> and yesterday from user mpieck:
> {quote}
> When I use the createStream method from the example class like
> this:
> KafkaUtils.createStream(jssc, "zookeeper:port", "test", topicMap);
> everything is working fine, but when I explicitly specify the message decoder
> classes with another overloaded createStream method:
> KafkaUtils.createStream(jssc, String.class, String.class,
> StringDecoder.class, StringDecoder.class, props, topicMap,
> StorageLevels.MEMORY_AND_DISK_2);
> the application stops with an error:
> 14/06/10 22:28:06 ERROR kafka.KafkaReceiver: Error receiving data
> java.lang.NoSuchMethodException: java.lang.Object.<init>(kafka.utils.VerifiableProperties)
>         at java.lang.Class.getConstructor0(Unknown Source)
>         at java.lang.Class.getConstructor(Unknown Source)
>         at org.apache.spark.streaming.kafka.KafkaReceiver.onStart(KafkaInputDStream.scala:108)
>         at org.apache.spark.streaming.dstream.NetworkReceiver.start(NetworkInputDStream.scala:126)
> {quote}
> Something is making it try to instantiate java.lang.Object as if it's a Decoder class.
> I suspect that the problem has to do with
> https://github.com/apache/spark/blob/master/external/kafka/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala#L148
> {code}
>     implicit val keyCmd: Manifest[U] = implicitly[Manifest[AnyRef]].asInstanceOf[Manifest[U]]
>     implicit val valueCmd: Manifest[T] = implicitly[Manifest[AnyRef]].asInstanceOf[Manifest[T]]
> {code}
> ... where U and T are the key/value Decoder types. I don't know enough Scala to fully understand
> this, but is it possible that this causes the later reflective call to lose the type and try to
> instantiate Object? The AnyRef made me wonder.
> I am sorry to say I don't have a PR to suggest at this point.
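
As an illustration of the suspicion above, here is a minimal, self-contained Scala sketch (not Spark code) of how that cast could produce the reported error. {{Props}} and {{MyDecoder}} are hypothetical stand-ins for kafka.utils.VerifiableProperties and a decoder such as kafka.serializer.StringDecoder, so the snippet runs without Kafka on the classpath; the reflective lookup only mirrors the shape of the call in KafkaReceiver.onStart as seen in the stack trace.

{code}
// Minimal sketch of the suspected failure mode; Props and MyDecoder are
// hypothetical stand-ins for kafka.utils.VerifiableProperties and a
// Decoder implementation such as kafka.serializer.StringDecoder.
class Props
class MyDecoder(val props: Props)

object ManifestErasureDemo {
  // Mirrors the pattern quoted from KafkaUtils.scala: the manifest for U is
  // fabricated by casting a Manifest[AnyRef], so its runtime class is Object.
  def lostManifest[U]: Manifest[U] =
    implicitly[Manifest[AnyRef]].asInstanceOf[Manifest[U]]

  def main(args: Array[String]): Unit = {
    val bad: Manifest[MyDecoder] = lostManifest[MyDecoder]
    println(bad.runtimeClass)  // prints: class java.lang.Object, not MyDecoder

    // Roughly the shape of the reflective call in KafkaReceiver.onStart:
    // find a constructor taking the properties class on the decoder's runtime class.
    try {
      bad.runtimeClass.getConstructor(classOf[Props]).newInstance(new Props)
    } catch {
      case e: NoSuchMethodException =>
        println(e)  // java.lang.Object.<init>(Props) does not exist
    }

    // A manifest that actually carries the decoder type finds the constructor.
    val good = manifest[MyDecoder]
    println(good.runtimeClass.getConstructor(classOf[Props]).newInstance(new Props))
  }
}
{code}

If this is indeed the cause, the runtime class reaching the reflective lookup is java.lang.Object rather than the user-supplied decoder class, which would match the java.lang.Object.<init>(kafka.utils.VerifiableProperties) signature in the reported error.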



--
This message was sent by Atlassian JIRA
(v6.2#6252)
