spark-user mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: Can't create Kafka stream in spark shell
Date Thu, 16 Oct 2014 19:11:45 GMT
Can you try:

sbt:

name := "Simple Project"

version := "1.1"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.1.0",
  "org.apache.spark" %% "spark-streaming" % "1.1.0",
  "org.apache.spark" %% "spark-streaming-kafka" % "1.1.0"
)
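
Since the Spark distribution already provides spark-core on the cluster classpath, a common approach is to mark the Spark artifacts as "provided" and bundle only the Kafka pieces into a fat jar with sbt-assembly. A minimal sketch (the plugin version is an assumption, not something stated in this thread):

```scala
// project/plugins.sbt -- sbt-assembly plugin (version is an assumption)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

// build.sbt: spark-core and spark-streaming come from the Spark
// installation at runtime, so exclude them from the assembly and
// bundle only spark-streaming-kafka (and its transitive Kafka deps)
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.1.0" % "provided",
  "org.apache.spark" %% "spark-streaming" % "1.1.0" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka" % "1.1.0"
)
```

Running `sbt assembly` then produces a single jar you can hand to spark-submit without any --jars flags.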

Thanks
Best Regards

On Fri, Oct 17, 2014 at 12:36 AM, Gary Zhao <garyzhao@gmail.com> wrote:

> Thanks Akhil. I tried spark-submit and saw the same issue. I double
> checked the versions and they look ok. Are you seeing any obvious issues?
>
> sbt:
>
> name := "Simple Project"
>
> version := "1.1"
>
> scalaVersion := "2.10.4"
>
> libraryDependencies ++= Seq(
>   "org.apache.spark" %% "spark-core" % "1.1.0",
>   "org.apache.spark" %% "spark-streaming" % "1.1.0",
>   "org.apache.spark" %% "spark-streaming-kafka" % "1.1.0",
>   "org.apache.kafka" %% "kafka" % "0.8.0"
> )
>
>
> spark-1.1.0-bin-hadoop1/bin/spark-submit --class "main.scala.SimpleApp"
> --master "local[2]" simple-project_2.10-1.1.jar --jars
> spark-streaming-kafka_2.10-1.1.0.jar,kafka_2.10-0.8.0.jar
>
> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils$
> at main.scala.SimpleApp$delayedInit$body.apply(SimpleApp.scala:15)
> at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
> at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
> at scala.App$$anonfun$main$1.apply(App.scala:71)
> at scala.App$$anonfun$main$1.apply(App.scala:71)
> at scala.collection.immutable.List.foreach(List.scala:318)
> at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
> at scala.App$class.main(App.scala:71)
> at main.scala.SimpleApp$.main(SimpleApp.scala:11)
> at main.scala.SimpleApp.main(SimpleApp.scala)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.kafka.KafkaUtils$
> at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> at java.security.AccessController.doPrivileged(Native Method)
> at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> ... 17 more
>
>
> On Tue, Oct 14, 2014 at 12:05 AM, Akhil Das <akhil@sigmoidanalytics.com>
> wrote:
>
>> Just make sure you have the same version of spark-streaming-kafka
>> <http://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka_2.10>
>> jar and spark in your classpath.
>>
>> Thanks
>> Best Regards
>>
>> On Tue, Oct 14, 2014 at 9:02 AM, Gary Zhao <garyzhao@gmail.com> wrote:
>>
>>> Hello
>>>
>>> I'm trying to connect to Kafka in the Spark shell, but it failed as
>>> below. Could you take a look at what I missed?
>>>
>>> scala> val kafkaStream = KafkaUtils.createStream(ssc, "test-vip.snc1:2181", "test_spark", Map("user-test"->1))
>>> error: bad symbolic reference. A signature in KafkaUtils.class refers to term serializer in value kafka which is not available.
>>> It may be completely missing from the current classpath, or the version on the classpath might be incompatible with the version used when compiling KafkaUtils.class.
>>>
>>> Thanks
>>> Gary
>>>
>>
>>
>
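
One thing worth checking in the spark-submit invocation quoted above: spark-submit treats everything after the application jar as arguments to the application's main(), so a --jars flag placed after simple-project_2.10-1.1.jar is never seen by spark-submit at all, and the Kafka classes never reach the driver classpath. That alone would produce the NoClassDefFoundError for KafkaUtils$. A sketch of the reordered command, using the same paths as the original message:

```shell
# All spark-submit options must come BEFORE the application jar;
# anything after the jar is forwarded to SimpleApp as its own args.
spark-1.1.0-bin-hadoop1/bin/spark-submit \
  --class "main.scala.SimpleApp" \
  --master "local[2]" \
  --jars spark-streaming-kafka_2.10-1.1.0.jar,kafka_2.10-0.8.0.jar \
  simple-project_2.10-1.1.jar
```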
