spark-user mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: Can't create Kafka stream in spark shell
Date Fri, 17 Oct 2014 06:35:53 GMT
This is how you deal with the deduplicate errors: exclude the conflicting transitive dependencies from the Spark artifact:

libraryDependencies ++= Seq(
  ("org.apache.spark" % "spark-streaming_2.10" % "1.1.0" % "provided").
    exclude("org.eclipse.jetty.orbit", "javax.transaction").
    exclude("org.eclipse.jetty.orbit", "javax.mail").
    exclude("org.eclipse.jetty.orbit", "javax.activation").
    exclude("com.esotericsoftware.minlog", "minlog").
    exclude("commons-beanutils", "commons-beanutils-core").
    exclude("commons-logging", "commons-logging").
    exclude("commons-collections", "commons-collections").
    exclude("org.eclipse.jetty.orbit", "javax.servlet")
)
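
If you'd rather keep the dependencies and let sbt assembly resolve the
conflicts, note that the duplicates in your error are all jar signature
files, so a merge strategy that discards them also works. A minimal sketch,
assuming sbt-assembly 0.11.x (addSbtPlugin("com.eed3si9n" % "sbt-assembly" %
"0.11.2") in project/plugins.sbt) and its settings DSL in build.sbt:

import sbtassembly.Plugin._
import AssemblyKeys._

assemblySettings

// The conflicting files reported by sbt assembly are jar signature files
// (META-INF/ECLIPSEF.RSA and friends); they are safe to drop in an uber jar.
mergeStrategy in assembly <<= (mergeStrategy in assembly) { old =>
  {
    case PathList("META-INF", "ECLIPSEF.RSA") => MergeStrategy.discard
    case PathList("META-INF", "ECLIPSEF.SF")  => MergeStrategy.discard
    case x                                    => old(x)
  }
}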


Thanks
Best Regards

On Fri, Oct 17, 2014 at 2:53 AM, Gary Zhao <garyzhao@gmail.com> wrote:

> Same error. I saw someone report the same issue, e.g.
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-streaming-kafka-error-td9106.html
>
> Should I use "sbt assembly"? It failed with deduplicate errors though.
>
> [error] (*:assembly) deduplicate: different file contents found in the following:
> [error] /Users/gzhao/.ivy2/cache/org.eclipse.jetty.orbit/javax.transaction/orbits/javax.transaction-1.1.1.v201105210645.jar:META-INF/ECLIPSEF.RSA
> [error] /Users/gzhao/.ivy2/cache/org.eclipse.jetty.orbit/javax.servlet/orbits/javax.servlet-3.0.0.v201112011016.jar:META-INF/ECLIPSEF.RSA
> [error] /Users/gzhao/.ivy2/cache/org.eclipse.jetty.orbit/javax.mail.glassfish/orbits/javax.mail.glassfish-1.4.1.v201005082020.jar:META-INF/ECLIPSEF.RSA
> [error] /Users/gzhao/.ivy2/cache/org.eclipse.jetty.orbit/javax.activation/orbits/javax.activation-1.1.0.v201105071233.jar:META-INF/ECLIPSEF.RSA
> [error] Total time: 4 s, completed Oct 16, 2014 1:58:41 PM
>
>
> On Thu, Oct 16, 2014 at 12:11 PM, Akhil Das <akhil@sigmoidanalytics.com>
> wrote:
>
>> Can you try:
>>
>> sbt:
>>
>> name := "Simple Project"
>>
>> version := "1.1"
>>
>> scalaVersion := "2.10.4"
>>
>> libraryDependencies ++= Seq(
>>   "org.apache.spark" %% "spark-core" % "1.1.0",
>>   "org.apache.spark" %% "spark-streaming" % "1.1.0",
>>   "org.apache.spark" %% "spark-streaming-kafka" % "1.1.0"
>> )
>>
>> Thanks
>> Best Regards
>>
>> On Fri, Oct 17, 2014 at 12:36 AM, Gary Zhao <garyzhao@gmail.com> wrote:
>>
>>> Thanks Akhil. I tried spark-submit and saw the same issue. I double-checked
>>> the versions and they look ok. Are you seeing any obvious issues?
>>>
>>> sbt:
>>>
>>> name := "Simple Project"
>>>
>>> version := "1.1"
>>>
>>> scalaVersion := "2.10.4"
>>>
>>> libraryDependencies ++= Seq(
>>>   "org.apache.spark" %% "spark-core" % "1.1.0",
>>>   "org.apache.spark" %% "spark-streaming" % "1.1.0",
>>>   "org.apache.spark" %% "spark-streaming-kafka" % "1.1.0",
>>>   "org.apache.kafka" %% "kafka" % "0.8.0"
>>> )
>>>
>>>
>>> spark-1.1.0-bin-hadoop1/bin/spark-submit --class "main.scala.SimpleApp" --master "local[2]" simple-project_2.10-1.1.jar --jars spark-streaming-kafka_2.10-1.1.0.jar,kafka_2.10-0.8.0.jar
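>>>
>>> (One thing worth checking: spark-submit treats everything after the
>>> application jar as arguments to the application itself, so the --jars
>>> flag above most likely never reaches spark-submit. The same command with
>>> --jars moved in front of the jar, assuming both jars sit in the working
>>> directory, would be:
>>>
>>> spark-1.1.0-bin-hadoop1/bin/spark-submit --class "main.scala.SimpleApp" \
>>>   --master "local[2]" \
>>>   --jars spark-streaming-kafka_2.10-1.1.0.jar,kafka_2.10-0.8.0.jar \
>>>   simple-project_2.10-1.1.jar
>>>
>>> which would also explain the NoClassDefFoundError below.)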
>>>
>>> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils$
>>> at main.scala.SimpleApp$delayedInit$body.apply(SimpleApp.scala:15)
>>> at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
>>> at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
>>> at scala.App$$anonfun$main$1.apply(App.scala:71)
>>> at scala.App$$anonfun$main$1.apply(App.scala:71)
>>> at scala.collection.immutable.List.foreach(List.scala:318)
>>> at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
>>> at scala.App$class.main(App.scala:71)
>>> at main.scala.SimpleApp$.main(SimpleApp.scala:11)
>>> at main.scala.SimpleApp.main(SimpleApp.scala)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
>>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>> Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.kafka.KafkaUtils$
>>> at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>>> at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>> at java.security.AccessController.doPrivileged(Native Method)
>>> at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>> ... 17 more
>>>
>>>
>>> On Tue, Oct 14, 2014 at 12:05 AM, Akhil Das <akhil@sigmoidanalytics.com>
>>> wrote:
>>>
>>>> Just make sure you have the same version of the spark-streaming-kafka
>>>> <http://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka_2.10>
>>>> jar and Spark on your classpath.
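>>>>
>>>> For the shell, both jars can be passed on startup. A minimal sketch,
>>>> assuming the jars sit in the working directory (kafka's own transitive
>>>> dependencies, e.g. zkclient and metrics-core, may be needed as well):
>>>>
>>>> ./bin/spark-shell --jars \
>>>>   spark-streaming-kafka_2.10-1.1.0.jar,kafka_2.10-0.8.0.jar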
>>>>
>>>> Thanks
>>>> Best Regards
>>>>
>>>> On Tue, Oct 14, 2014 at 9:02 AM, Gary Zhao <garyzhao@gmail.com> wrote:
>>>>
>>>>> Hello
>>>>>
>>>>> I'm trying to connect to Kafka from the Spark shell, but it failed as
>>>>> below. Could you take a look at what I missed?
>>>>>
>>>>> scala> val kafkaStream = KafkaUtils.createStream(ssc, "test-vip.snc1:2181", "test_spark", Map("user-test"->1))
>>>>> error: bad symbolic reference. A signature in KafkaUtils.class refers
>>>>> to term serializer in value kafka which is not available. It may be
>>>>> completely missing from the current classpath, or the version on the
>>>>> classpath might be incompatible with the version used when compiling
>>>>> KafkaUtils.class.
>>>>>
>>>>> Thanks
>>>>> Gary
>>>>>
>>>>
>>>>
>>>
>>
>
