spark-user mailing list archives

From "Kevin (Sangwoo) Kim" <kevin...@apache.org>
Subject Re: Use of nscala-time within spark-shell
Date Tue, 17 Feb 2015 01:10:09 GMT
Which Scala version was your Spark built with?
It looks like your nscala-time jar is built for Scala 2.11,
while the default Spark build uses Scala 2.10.
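To make the diagnosis concrete: the `_2.10`/`_2.11` suffix in a Scala artifact name is the Scala binary version it was compiled against, and it must match the Scala version of the JVM process loading it. A minimal sketch of reading that suffix off the jar name from the original mail (the filename is the one reported; the parsing is illustrative):

```shell
# The suffix after "nscala-time_" and before the first "-" is the Scala
# binary version the jar was compiled for. Mixing it with a Spark built
# against a different Scala binary version fails at runtime with errors
# like NoSuchMethodError, because binary compatibility is not guaranteed
# across 2.10/2.11.
jar="nscala-time_2.11-1.8.0.jar"
scala_binary="${jar#nscala-time_}"   # strip artifact prefix -> "2.11-1.8.0.jar"
scala_binary="${scala_binary%%-*}"   # keep text before first "-" -> "2.11"
echo "compiled for Scala $scala_binary"
```

For a stock Spark 1.x download (built against Scala 2.10), swapping in the `nscala-time_2.10-1.8.0.jar` artifact should make the same imports work inside spark-shell.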


On Tue Feb 17 2015 at 1:51:47 AM Hammam CHAMSI <hschamsi@hotmail.com> wrote:

> Hi All,
>
> Thanks in advance for your help. I have timestamps which I need to convert
> to datetimes using Scala. A folder contains the three needed jar files:
> "joda-convert-1.5.jar  joda-time-2.4.jar  nscala-time_2.11-1.8.0.jar"
> Using the Scala REPL and adding the jars (scala -classpath "*.jar"),
> I can use nscala-time as follows:
>
> scala> import com.github.nscala_time.time.Imports._
> import com.github.nscala_time.time.Imports._
>
> scala> import org.joda._
> import org.joda._
>
> scala> DateTime.now
> res0: org.joda.time.DateTime = 2015-02-12T15:51:46.928+01:00
>
> But when I try the same from spark-shell:
> ADD_JARS=/home/scala_test_class/nscala-time_2.11-1.8.0.jar,/home/scala_test_class/joda-time-2.4.jar,/home/scala_test_class/joda-convert-1.5.jar
> /usr/local/spark/bin/spark-shell --master local --driver-memory 2g
> --executor-memory 2g --executor-cores 1
>
> It successfully imports the jars:
>
> scala> import com.github.nscala_time.time.Imports._
> import com.github.nscala_time.time.Imports._
>
> scala> import org.joda._
> import org.joda._
>
> but it fails when using them:
> scala> DateTime.now
> java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
>         at com.github.nscala_time.time.LowPriorityOrderingImplicits$class.ReadableInstantOrdering(Implicits.scala:69)
>         at com.github.nscala_time.time.Imports$.ReadableInstantOrdering(Imports.scala:20)
>         at com.github.nscala_time.time.OrderingImplicits$class.$init$(Implicits.scala:61)
>         at com.github.nscala_time.time.Imports$.<init>(Imports.scala:20)
>         at com.github.nscala_time.time.Imports$.<clinit>(Imports.scala)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:17)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:22)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:24)
>         at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:26)
>         at $iwC$$iwC$$iwC$$iwC.<init>(<console>:28)
>         at $iwC$$iwC$$iwC.<init>(<console>:30)
>         at $iwC$$iwC.<init>(<console>:32)
>         at $iwC.<init>(<console>:34)
>         at <init>(<console>:36)
>         at .<init>(<console>:40)
>         at .<clinit>(<console>)
>         at .<init>(<console>:7)
>         at .<clinit>(<console>)
>         at $print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)
>         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)
>         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)
>         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
>         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
>         at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:628)
>         at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:636)
>         at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:641)
>         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:968)
>         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
>         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
> Your help is very much appreciated,
>
> Regards,
>
> Hammam
>
