spark-user mailing list archives

From Arun Lists <lists.a...@gmail.com>
Subject Re: Running Spark application from command line
Date Tue, 13 Jan 2015 18:08:59 GMT
Yes, I am running with Scala 2.11. Here is what I see when I do "scala -version":

> scala -version
Scala code runner version 2.11.4 -- Copyright 2002-2013, LAMP/EPFL

On Tue, Jan 13, 2015 at 2:30 AM, Sean Owen <sowen@cloudera.com> wrote:

> It sounds like possibly a Scala version mismatch? Are you sure you're
> running with Scala 2.11 too?
>
> On Tue, Jan 13, 2015 at 6:58 AM, Arun Lists <lists.arun@gmail.com> wrote:
> > I have a Spark application that was assembled using sbt 0.13.7, Scala
> > 2.11, and Spark 1.2.0. I am running on Mac OS X Yosemite.
> >
> > In build.sbt, I use "provided" for the Spark dependencies. I can run the
> > application fine within sbt.
> >
> > I run into problems when I try to run it from the command line. Here is
> > the command I use:
> >
> > ADD_JARS=analysis/target/scala-2.11/dtex-analysis_2.11-0.1.jar scala -cp \
> >   /Applications/spark-1.2.0-bin-hadoop2.4/lib/spark-assembly-1.2.0-hadoop2.4.0.jar:analysis/target/scala-2.11/dtex-analysis_2.11-0.1.jar \
> >   com.dtex.analysis.transform.GenUserSummaryView ...
> >
> > I get the error messages shown below. Please advise what I can do to
> > resolve this issue. Thanks!
> >
> > arun
> >
> > 15/01/12 22:47:18 WARN NativeCodeLoader: Unable to load native-hadoop
> > library for your platform... using builtin-java classes where applicable
> > 15/01/12 22:47:18 WARN BlockManager: Putting block broadcast_0 failed
> > java.lang.NoSuchMethodError:
> > scala.collection.immutable.$colon$colon.hd$1()Ljava/lang/Object;
> >   at org.apache.spark.util.collection.SizeTracker$class.takeSample(SizeTracker.scala:84)
> >   at org.apache.spark.util.collection.SizeTracker$class.resetSamples(SizeTracker.scala:61)
> >   at org.apache.spark.util.collection.SizeTrackingVector.resetSamples(SizeTrackingVector.scala:25)
> >   at org.apache.spark.util.collection.SizeTracker$class.$init$(SizeTracker.scala:51)
> >   at org.apache.spark.util.collection.SizeTrackingVector.<init>(SizeTrackingVector.scala:25)
> >   at org.apache.spark.storage.MemoryStore.unrollSafely(MemoryStore.scala:236)
> >   at org.apache.spark.storage.MemoryStore.putIterator(MemoryStore.scala:136)
> >   at org.apache.spark.storage.MemoryStore.putIterator(MemoryStore.scala:114)
> >   at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:787)
> >   at org.apache.spark.storage.BlockManager.putIterator(BlockManager.scala:638)
> >   at org.apache.spark.storage.BlockManager.putSingle(BlockManager.scala:992)
> >   at org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:98)
> >   at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:84)
> >   at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
> >   at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:29)
> >   at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
> >   at org.apache.spark.SparkContext.broadcast(SparkContext.scala:945)
> >   at org.apache.spark.SparkContext.hadoopFile(SparkContext.scala:695)
> >   at org.apache.spark.SparkContext.textFile(SparkContext.scala:540)
> >   at com.dtex.analysis.transform.TransformUtils$$anonfun$2.apply(TransformUtils.scala:97)
> >   at com.dtex.analysis.transform.TransformUtils$$anonfun$2.apply(TransformUtils.scala:97)
> >   at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
> >   at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
> >   at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
> >   at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
> >   at scala.collection.TraversableLike$class.map(TraversableLike.scala:245)
> >   at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:186)
> >   at com.dtex.analysis.transform.TransformUtils$.generateUserSummaryData(TransformUtils.scala:97)
> >   at com.dtex.analysis.transform.GenUserSummaryView$.main(GenUserSummaryView.scala:77)
> >   at com.dtex.analysis.transform.GenUserSummaryView.main(GenUserSummaryView.scala)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >   at java.lang.reflect.Method.invoke(Method.java:483)
> >   at scala.reflect.internal.util.ScalaClassLoader$$anonfun$run$1.apply(ScalaClassLoader.scala:70)
> >   at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
> >   at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.asContext(ScalaClassLoader.scala:101)
> >   at scala.reflect.internal.util.ScalaClassLoader$class.run(ScalaClassLoader.scala:70)
> >   at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:101)
> >   at scala.tools.nsc.CommonRunner$class.run(ObjectRunner.scala:22)
> >   at scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:39)
> >   at scala.tools.nsc.CommonRunner$class.runAndCatch(ObjectRunner.scala:29)
> >   at scala.tools.nsc.ObjectRunner$.runAndCatch(ObjectRunner.scala:39)
> >   at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:65)
> >   at scala.tools.nsc.MainGenericRunner.run$1(MainGenericRunner.scala:87)
> >   at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:98)
> >   at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:103)
> >   at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
> >
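As an aside, the more usual way to launch an assembled application against a binary Spark distribution is bin/spark-submit rather than the bare scala runner. A sketch, assuming the install path and jar from the message above and a local master (untested here, since it depends on that local installation):

```shell
/Applications/spark-1.2.0-bin-hadoop2.4/bin/spark-submit \
  --class com.dtex.analysis.transform.GenUserSummaryView \
  --master "local[*]" \
  analysis/target/scala-2.11/dtex-analysis_2.11-0.1.jar
```

Note that this alone would not fix a Scala mismatch: if the pre-built 1.2.0 binaries are indeed compiled against Scala 2.10, a 2.11 application jar is still binary-incompatible with them, and a Spark build for Scala 2.11 would be needed either way.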
