spark-user mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: Upgrading to Spark 1.0.0 causes NoSuchMethodError
Date Thu, 26 Jun 2014 06:51:29 GMT
Try deleting the .ivy2 directory in your home directory and then doing an
sbt clean assembly; I guess that would solve this issue.
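
For reference, that would be something like this (assuming the default Ivy
cache location, ~/.ivy2, in your home directory):

    $ rm -rf ~/.ivy2
    $ sbt clean assembly

Note that this wipes everything Ivy has cached, so the next build will
re-download all dependencies.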

Thanks
Best Regards


On Thu, Jun 26, 2014 at 3:10 AM, Robert James <srobertjames@gmail.com>
wrote:

> In case anyone else is having this problem: deleting all of ivy's cache,
> then doing an sbt clean, then recompiling everything, repackaging, and
> reassembling seems to have solved the problem.  (From the sbt docs, it
> seems that having to delete ivy's cache indicates a bug in sbt.)
>
> On 6/25/14, Robert James <srobertjames@gmail.com> wrote:
> > Thanks Paul.  I'm unable to follow the discussion on SPARK-2075.  But
> > how would you recommend I test or follow up on that? Is there a
> > workaround?
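
One common way to reduce assembly/classpath clashes (not necessarily a fix
for the JIRA itself) is to depend on exactly the Spark version your
installation runs and mark it "provided", so your fat jar doesn't bundle a
second copy of the Spark classes. A minimal build.sbt sketch, assuming
sbt-assembly and illustrative names and versions:

    name := "myproj"

    scalaVersion := "2.10.4"

    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0" % "provided"

spark-submit already puts the installation's Spark assembly on the
classpath, so the app jar only needs your own classes and non-Spark
dependencies.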
> >
> > On 6/25/14, Paul Brown <prb@mult.ifario.us> wrote:
> >> Hi, Robert --
> >>
> >> I wonder if this is an instance of SPARK-2075:
> >> https://issues.apache.org/jira/browse/SPARK-2075
> >>
> >> -- Paul
> >>
> >> —
> >> prb@mult.ifario.us | Multifarious, Inc. | http://mult.ifario.us/
> >>
> >>
> >> On Wed, Jun 25, 2014 at 6:28 AM, Robert James <srobertjames@gmail.com>
> >> wrote:
> >>
> >>> On 6/24/14, Robert James <srobertjames@gmail.com> wrote:
> >>> > My app works fine under Spark 0.9.  I just tried upgrading to Spark
> >>> > 1.0 by downloading the Spark distro to a dir, changing the sbt file,
> >>> > and running sbt assembly, but now I get NoSuchMethodErrors when
> >>> > trying to use spark-submit.
> >>> >
> >>> > I copied in the SimpleApp example from
> >>> > http://spark.apache.org/docs/latest/quick-start.html and get the
> >>> > same error:
> >>> >
> >>> > $ /usr/local/share/spark/bin/spark-submit --class SimpleApp
> >>> > target/scala-2.10/myproj-assembly-1.0.jar
> >>> > Spark assembly has been built with Hive, including Datanucleus jars
> >>> > on classpath
> >>> > Exception in thread "main" java.lang.NoSuchMethodError:
> >>> > org.apache.spark.SparkContext$.$lessinit$greater$default$2()Lscala/collection/Map;
> >>> >       at SimpleApp$.main(SimpleApp.scala:10)
> >>> >       at SimpleApp.main(SimpleApp.scala)
> >>> >       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>> >       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >>> >       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >>> >       at java.lang.reflect.Method.invoke(Method.java:601)
> >>> >       at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:292)
> >>> >       at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
> >>> >       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> >>> >
> >>> > How can I migrate to Spark 1.0.0?
> >>> >
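
For reference, the 1.0 quick-start builds the context from an explicit
SparkConf. A minimal sketch (the input path is a placeholder):

    import org.apache.spark.{SparkConf, SparkContext}

    object SimpleApp {
      def main(args: Array[String]) {
        // 1.0 style: configure via SparkConf rather than SparkContext's
        // optional constructor arguments.
        val conf = new SparkConf().setAppName("Simple Application")
        val sc = new SparkContext(conf)
        val logData = sc.textFile("YOUR_SPARK_HOME/README.md").cache()
        val numAs = logData.filter(line => line.contains("a")).count()
        val numBs = logData.filter(line => line.contains("b")).count()
        println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
      }
    }

If that still throws the same NoSuchMethodError, the jars you compiled
against and the assembly spark-submit loads are probably from different
builds, which is the mismatch SPARK-2075 describes.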
> >>>
> >>> I've done `sbt clean`, deleted the entire ivy2 cache, and still get
> >>> the above error on both my code and the official Spark example.  Can
> >>> anyone guide me on how to debug this?
> >>>
> >>> How does Spark find the /usr/local/share/spark directory? Is there a
> >>> variable somewhere I need to set to point to that, or that might point
> >>> to the old spark? I've left the old spark dir on the machine (just
> >>> changed the symlink) - can that be causing problems?
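
A quick way to see which installation is actually in play (generic shell
checks):

    $ which spark-submit
    $ ls -l /usr/local/share/spark
    $ echo $SPARK_HOME

As far as I know, bin/spark-submit resolves the Spark home relative to the
script's own location, so if the symlink now points at the 1.0 distro, that
part should be fine; the mismatch is more likely between the jars your app
was compiled against and the assembly on that classpath.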
> >>>
> >>> How should I approach this?
> >>>
> >>
> >
>
