spark-user mailing list archives

From Michael Armbrust <mich...@databricks.com>
Subject Re: issue with coalesce in Spark 2.0.0
Date Wed, 03 Aug 2016 17:06:46 GMT
Spark 2.0 is not binary compatible with Spark 1.x; you'll need to recompile
your jar against the 2.0 artifacts.
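
A likely cause of the exact `NoSuchMethodError` shown below: in Spark 2.0, `RDD.coalesce` gained an optional `partitionCoalescer` parameter, which changes the compiled JVM method descriptor, so a jar built against the 1.6 signature can't link at runtime. Recompiling means pointing the build at the 2.0 artifacts and Scala 2.11. A minimal build.sbt sketch (the project name is a placeholder; the dependency coordinates are the standard Spark ones):

```scala
// Hypothetical build.sbt for recompiling the application against Spark 2.0.
name := "my-spark-app"  // placeholder project name

scalaVersion := "2.11.8"  // Spark 2.0.0 is published for Scala 2.11

libraryDependencies ++= Seq(
  // "provided" keeps Spark classes out of the assembled jar,
  // since the cluster supplies them at runtime
  "org.apache.spark" %% "spark-core" % "2.0.0" % "provided"
)
```

Note that `%%` appends the Scala binary version (`_2.11`) to the artifact name, so the same build file can't accidentally mix Scala 2.10 artifacts in.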

On Tue, Aug 2, 2016 at 2:57 AM, 陈宇航 <yuhang.chen@foxmail.com> wrote:

> Hi all.
>
>
>     I'm testing on Spark 2.0.0 and found an issue when using coalesce in
> my code.
>
>     The procedure is simple doing a coalesce for a RDD[Stirng], and this
> happened:
>
> *       java.lang.NoSuchMethodError:
> org.apache.spark.rdd.RDD.coalesce(IZLscala/math/Ordering;)Lorg/apache/spark/rdd/RDD;
> *
>
>     The environment is *Spark 2.0.0* with *Scala 2.11.8*, while the
> application jar is compiled with *Spark 1.6.2* and *Scala 2.10.6*.
>
>     BTW, it works fine with a jar compiled against *Spark 2.0.0* and
> *Scala 2.11.8*. So I presume it's a compatibility issue?
>
>
>
> Best regards.
>
