spark-dev mailing list archives

From Matei Zaharia <>
Subject Re: Spark on Scala 2.11
Date Tue, 13 May 2014 01:27:36 GMT
We can build the REPL separately for each version of Scala, or even give that package a different
name in Scala 2.11.

Scala 2.11’s REPL actually added two flags, -Yrepl-class-based and -Yrepl-outdir, that encompass
the two modifications we made to the REPL (using classes instead of objects to wrap each line,
and grabbing the files from some directory). So it might be possible to run it without modifications
using just a simple wrapper class around it. That would definitely simplify things!
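As a sketch of what that wrapper might look like (the object name and output directory are illustrative, not from any existing code), the stock 2.11 REPL could be launched with the two flags passed through `processArguments`, exactly as one would on the command line:

```scala
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.ILoop

// Hypothetical wrapper: reuse the unmodified Scala 2.11 REPL by enabling
// the two new flags instead of maintaining a forked copy of the REPL.
object ReplWrapper {
  def main(args: Array[String]): Unit = {
    val settings = new Settings
    settings.usejavacp.value = true
    // Pass the flags verbatim; processArguments parses them into the
    // corresponding REPL settings.
    settings.processArguments(List(
      "-Yrepl-class-based",                 // wrap each line in a class, not an object
      "-Yrepl-outdir", "/tmp/repl-classes"  // write generated class files to a real dir
    ), processAll = true)
    new ILoop().process(settings)
  }
}
```

The class files written under the `-Yrepl-outdir` directory are what a driver could then serve to executors, which is the second of the two modifications described above.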

BTW did the non-REPL parts run fine on 2.11?


On May 12, 2014, at 2:09 PM, Anand Avati <> wrote:

> Matei,
> Thanks for confirming. I was looking specifically at the REPL part and how
> it can be significantly simplified with 2.11 Scala, without having to
> inherit a full copy of a refactored repl inside Spark. I am happy to
> investigate/contribute a simpler 2.11-based REPL if this were seen as a
> priority (1.1 does not seem "too far" away). However a 2.10 compatible
> cross build would still require a separate (existing) REPL code for the
> 2.10 build, no?
> Thanks.
> On Sun, May 11, 2014 at 2:08 PM, Matei Zaharia <>wrote:
>> We do want to support it eventually, possibly as early as Spark 1.1 (which
>> we’d cross-build on Scala 2.10 and 2.11). If someone wants to look at it
>> before, feel free to do so! Scala 2.11 is very close to 2.10 so I think
>> things will mostly work, except for possibly the REPL (which has required
>> porting over code from the Scala REPL in each version).
>> Matei
>> On May 8, 2014, at 6:33 PM, Anand Avati <> wrote:
>>> Is there an ongoing effort (or intent) to support Spark on Scala 2.11?
>>> Approximate timeline?
>>> Thanks
