systemml-dev mailing list archives

From "Niketan Pansare" <npan...@us.ibm.com>
Subject Re: Scala support ?
Date Wed, 28 Sep 2016 21:29:36 GMT

That is correct. Again, it is a good idea to make the Scala version
explicit, either in the jar naming or in the release notes.

I am not sure what the recommended practice in the Spark community is for
developing applications:
- Should one release two jars with explicit Scala versions (Scala 2.10 and
Scala 2.11) and let the user download the correct one (see the naming
sketch below), OR
- only release one jar (by sticking to the rule that the Scala version
matches the default Scala version of the supported Spark version) and
provide instructions for compiling with a different Scala version? Spark
follows this option.
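
For illustration, the two-jar option usually follows the sbt cross-build
convention of suffixing the Scala binary version to the artifactId. A
minimal sketch of what the user-facing coordinates could look like (the
coordinates here are hypothetical, not SystemML's actual ones):

    <!-- Hypothetical coordinates; the _2.11 suffix tells the user at a
         glance which Scala series the jar was built against. -->
    <dependency>
      <groupId>org.apache.systemml</groupId>
      <artifactId>systemml_2.11</artifactId>
      <version>0.11.0-incubating</version>
    </dependency>

This is the naming scheme Spark itself uses for its own modules
(spark-core_2.10, spark-core_2.11, and so on).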

For compiling with a different Scala version, isn't it as simple as passing
a flag (-Dscala.version=2.11) to mvn, rather than modifying the pom itself?
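
That flag works as long as the pom references the Scala version only
through properties, since a -D value on the command line overrides a
<properties> entry. A minimal sketch, with assumed property names (not
necessarily SystemML's actual pom):

    <properties>
      <!-- Defaults; overridable via -Dscala.version / -Dscala.binary.version -->
      <scala.version>2.10.6</scala.version>
      <scala.binary.version>2.10</scala.binary.version>
    </properties>

    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>${scala.version}</version>
    </dependency>

    mvn clean package -Dscala.version=2.11.8 -Dscala.binary.version=2.11

One caveat: dependencies whose artifactIds embed the binary version (e.g.
spark-core_2.10) also need ${scala.binary.version} in the artifactId for
the flag alone to be enough.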

Thanks,

Niketan Pansare
IBM Almaden Research Center
E-mail: npansar At us.ibm.com
http://researcher.watson.ibm.com/researcher/view.php?person=us-npansar



From:	Luciano Resende <luckbr1975@gmail.com>
To:	dev@systemml.incubator.apache.org
Date:	09/28/2016 01:19 PM
Subject:	Re: Scala support ?



On Wed, Sep 28, 2016 at 12:55 PM, Niketan Pansare <npansar@us.ibm.com>
wrote:

> I think making the Scala version explicit is a good idea. Implicitly, we
> are consistent with the Spark version supported in the release.
>
>
Implicitly only if the user does not choose to build the Spark release with
Scala 2.11; in that case there will be a mismatch, and the user will have
to hack their way through the pom, changing the necessary places and
rebuilding it.
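
For reference, the flow a user ends up with in that case is roughly Spark's
own documented Scala 2.11 build (per the Spark 1.6 build docs):

    ./dev/change-scala-version.sh 2.11
    mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package

followed by a matching hand-edit of SystemML's pom, which is exactly the
hack described above.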



--
Luciano Resende
http://twitter.com/lresende1975
http://lresende.blogspot.com/


