spark-user mailing list archives

From Burak Yavuz <brk...@gmail.com>
Subject Re: Guidelines for writing SPARK packages
Date Tue, 02 Feb 2016 00:48:38 GMT
Thanks for the reply, David; I just wanted to correct one part of your response:


> If you
> want to register a release for your package you will also need to push
> the artifacts for your package to Maven central.
>

It is NOT necessary to push to Maven Central in order to make a release.
There are many packages out there that don't publish to Maven Central, e.g.
scripts and pure Python packages.

Praveen, I would suggest taking a look at:
 - the spark-package command line tool (
https://github.com/databricks/spark-package-cmd-tool), to get your package set up
 - sbt-spark-package (https://github.com/databricks/sbt-spark-package) to
help with building/publishing if you plan to use Scala in your package (a
minimal example of the sbt setup is sketched below). You could of course use
Maven as well, but we don't have a Maven plugin for Spark Packages.
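
For what it's worth, a minimal sbt-spark-package setup could look roughly
like the following. The plugin version, resolver URL, org/repo names, and
Spark/Scala versions below are illustrative assumptions; please check the
plugin README for the current values.

    // project/plugins.sbt
    // Plugin version and resolver URL are assumptions; see the README.
    resolvers += "Spark Packages Repo" at "https://dl.bintray.com/spark-packages/maven/"
    addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.3")

    // build.sbt
    name := "my-package"                    // hypothetical package name
    version := "0.1.0"
    scalaVersion := "2.10.5"
    spName := "your-github-org/my-package"  // must match your GitHub repo slug
    sparkVersion := "1.6.0"                 // Spark version to compile against
    sparkComponents += "sql"                // adds spark-sql as a "provided" dependency

With something like that in place, the plugin's tasks take care of building
and publishing the release artifacts; the exact task names may change between
versions, so verify them against the README of the version you install.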

Best,
Burak
