spark-dev mailing list archives

From Jeremy Freeman <freeman.jer...@gmail.com>
Subject Re: Should spark-ec2 get its own repo?
Date Mon, 10 Aug 2015 20:57:50 GMT
Hi all, definitely a +1 to this plan.

I also wanted to share this library for Spark + GCE by a collaborator of mine, Michael Broxton,
which seems to expand and improve on the earlier one Nick pointed us to. It's pip-installable
and not yet on spark-packages, but I'm sure he'd be game to add it.

https://github.com/broxtronix/spark-gce

> On Aug 3, 2015, at 1:25 PM, Shivaram Venkataraman <shivaram@eecs.berkeley.edu> wrote:
> 
> I sent a note to the Mesos developers and created
> https://github.com/apache/spark/pull/7899 to change the repository
> pointer. There are 3-4 open PRs right now in the mesos/spark-ec2
> repository and I'll work on migrating them to amplab/spark-ec2 later
> today.
> 
> My thought on moving the Python script is that we should have a
> wrapper shell script that just fetches the latest version of
> spark_ec2.py for the corresponding Spark branch. We already have
> separate branches in our spark-ec2 repository for different Spark
> versions so it can just be a call to `wget
> https://github.com/amplab/spark-ec2/tree/<spark-version>/driver/spark_ec2.py`.
> 
> Thanks
> Shivaram
> 
> On Sun, Aug 2, 2015 at 11:34 AM, Nicholas Chammas
> <nicholas.chammas@gmail.com> wrote:
>> On Sat, Aug 1, 2015 at 1:09 PM Matt Goodman <meawoppl@gmail.com> wrote:
>>> 
>>> I am considering porting some of this to a more general spark-cloud
>>> launcher, including google/aliyun/rackspace.  It shouldn't be hard at all
>>> given the current approach for setup/install.
>> 
>> 
>> FWIW, there are already some tools for launching Spark clusters on GCE and
>> Azure:
>> 
>> http://spark-packages.org/?q=tags%3A%22Deployment%22
>> 
>> Nick
>> 
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org
> 
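Shivaram's wrapper idea above could be sketched roughly as follows. This is only an illustration, not an agreed-upon implementation: the branch name and default are assumptions, and note that the `/tree/` URL quoted in the email resolves to a GitHub HTML page, so the sketch builds the fetch URL against the raw-content host instead.

```shell
#!/usr/bin/env bash
# Hedged sketch of the wrapper shell script described above: fetch the
# spark_ec2.py that corresponds to a given Spark branch, then run it.
set -euo pipefail

# Build the raw-content URL for a branch of amplab/spark-ec2. The /tree/
# form quoted in the email serves an HTML page; raw.githubusercontent.com
# serves the file contents directly.
build_url() {
  echo "https://raw.githubusercontent.com/amplab/spark-ec2/$1/driver/spark_ec2.py"
}

# Fetch the script for the requested branch (default branch name is an
# assumption for illustration only).
fetch_spark_ec2() {
  local branch="${1:-branch-1.5}"
  wget -q -O spark_ec2.py "$(build_url "$branch")"
}

# Example usage (commented out so sourcing this file has no side effects):
#   fetch_spark_ec2 branch-1.5 && python spark_ec2.py --help
```

Keeping the version selection in a one-line URL builder makes it easy to point the wrapper at whichever per-version branch of the spark-ec2 repository applies.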

