spark-user mailing list archives

From Arush Kharbanda <>
Subject Re: is there a master for spark cluster in ec2
Date Thu, 29 Jan 2015 10:22:53 GMT
Hi Mohit,

You can set the master instance type with -m.

To set up a cluster, you need to use the ec2/spark-ec2 script.

You need to create an AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in your
AWS web console under Security Credentials, and pass them to the script above.
Once you do that, you should be able to set up your cluster using spark-ec2.
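For example, a launch might look like this (the key pair name, instance type,
and cluster name below are placeholders for illustration, not values from this
thread):

```shell
# Make the credentials visible to the spark-ec2 script
export AWS_ACCESS_KEY_ID=<your-access-key-id>
export AWS_SECRET_ACCESS_KEY=<your-secret-access-key>

# -k/-i: EC2 key pair name and the matching private key file
# -s: number of slave nodes; -m: master instance type
./ec2/spark-ec2 -k my-keypair -i ~/.ssh/my-keypair.pem \
  -s 2 -m m3.large launch my-spark-cluster
```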


On Thu, Jan 29, 2015 at 6:41 AM, Mohit Singh <> wrote:

> Hi,
>   Probably a naive question.. But I am creating a spark cluster on ec2
> using the ec2 scripts in there..
> But is there a master param I need to set..
> ./bin/pyspark --master [ ] ??
> I don't yet fully understand the ec2 concepts so just wanted to confirm
> this??
> Thanks
> --
> Mohit
> "When you want success as badly as you want the air, then you will get it.
> There is no other secret of success."
> -Socrates
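To answer the --master question directly: spark-ec2 prints the master's
hostname when the launch finishes, and a Spark standalone master listens on
port 7077 by default, so pointing pyspark at the cluster would look roughly
like this (the hostname is a placeholder):

```shell
# <master-hostname> is the address spark-ec2 prints at the end of "launch"
./bin/pyspark --master spark://<master-hostname>:7077
```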



*Arush Kharbanda* || Technical Teamlead ||
