Make sure your key pair exists in the region you're deploying to - the script defaults to us-east-1, but you can specify a different region with the --region parameter.
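For example, if your key pair was created in us-west-2 (substitute your own region and key name), the launch command below would become:

  ./spark-ec2 -k Blah -i .ssh/Blah.pem -s 2 --region=us-west-2 launch spark-ec2-test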

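If it still times out with the right region, you can check whether the EC2 API is reachable from your machine at all, independently of the script, with a quick boto session (boto is the library behind the connect_to_region call in your traceback). This is just a sketch and assumes your credentials are exported as AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY:

  # Minimal connectivity check against the EC2 API, using the same boto call
  # that spark_ec2.py makes. Assumes AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
  # are set in the environment; use whatever region you pass to spark-ec2.
  from boto import ec2

  conn = ec2.connect_to_region("us-east-1")
  print(conn.get_all_zones())  # lists availability zones if the API is reachable

If that hangs the same way, the timeout is likely happening at the network level (e.g. a proxy or firewall blocking outbound HTTPS to the EC2 endpoint) rather than in spark-ec2 itself.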

On Sat, Aug 30, 2014 at 12:53 AM, David Matheson <david.j.matheson@gmail.com> wrote:
I'm following the latest documentation on configuring a cluster on ec2
(http://spark.apache.org/docs/latest/ec2-scripts.html).  Running
  ./spark-ec2 -k Blah -i .ssh/Blah.pem -s 2 launch spark-ec2-test
fails with a generic timeout error coming from
  File "./spark_ec2.py", line 717, in real_main
    conn = ec2.connect_to_region(opts.region)

Any suggestions on how to debug the cause of the timeout?

Note: I replaced the name of my keypair with Blah.

Thanks,
David




