spark-user mailing list archives

From Aliaksei Litouka <aliaksei.lito...@gmail.com>
Subject Re: Error starting EC2 cluster
Date Thu, 08 May 2014 15:53:56 GMT
Well... the reason was an out-of-date version of Python (2.6.6) on the
machine where I ran the script; `subprocess.check_output` only exists in
Python 2.7 and later. If anyone else experiences this issue, just update
your Python.
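For context, here is a minimal sketch of the kind of backport one could use if upgrading Python is not an option (this is an illustrative workaround, not part of the spark-ec2 script itself): it defines `check_output` in terms of `subprocess.Popen` when the attribute is missing, which is exactly what happens on 2.6.

```python
import subprocess

# subprocess.check_output was added in Python 2.7; on 2.6 the attribute
# lookup fails with the AttributeError shown in the traceback below.
# Hypothetical shim: patch in a minimal equivalent built on Popen.
if not hasattr(subprocess, "check_output"):
    def check_output(*popenargs, **kwargs):
        # Run the command, capturing stdout, like check_output does.
        process = subprocess.Popen(stdout=subprocess.PIPE, *popenargs, **kwargs)
        output, _ = process.communicate()
        retcode = process.poll()
        if retcode:
            # Mirror check_output's behavior on a non-zero exit status.
            raise subprocess.CalledProcessError(retcode, popenargs[0])
        return output
    subprocess.check_output = check_output

# Works the same whether the shim was needed or not:
result = subprocess.check_output(["echo", "hello"])
```

On 2.7+ the `hasattr` guard is true, so the shim is skipped and the stock implementation is used unchanged.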


On Sun, May 4, 2014 at 7:51 PM, Aliaksei Litouka <aliaksei.litouka@gmail.com
> wrote:

> I am using Spark 0.9.1. When I try to start an EC2 cluster with the
> spark-ec2 script, an error occurs and the following message is issued:
> AttributeError: 'module' object has no attribute 'check_output'. By that
> time, the EC2 instances are up and running, but Spark doesn't seem to be
> installed on them. Any ideas on how to fix this?
>
> $ ./spark-ec2 -k my_key -i /home/alitouka/my_key.pem -s 1
> --region=us-east-1 --instance-type=m3.medium launch test_cluster
> Setting up security groups...
> Searching for existing cluster test_cluster...
> Don't recognize m3.medium, assuming type is pvm
> Spark AMI: ami-5bb18832
> Launching instances...
> Launched 1 slaves in us-east-1c, regid = r-XXXXXXXX
> Launched master in us-east-1c, regid = r-XXXXXXXX
> Waiting for instances to start up...
> Waiting 120 more seconds...
> Generating cluster's SSH key on master...
> ssh: connect to host ec2-XX-XXX-XXX-XX.compute-1.amazonaws.com port 22:
> Connection refused
> Error executing remote command, retrying after 30 seconds: Command
> '['ssh', '-o', 'StrictHostKeyChecking=no', '-i',
> '/home/alitouka/my_key.pem', '-t', '-t',
> u'root@ec2-XX-XXX-XXX-XX.compute-1.amazonaws.com', "\n      [ -f
> ~/.ssh/id_rsa ] ||\n        (ssh-keygen -q -t rsa -N '' -f ~/.ssh/id_rsa
> &&\n         cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys)\n    "]'
> returned non-zero exit status 255
> ssh: connect to host ec2-XX-XXX-XXX-XX.compute-1.amazonaws.com port 22:
> Connection refused
> Error executing remote command, retrying after 30 seconds: Command
> '['ssh', '-o', 'StrictHostKeyChecking=no', '-i',
> '/home/alitouka/my_key.pem', '-t', '-t',
> u'root@ec2-XX-XXX-XXX-XX.compute-1.amazonaws.com', "\n      [ -f
> ~/.ssh/id_rsa ] ||\n        (ssh-keygen -q -t rsa -N '' -f ~/.ssh/id_rsa
> &&\n         cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys)\n    "]'
> returned non-zero exit status 255
> ssh: connect to host ec2-XX-XXX-XXX-XX.compute-1.amazonaws.com port 22:
> Connection refused
> Error executing remote command, retrying after 30 seconds: Command
> '['ssh', '-o', 'StrictHostKeyChecking=no', '-i',
> '/home/alitouka/my_key.pem', '-t', '-t',
> u'root@ec2-XX-XXX-XXX-XX.compute-1.amazonaws.com', "\n      [ -f
> ~/.ssh/id_rsa ] ||\n        (ssh-keygen -q -t rsa -N '' -f ~/.ssh/id_rsa
> &&\n         cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys)\n    "]'
> returned non-zero exit status 255
> Warning: Permanently added 'ec2-XX-XXX-XXX-XX.compute-1.amazonaws.com,54.227.205.82'
> (RSA) to the list of known hosts.
> Connection to ec2-XX-XXX-XXX-XX.compute-1.amazonaws.com closed.
> Traceback (most recent call last):
>   File "./spark_ec2.py", line 806, in <module>
>     main()
>   File "./spark_ec2.py", line 799, in main
>     real_main()
>   File "./spark_ec2.py", line 684, in real_main
>     setup_cluster(conn, master_nodes, slave_nodes, opts, True)
>   File "./spark_ec2.py", line 419, in setup_cluster
>     dot_ssh_tar = ssh_read(master, opts, ['tar', 'c', '.ssh'])
>   File "./spark_ec2.py", line 624, in ssh_read
>     return subprocess.check_output(
> AttributeError: 'module' object has no attribute 'check_output'
>
