spark-user mailing list archives

From Nicholas Chammas <nicholas.cham...@gmail.com>
Subject Re: spark 1.2 ec2 launch script hang
Date Tue, 27 Jan 2015 17:19:21 GMT
For those who found that absolute vs. relative path for the pem file
mattered, what OS and shell are you using? What version of Spark are you
using?

~/ vs. an absolute path shouldn’t matter. Your shell expands the ~/ to the
absolute path before the argument ever reaches spark-ec2 (i.e., tilde expansion).
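One caveat worth checking: the shell only performs tilde expansion when the ~ is unquoted, so a quoted path like "~/my-key.pem" reaches spark-ec2 with a literal ~ in it. A minimal sketch of the difference (the file name my-key.pem is just an example):

```python
import os.path

# What spark-ec2 would receive if the user quoted the tilde on the
# command line: the shell leaves the "~" character untouched.
literal = "~/my-key.pem"

# os.path.expanduser recovers the intended absolute path by replacing
# the leading "~" with the user's home directory.
expanded = os.path.expanduser(literal)

print(literal)    # still starts with "~" -- no file at this literal path
print(expanded)   # e.g. /home/<user>/my-key.pem
```

If the script itself called os.path.expanduser on the identity-file argument, a quoted ~ would stop being a failure mode at all.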

Absolute vs. relative path (e.g. ../../path/to/pem) also shouldn’t matter,
since we fixed that for Spark 1.2.0
<https://issues.apache.org/jira/browse/SPARK-4137>. Maybe there’s some case
that we missed?
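Since the reported symptom is a silent hang rather than an error, an up-front existence check on the user's side can rule out a bad path before launching. This is a hedged sketch, not spark-ec2's actual code; the function name check_identity_file and the error message are illustrative:

```python
import os
import sys

def check_identity_file(path):
    """Fail fast if the pem file doesn't exist, instead of hanging later."""
    # Handle a literal "~" in case the shell didn't expand it.
    path = os.path.expanduser(path)
    if not os.path.isfile(path):
        print("ERROR: identity file %s does not exist" % path)
        sys.exit(1)
    return path
```

Something along these lines, run before the "Waiting for cluster to enter 'ssh-ready' state" loop, would turn the hang into an immediate, actionable error.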

Nick

On Tue Jan 27 2015 at 10:10:29 AM Charles Feduke <charles.feduke@gmail.com>
wrote:

Absolute path means no ~. Also verify that the path to the file is
> correct. For some reason the Python code does not validate that the file
> exists and will simply hang (this is the same reason a literal ~ hangs).
> On Mon, Jan 26, 2015 at 10:08 PM Pete Zybrick <pzybrick@gmail.com> wrote:
>
>> Try using an absolute path to the pem file
>>
>>
>>
>> > On Jan 26, 2015, at 8:57 PM, ey-chih chow <eychih@hotmail.com> wrote:
>> >
>> > Hi,
>> >
>> > I used the spark-ec2 script of spark 1.2 to launch a cluster.  I have
>> > modified the script according to
>> >
>> > https://github.com/grzegorz-dubicki/spark/commit/5dd8458d2ab9753aae939b3bb33be953e2c13a70
>> >
>> > But the script was still hung at the following message:
>> >
>> > Waiting for cluster to enter 'ssh-ready' state.............................................
>> >
>> > Is there anything else I should do to make it succeed?  Thanks.
>> >
>> >
>> > Ey-Chih Chow
>> >
>> >
>> >
>> > --
>> > View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/spark-1-2-ec2-launch-script-hang-tp21381.html
>> > Sent from the Apache Spark User List mailing list archive at Nabble.com.
>> >
>> > ---------------------------------------------------------------------
>> > To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>> > For additional commands, e-mail: user-help@spark.apache.org
>> >
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>> For additional commands, e-mail: user-help@spark.apache.org
>>
