spark-user mailing list archives

From tstewart <>
Subject sparkR 1.5.1 batch yarn-client mode failing on daemon.R not found
Date Wed, 04 Nov 2015 16:10:00 GMT
(Apologies if this re-posts; I've been having trouble with the various web
front ends to this mailing list.)

I have the following script in a file named test.R:

sc <- sparkR.init(master="yarn-client")
sqlContext <- sparkRSQL.init(sc)
df <- createDataFrame(sqlContext, faithful)

If I submit this with "sparkR test.R", "R CMD BATCH test.R", or
"Rscript test.R", it fails with this error:
15/10/29 08:08:49 INFO r.BufferedStreamThread: Fatal error: cannot open file No such file or directory
15/10/29 08:08:59 ERROR executor.Executor: Exception in task 0.0 in stage 1.0 (TID 1) Accept timed out

However, if I launch an interactive sparkR shell and cut/paste those same
commands, it runs fine.
It also runs fine on the same Hadoop cluster with Spark 1.4.1.
And, it runs fine from batch mode if I just use sparkR.init() and not
sparkR.init(master="yarn-client").
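
For contrast, the batch-mode variant that does work would look like the
sketch below. The library(SparkR) line is an assumption on my part: the
sparkR launcher preloads the package, but under Rscript or R CMD BATCH it
has to be loaded explicitly (and SparkR's R directory must be on the
library path).

```r
library(SparkR)  # assumed: required under Rscript / R CMD BATCH

# Default master (no yarn-client) -- the variant reported to work in batch mode
sc <- sparkR.init()
sqlContext <- sparkRSQL.init(sc)
df <- createDataFrame(sqlContext, faithful)
```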
