spark-user mailing list archives

From aminn_524 <>
Subject Spark stdout and stderr
Date Fri, 04 Jul 2014 10:14:43 GMT

I am running Spark 1.0.0 against a standalone cluster with one master and
two slaves, submitting the job with spark-submit. The job reads its input
from HDFS and writes its results back to HDFS. So far everything is fine:
the results are written to HDFS correctly. What concerns me is that when I
check the stdout log for each worker, it is empty. I don't know whether it
is supposed to be empty. In stderr I see the following:
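For reference, output only appears in a worker's stdout log if code running on the executors actually prints something. A minimal sketch of the kind of job described above (the app name and HDFS paths are hypothetical):

```scala
// Minimal sketch, assuming a standalone cluster; paths and app name are hypothetical.
// println calls inside executor-side closures (map/foreach) go to each worker's
// stdout log under $SPARK_HOME/work/<app-id>/<executor-id>/stdout, not to the
// driver console. If the job never prints on the executors, stdout stays empty.
import org.apache.spark.{SparkConf, SparkContext}

object StdoutDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("StdoutDemo"))
    val lines = sc.textFile("hdfs://master:9000/input")        // hypothetical input path
    lines.foreach(line => println(line))                       // lands in worker stdout
    lines.saveAsTextFile("hdfs://master:9000/output")          // hypothetical output path
    sc.stop()
  }
}
```

So an empty stdout is normal for a job that only reads and writes HDFS without printing anything on the executors.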

stderr log page for Some(app-20140704174955-0002)

Executor Command: "java" "-cp" "::
-XX:MaxPermSize=128m" "-Xms512M" "-Xmx512M"
" "akka.tcp://spark@master:54477/user/CoarseGrainedScheduler" "0" "slave2"
" "akka.tcp://sparkWorker@slave2:41483/user/Worker"

14/07/04 17:50:14 ERROR CoarseGrainedExecutorBackend: 
Driver Disassociated [akka.tcp://sparkExecutor@slave2:33758] -> 
[akka.tcp://spark@master:54477] disassociated! Shutting down.
