Interesting. As a short-term workaround, maybe create the following file containing pid 24922?
/tmp/spark-taoewang-org.apache.spark.deploy.worker.Worker-1.pid
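Something along these lines should work (a sketch of the workaround; the path and pid are the ones from this thread, so adjust for your setup):

```shell
# Write the running worker's pid into the file spark-daemon.sh expects,
# so that "spark-daemon.sh stop" can find and kill the process.
echo 24922 > /tmp/spark-taoewang-org.apache.spark.deploy.worker.Worker-1.pid
```

After that, `spark-daemon.sh stop` should pick the pid up and send the signal, assuming nothing deletes the file again in the meantime.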

Cheers

On Thu, Mar 12, 2015 at 6:51 PM, sequoiadb <mailing-list-recv@sequoiadb.com> wrote:
Nope, I can see the master pid file exists, but not the worker's:
$ ls
bitrock_installer.log
hsperfdata_root
hsperfdata_taoewang
omatmp
sbt2435921113715137753.log
spark-taoewang-org.apache.spark.deploy.master.Master-1.pid

On Mar 13, 2015, at 9:34 AM, Ted Yu <yuzhihong@gmail.com> wrote:

Does the machine have cron job that periodically cleans up /tmp dir ?
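Typical /tmp cleaners (tmpwatch, systemd-tmpfiles, or a site-specific cron job) delete files whose timestamps are older than some threshold, which would silently remove a long-lived pid file. A minimal sketch of what such a cleaner effectively does (run against a scratch directory here so it is safe to try; the 10-day threshold is just an illustration):

```shell
# Simulate an age-based /tmp cleaner: delete regular files whose
# modification time is more than 10 days old.
dir=$(mktemp -d)
touch -d "20 days ago" "$dir/stale.pid"   # pretend this pid file is old
touch "$dir/fresh.pid"                    # recently written file
find "$dir" -type f -mtime +10 -delete    # what a tmpwatch-style cleanup does
ls "$dir"                                 # only fresh.pid remains
```

If something like this runs on the box, a worker that has been up for a while would lose its pid file even though nothing is wrong with permissions.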

Cheers

On Thu, Mar 12, 2015 at 6:18 PM, sequoiadb <mailing-list-recv@sequoiadb.com> wrote:
Checking the script, it seems spark-daemon.sh is unable to stop the worker:
$ ./spark-daemon.sh stop org.apache.spark.deploy.worker.Worker 1
no org.apache.spark.deploy.worker.Worker to stop
$ ps -elf | grep spark
0 S taoewang 24922     1  0  80   0 - 733878 futex_ Mar12 ?       00:08:54 java -cp /data/sequoiadb-driver-1.10.jar,/data/spark-sequoiadb-0.0.1-SNAPSHOT.jar::/data/spark/conf:/data/spark/assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop2.4.0.jar -XX:MaxPermSize=128m -Dspark.deploy.recoveryMode=ZOOKEEPER -Dspark.deploy.zookeeper.url=centos-151:2181,centos-152:2181,centos-153:2181 -Dspark.deploy.zookeeper.dir=/data/zookeeper -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m org.apache.spark.deploy.worker.Worker spark://centos-151:7077,centos-152:7077,centos-153:7077

In spark-daemon script it tries to find $pid in /tmp/:
pid="$SPARK_PID_DIR/spark-$SPARK_IDENT_STRING-$command-$instance.pid"

In my case the pid file is supposed to be: /tmp/spark-taoewang-org.apache.spark.deploy.worker.Worker-1.pid
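If I read spark-daemon.sh correctly, SPARK_PID_DIR defaults to /tmp and SPARK_IDENT_STRING defaults to $USER when they are unset, so the path expands like this (defaults hard-coded here for illustration, with my username substituted for $USER):

```shell
# Reproduce the pid-path construction from spark-daemon.sh.
SPARK_PID_DIR="${SPARK_PID_DIR:-/tmp}"                # script falls back to /tmp
SPARK_IDENT_STRING="${SPARK_IDENT_STRING:-taoewang}"  # script falls back to $USER
command="org.apache.spark.deploy.worker.Worker"
instance=1
pid="$SPARK_PID_DIR/spark-$SPARK_IDENT_STRING-$command-$instance.pid"
echo "$pid"
```

So unless one of those environment variables is overridden, both the start and stop invocations should agree on the same /tmp path.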

However, when I go through the files in the /tmp directory, no such file exists.
/tmp has 777 permissions, and I was able to touch a file there with my current account, so it shouldn't be a permission issue.
$ ls -la / | grep tmp
drwxrwxrwx.   6 root     root      4096 Mar 13 08:19 tmp

Does anyone have any idea why the pid file didn't show up?

Thanks
TW

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org