spark-user mailing list archives

From kant kodali <kanth...@gmail.com>
Subject Questions on HDFS with Spark
Date Tue, 18 Apr 2017 23:07:58 GMT
Hi All,

I've been using Spark standalone for a while, and now it's time for me to
install HDFS. When a Spark worker goes down, the Spark master restarts the
worker. By contrast, when a DataNode process dies, it does not appear to
be the NameNode's job to restart it. If that's the case: 1) should I use
a process supervisor like monit for DataNodes? 2) Also, is it standard
practice to colocate the Spark master and the NameNode on the same machine,
and to colocate Spark workers and DataNodes on the same machines?
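
For context on question 1, here is a minimal sketch of what supervising a
DataNode with monit might look like. The installation paths and the process
match pattern are assumptions (a Hadoop 3.x layout under /opt/hadoop); adjust
them to your own setup:

```
# Hypothetical monit stanza -- paths and daemon commands are assumptions,
# not taken from any particular deployment.
# Matches the DataNode JVM by its main class and restarts it if it dies.
check process datanode matching "org.apache.hadoop.hdfs.server.datanode.DataNode"
  start program = "/opt/hadoop/bin/hdfs --daemon start datanode"
  stop program  = "/opt/hadoop/bin/hdfs --daemon stop datanode"
```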

Thanks!
