spark-user mailing list archives

From "Liu, Raymond" <raymond....@intel.com>
Subject RE: Spark Master on Hadoop Job Tracker?
Date Tue, 21 Jan 2014 01:24:05 GMT
I'm not sure what you're aiming to solve. When you mention the Spark Master, I guess you probably mean
Spark standalone mode? In that case, the Spark cluster does not necessarily need to be coupled with the
Hadoop cluster. If you're aiming for better data locality, then yes, running the Spark workers on the HDFS
DataNodes might help. As for the Spark Master, I think its placement doesn't matter much.
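To illustrate the point above, a minimal sketch of a standalone deployment (script names and paths vary by Spark version; `<master-host>` and `$SPARK_HOME` are placeholders, and port 7077 is the standalone master's default):

```shell
# On any convenient node -- the Master only schedules, it does not read
# HDFS blocks, so its placement has no effect on data locality:
$SPARK_HOME/sbin/start-master.sh
# The master then listens at spark://<master-host>:7077

# On each HDFS DataNode -- co-locating workers with the data lets tasks
# read blocks locally instead of over the network:
$SPARK_HOME/sbin/start-worker.sh spark://<master-host>:7077
```

(In Spark releases from this era the worker script was named `start-slave.sh`; the idea is the same.)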

Best Regards,
Raymond Liu

-----Original Message-----
From: mharwida [mailto:majdharwida@yahoo.com] 
Sent: Tuesday, January 21, 2014 2:14 AM
To: user@spark.incubator.apache.org
Subject: Spark Master on Hadoop Job Tracker?

Hi,

Should the Spark Master run on the Hadoop JobTracker node (and the Spark workers on the TaskTrackers),
or can the Spark Master reside on any Hadoop node?

Thanks
Majd



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Master-on-Hadoop-Job-Tracker-tp680.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
