spark-user mailing list archives

From Shashank Mandil <mandil.shash...@gmail.com>
Subject Local spark context on an executor
Date Tue, 21 Mar 2017 22:34:48 GMT
Hi All,

I am using Spark in YARN cluster mode.
When I run a YARN application, it creates multiple executors on the Hadoop
DataNodes for processing.
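
For reference, the application is submitted along these lines (the class
name and jar are placeholders, not my actual ones):

    spark-submit --master yarn --deploy-mode cluster \
      --class com.example.MyApp \
      my-app.jar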

Is it possible for me to create a local Spark context (master=local) on
these executors, so that code running inside an executor can get hold of a
SparkContext?

Theoretically, since each executor is its own Java process, this should be
doable, shouldn't it?
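
To make it concrete, here is a rough sketch of the kind of thing I have in
mind; the object name, app names, and the toy sum job are all made up for
illustration:

    import org.apache.spark.{SparkConf, SparkContext}

    object LocalContextOnExecutor {
      def main(args: Array[String]): Unit = {
        // The normal driver-side context, created when the app is
        // submitted in YARN cluster mode.
        val sc = new SparkContext(new SparkConf().setAppName("outer"))

        sc.parallelize(1 to 4, 4).foreachPartition { _ =>
          // The attempt in question: build a second, local-master
          // context inside the executor JVM and run a small job on it.
          val localSc = new SparkContext(
            new SparkConf().setMaster("local").setAppName("inner"))
          try {
            val sum = localSc.parallelize(1 to 100).reduce(_ + _)
            println(s"local sum = $sum")
          } finally {
            localSc.stop()
          }
        }

        sc.stop()
      }
    }

The inner context would only live for the duration of that one task.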

Thanks,
Shashank
