spark-user mailing list archives

From Sean Owen <>
Subject Re: Accessing spark context from executors?
Date Fri, 01 Aug 2014 10:33:31 GMT
You should use sc.hadoopConfiguration to get the Hadoop configuration.
Making a new one just gets you default values, which may work for your
purposes, but is probably not ideal. This configuration object should
be something you can send in the closure.
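A minimal sketch of that idea, assuming an existing SparkContext `sc` and an RDD `rdd` (both hypothetical here). One caveat: Hadoop's Configuration is not itself Serializable, so a common workaround in Spark 1.x is to wrap it in SerializableWritable before shipping it to executors, e.g. via a broadcast variable. The output path below is made up for illustration.

```scala
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.spark.SerializableWritable

// Wrap the driver's Hadoop configuration so it can be serialized
// into closures; broadcast it once rather than per-task.
val confBroadcast = sc.broadcast(new SerializableWritable(sc.hadoopConfiguration))

rdd.foreachPartition { partition =>
  // Unwrap the Configuration on the executor side.
  val conf = confBroadcast.value.value
  val fs = FileSystem.get(conf)
  // Hypothetical output path; one file per partition to avoid
  // concurrent writes to the same HDFS file.
  val out = fs.create(new Path("/tmp/output-" + java.util.UUID.randomUUID()))
  try {
    partition.foreach(record => out.writeBytes(record.toString + "\n"))
  } finally {
    out.close()
  }
}
```

Using foreachPartition keeps FileSystem setup per partition instead of per record; there is no need for the SparkContext itself on the executors, only the configuration.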

On Fri, Aug 1, 2014 at 2:16 AM, Sung Hwan Chung
<> wrote:
> Is there any way to get SparkContext object from executor? Or hadoop
> configuration, etc. The reason is that I would like to write to HDFS from
> executors.
