spark-user mailing list archives

From Nikhil Goyal <nownik...@gmail.com>
Subject Understanding deploy mode config
Date Thu, 03 Oct 2019 04:19:48 GMT
Hi all,

In a PySpark application, is the Python process itself the driver, or does Spark
start a separate driver process? If the Python process is the driver, then how
does setting "spark.submit.deployMode" to "cluster" in the Spark conf come
into play?

conf = (SparkConf()
            .setMaster("yarn")
            .set("spark.submit.deployMode", "cluster"))
sc = SparkContext(conf=conf)

Is the SparkContext created on the application master, or on the machine
where this Python process runs?

Thanks
Nikhil
