spark-user mailing list archives

From Palash Gupta <>
Subject [Spark 2.1.0] Resource Scheduling Challenge in pyspark sparkSession
Date Thu, 05 Jan 2017 13:35:32 GMT
Hi User Team,
I'm trying to schedule resources in Spark 2.1.0 using the code below, but all the CPU cores
are still captured by a single Spark application, so no other application can start.
Could you please help me out?
sqlContext = SparkSession.builder \
    .master("spark://") \
    .config("spark.sql.warehouse.dir", "/tmp/PM/") \
    .config("spark.sql.shuffle.partitions", "6") \
    .config("spark.cores.max", "5") \
    .config("spark.executor.cores", "2") \
    .config("spark.driver.memory", "8g") \
    .config("spark.executor.memory", "4g") \
    .appName(APP_NAME) \
    .getOrCreate()
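[Editor's note: on a standalone cluster, spark.cores.max caps the total cores one application may claim, while spark.executor.cores sets the cores per executor, so with the values above an application should take floor(5 / 2) = 2 executors and 4 cores, leaving the rest for other apps. A minimal sketch of that arithmetic, plain Python with no Spark required (the helper name and the 12-core cluster are my assumptions, not from the post):]

```python
def executors_for_app(cores_max, executor_cores, cluster_cores):
    """Executors and cores a standalone-mode app can claim.

    spark.cores.max caps the app's total cores; each executor needs
    spark.executor.cores, so the app launches the floor of the ratio,
    bounded by what the cluster actually offers.
    """
    cap = min(cores_max, cluster_cores)
    executors = cap // executor_cores
    return executors, executors * executor_cores

# With the settings from the snippet above on a hypothetical 12-core cluster:
execs, used = executors_for_app(cores_max=5, executor_cores=2, cluster_cores=12)
print(execs, used)   # 2 executors using 4 cores
print(12 - used)     # 8 cores remain for other applications
```

[One common pitfall worth checking: these settings only take effect if they are applied before the application's first SparkSession/SparkContext is created; builder.config() values passed to getOrCreate() after a session already exists are largely ignored.]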

Thanks & Best Regards,
Engr. Palash Gupta
WhatsApp/Viber: +8801817181502
Skype: palash2494


