spark-user mailing list archives

From Palash Gupta <spline_pal...@yahoo.com.INVALID>
Subject [Spark 2.1.0] Resource Scheduling Challenge in pyspark sparkSession
Date Thu, 05 Jan 2017 13:35:32 GMT
Hi User Team,
I'm trying to schedule resources in Spark 2.1.0 using the code below, but all the CPU cores
are still captured by a single Spark application, so no other application can start.
Could you please help me out?
from pyspark.sql import SparkSession

# APP_NAME is defined elsewhere in my script
sqlContext = (SparkSession.builder
    .master("spark://172.26.7.192:7077")
    .config("spark.sql.warehouse.dir", "/tmp/PM/")
    .config("spark.sql.shuffle.partitions", "6")
    .config("spark.cores.max", "5")
    .config("spark.executor.cores", "2")
    .config("spark.driver.memory", "8g")
    .config("spark.executor.memory", "4g")
    .appName(APP_NAME)
    .getOrCreate())

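In case it helps, a minimal sketch of how the effective settings can be checked from the
driver (using the sqlContext session built above; the keys are the ones set in the builder):

    # Print the configuration values the application actually received,
    # so a silently ignored or overridden spark.cores.max would show up here.
    sc = sqlContext.sparkContext
    for key in ("spark.cores.max", "spark.executor.cores", "spark.executor.memory"):
        print(key, "=", sc.getConf().get(key, "not set"))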

Thanks & Best Regards,
Engr. Palash Gupta
WhatsApp/Viber: +8801817181502
Skype: palash2494
