spark-user mailing list archives

From <>
Subject RE: Spark #cores
Date Wed, 18 Jan 2017 15:33:08 GMT
Are you talking about Spark SQL here?
If yes, spark.sql.shuffle.partitions is the setting that needs to be changed.
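As a minimal sketch (not from the original thread — the class and jar names are hypothetical), the Spark SQL shuffle partition count can be overridden at submit time; it defaults to 200 and governs post-shuffle parallelism for DataFrame/SQL jobs:

```shell
# Override the default of 200 shuffle partitions for Spark SQL jobs.
# com.example.MyApp and my-app.jar are placeholders, not from the thread.
spark-submit \
  --conf spark.sql.shuffle.partitions=32 \
  --class com.example.MyApp \
  my-app.jar
```

The same setting can also be changed at runtime on an existing session via `spark.conf.set("spark.sql.shuffle.partitions", "32")`.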

From: Saliya Ekanayake []
Sent: Wednesday, January 18, 2017 8:56 PM
To: User <>
Subject: Spark #cores


I am running a Spark application, setting the number of executor cores to 1 and a default parallelism
of 32, over 8 physical nodes.

The web UI shows it's running on 200 cores. I can't relate this number to the parameters I've
used. How can I control the parallelism in a more deterministic way?
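A hedged reconstruction of the configuration described above (the exact command is not shown in the thread, and the jar name is hypothetical):

```shell
# 1 core per executor, default parallelism of 32, as described by the poster.
# Note: spark.default.parallelism applies to RDD operations; Spark SQL
# shuffles are sized by spark.sql.shuffle.partitions, whose default of 200
# plausibly matches the number seen in the web UI.
spark-submit \
  --conf spark.executor.cores=1 \
  --conf spark.default.parallelism=32 \
  my-app.jar
```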

Thank you,

Saliya Ekanayake, Ph.D
Applied Computer Scientist
Network Dynamics and Simulation Science Laboratory (NDSSL)
Virginia Tech, Blacksburg

