spark-issues mailing list archives

From "Dave DeCaprio (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-26988) Spark overwrites spark.scheduler.pool if set in configs
Date Tue, 26 Feb 2019 15:12:00 GMT

    [ https://issues.apache.org/jira/browse/SPARK-26988?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16778032#comment-16778032
] 

Dave DeCaprio commented on SPARK-26988:
---------------------------------------

Yes, it would be an issue for any property that starts with "spark".

In my case I was able to work around the issue by removing the spark.scheduler.pool property
from my configuration, so the issue isn't urgent for me, but I did want to note it.

> Spark overwrites spark.scheduler.pool if set in configs
> -------------------------------------------------------
>
>                 Key: SPARK-26988
>                 URL: https://issues.apache.org/jira/browse/SPARK-26988
>             Project: Spark
>          Issue Type: Improvement
>          Components: Scheduler
>    Affects Versions: 2.4.0
>            Reporter: Dave DeCaprio
>            Priority: Minor
>
> If you set a default spark.scheduler.pool in your configuration when you create a SparkSession,
> and then attempt to override that setting by calling setLocalProperty on the underlying SparkContext,
> as described in the Spark documentation - [https://spark.apache.org/docs/latest/job-scheduling.html#fair-scheduler-pools] -
> the override won't take effect: Spark keeps using the original pool name.
> I've traced this down to SQLExecution.withSQLConfPropagated, which copies any key that
> starts with "spark" from the session state to the thread-local properties. This can end up
> overwriting the scheduler pool, which is selected by the spark.scheduler.pool local property.
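
The overwrite described above can be illustrated with a small, self-contained sketch (plain Python, no Spark required). The dictionaries `session_confs` and `local_properties` are hypothetical stand-ins for the SparkSession's conf and the thread-local properties the scheduler reads; the loop mimics what withSQLConfPropagated is reported to do:

```python
# Session-level configs, set when the SparkSession was created.
session_confs = {
    "spark.scheduler.pool": "default_pool",
    "spark.sql.shuffle.partitions": "200",
}

# Thread-local properties: the user overrode the pool for this thread
# via setLocalProperty, as the fair-scheduler docs suggest.
local_properties = {"spark.scheduler.pool": "my_pool"}

# Sketch of the copy step: every session key starting with "spark" is
# propagated over the thread-local properties, clobbering the override.
for key, value in session_confs.items():
    if key.startswith("spark"):
        local_properties[key] = value

# The per-thread override is gone; the session default wins.
print(local_properties["spark.scheduler.pool"])  # prints "default_pool"
```

This also shows why the reporter's workaround helps: if spark.scheduler.pool is absent from the session configs, the loop never touches that key and the thread-local value survives.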



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

