spark-issues mailing list archives

From "Thomas Graves (Jira)" <>
Subject [jira] [Commented] (SPARK-29465) Unable to configure SPARK UI (spark.ui.port) in spark yarn cluster mode.
Date Wed, 16 Oct 2019 08:23:00 GMT


Thomas Graves commented on SPARK-29465:

Note the problem I see with just using the port the user specified: many users don't know how their different environments behave, and they end up with random failures that they don't necessarily understand. The default port is 4040, not 0. So I'm a bit on the fence about whether the solution here is purely to use the port when a specific port is specified. If it's a range of ports, that makes more sense. So I would like to understand your use case and why you are trying to specify a specific port.
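The "range of ports" point refers to Spark's bind-retry behavior: when a service cannot bind to its configured port, Spark retries on successive ports, up to `spark.port.maxRetries` attempts. A minimal sketch of that retry loop, using plain sockets (an illustration of the idea, not Spark's actual `Utils.startServiceOnPort` implementation):

```python
import socket

def start_service_on_port(start_port: int, max_retries: int = 16) -> socket.socket:
    """Bind a TCP socket, retrying on successive ports on failure.

    Mimics the effect of spark.port.maxRetries: a fixed start_port is
    retried at start_port+1, start_port+2, ...; port 0 always asks the
    OS for a free ephemeral port. Illustrative sketch only.
    """
    for attempt in range(max_retries + 1):
        port = start_port + attempt if start_port != 0 else 0
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        try:
            sock.bind(("127.0.0.1", port))
            return sock  # caller reads the actual port via getsockname()
        except OSError:
            sock.close()
    raise OSError(f"could not bind after {max_retries} retries from port {start_port}")
```

This is why a fixed port can fail unpredictably when something else already holds it, while port 0 never collides but lands on a random port.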

> Unable to configure SPARK UI (spark.ui.port) in spark yarn cluster mode. 
> -------------------------------------------------------------------------
>                 Key: SPARK-29465
>                 URL:
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Submit, YARN
>    Affects Versions: 3.0.0
>            Reporter: Vishwas Nalka
>            Priority: Major
> I'm trying to restrict the ports used by a Spark app which is launched in YARN cluster mode. All ports (viz. driver, executor, block manager) can be specified using the respective properties except the UI port. The Spark app is launched using Java code, and setting the property spark.ui.port in SparkConf doesn't seem to help. Even setting a JVM option -Dspark.ui.port="some_port" does not spawn the UI on the required port.
> From the logs of the Spark app, *_the property spark.ui.port is overridden and the JVM property '-Dspark.ui.port=0' is set_* even though it is never set to 0.
> _(Run in Spark 1.6.2) From the logs ->_
> _command: LD_LIBRARY_PATH="/usr/hdp/$LD_LIBRARY_PATH" {{JAVA_HOME}}/bin/java -server -XX:OnOutOfMemoryError='kill %p' -Xms4096m -Xmx4096m {{PWD}}/tmp '-Dspark.blockManager.port=9900' '-Dspark.driver.port=9902' '-Dspark.fileserver.port=9903' '-Dspark.broadcast.port=9904' '-Dspark.port.maxRetries=20' '-Dspark.ui.port=0' '-Dspark.executor.port=9905'_
> _19/10/14 16:39:59 INFO Utils: Successfully started service 'SparkUI' on port 35167._
> _19/10/14 16:39:59 INFO SparkUI: Started SparkUI at_ [_http://|]
> Even a *spark-submit command with --conf spark.ui.port* does not spawn the UI on the required port.
> _(Run in Spark 2.4.4)_
> _./bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode cluster --driver-memory 4g --executor-memory 2g --executor-cores 1 --conf spark.ui.port=12345 --conf spark.driver.port=12340 --queue default examples/jars/spark-examples_2.11-2.4.4.jar_
> _From the logs:_
> _19/10/15 00:04:05 INFO ui.SparkUI: Stopped Spark web UI at [|]_
> _command: {{JAVA_HOME}}/bin/java -server -Xmx2048m {{PWD}}/tmp '-Dspark.ui.port=0' '-Dspark.driver.port=12340' <LOG_DIR> -XX:OnOutOfMemoryError='kill %p' org.apache.spark.executor.CoarseGrainedExecutorBackend --driver-url spark:// --executor-id <executorId> --hostname <hostname> --cores 1 --app-id application_1570992022035_0089 --user-class-path [file:$PWD/__app__.jar|file://%24pwd/__app__.jar] 1> <LOG_DIR>/stdout 2> <LOG_DIR>/stderr_
> It looks like the application master overrides this and sets a JVM property before launch, resulting in a random UI port even though spark.ui.port is set by the user.
> In these links:
>  # [] (line 214)
>  # [] (line 75)
> I can see that the method _*run() in the above files sets a system property UI_PORT*_ and _*spark.ui.port*_.
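The `-Dspark.ui.port=0` that the ApplicationMaster injects tells the UI server to bind to port 0, which makes the OS pick a free ephemeral port; that is why the UI comes up on an arbitrary port such as 35167 in the 1.6.2 log above. A minimal sketch of that behavior with a plain socket (not Spark's actual Jetty-based UI server):

```python
import socket

# Binding to port 0 asks the OS for a free ephemeral port, so the bind
# never collides with another driver on the same YARN node -- at the
# cost of the port being unpredictable (the behavior reported here).
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.bind(("127.0.0.1", 0))
assigned_port = sock.getsockname()[1]  # the OS-assigned port
```

That trade-off, never failing to bind versus never knowing the port in advance, is presumably why the YARN ApplicationMaster forces port 0 for the UI instead of honoring the user's spark.ui.port.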

This message was sent by Atlassian Jira

