[ https://issues.apache.org/jira/browse/HIVE-19814?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16581448#comment-16581448 ]
Bharathkrishna Guruvayoor Murali commented on HIVE-19814:
---------------------------------------------------------
[~stakiar], I added an additional test to verify this: I added a list of ports to HiveConf and
checked that the port assignment is done accordingly when the SparkSessionManager creates the
client.
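For context, here is a minimal, self-contained sketch of the port-selection idea (this is not the actual Hive code; the property name and the selection logic are assumptions): read a comma-separated list of candidate ports from the configuration and bind the RPC server to the first one that is free, falling back to an ephemeral port only when nothing is configured.

{code:java}
import java.io.IOException;
import java.net.ServerSocket;

public class RpcPortSelection {
    // Hypothetical property carrying a comma-separated list of candidate ports.
    static final String SERVER_PORT_KEY = "hive.spark.client.rpc.server.port";

    /** Binds to the first free port in the configured list,
     *  or to an ephemeral port (0) when the list is empty. */
    static ServerSocket bind(String configuredPorts) throws IOException {
        if (configuredPorts != null && !configuredPorts.isEmpty()) {
            for (String p : configuredPorts.split(",")) {
                try {
                    return new ServerSocket(Integer.parseInt(p.trim()));
                } catch (IOException busy) {
                    // Port already in use; try the next candidate.
                }
            }
            throw new IOException("No configured port available: " + configuredPorts);
        }
        return new ServerSocket(0); // previous behaviour: random ephemeral port
    }

    public static void main(String[] args) throws IOException {
        try (ServerSocket s = bind("49152,49153,49154")) {
            System.out.println("RPC server bound to port " + s.getLocalPort());
        }
    }
}
{code}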
> RPC Server port is always random for spark
> ------------------------------------------
>
> Key: HIVE-19814
> URL: https://issues.apache.org/jira/browse/HIVE-19814
> Project: Hive
> Issue Type: Bug
> Components: Spark
> Affects Versions: 2.3.0, 3.0.0, 2.4.0, 4.0.0
> Reporter: bounkong khamphousone
> Assignee: Bharathkrishna Guruvayoor Murali
> Priority: Major
> Attachments: HIVE-19814.1.patch, HIVE-19814.2.patch
>
>
> RPC server port is always a random one. In fact, the problem is in RpcConfiguration.HIVE_SPARK_RSC_CONFIGS,
> which doesn't include SPARK_RPC_SERVER_PORT.
>
> I found this issue while trying to get hive-on-spark running inside Docker.
>
> HIVE_SPARK_RSC_CONFIGS is used by HiveSparkClientFactory.initiateSparkConf > SparkSessionManagerImpl.setup,
> and the latter calls SparkClientFactory.initialize(conf), which initializes the RPC server. This
> RPCServer is then used to create the SparkClient, which uses the RPC server port as the --remote-port
> arg. Since initiateSparkConf ignores SPARK_RPC_SERVER_PORT, the port will always be a random
> one.
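To illustrate the filtering step described in the quoted report, here is a simplified sketch (not the actual HiveSparkClientFactory code; the key names and set contents are assumptions): only Hive properties whose keys appear in HIVE_SPARK_RSC_CONFIGS are copied into the RSC configuration, so a key missing from that set, such as the RPC server port, never reaches SparkClientFactory.initialize(conf).

{code:java}
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class RscConfigFiltering {
    // Simplified stand-ins for entries of RpcConfiguration.HIVE_SPARK_RSC_CONFIGS.
    static final String SPARK_RPC_SERVER_PORT = "hive.spark.client.rpc.server.port";
    static final Set<String> HIVE_SPARK_RSC_CONFIGS = new HashSet<>();
    static {
        HIVE_SPARK_RSC_CONFIGS.add("hive.spark.client.connect.timeout");
        HIVE_SPARK_RSC_CONFIGS.add("hive.spark.client.server.connect.timeout");
        // The reported bug: without this entry, the port setting is silently dropped.
        HIVE_SPARK_RSC_CONFIGS.add(SPARK_RPC_SERVER_PORT);
    }

    /** Mirrors the whitelist filtering: only known RSC keys are copied
     *  from the Hive settings into the RPC server configuration. */
    static Map<String, String> extractRscConfigs(Map<String, String> hiveSettings) {
        Map<String, String> rscConf = new HashMap<>();
        for (Map.Entry<String, String> e : hiveSettings.entrySet()) {
            if (HIVE_SPARK_RSC_CONFIGS.contains(e.getKey())) {
                rscConf.put(e.getKey(), e.getValue());
            }
        }
        return rscConf;
    }

    public static void main(String[] args) {
        Map<String, String> hiveSettings = new HashMap<>();
        hiveSettings.put(SPARK_RPC_SERVER_PORT, "30001");
        // With the key whitelisted, the configured port reaches the RPC server setup.
        System.out.println(extractRscConfigs(hiveSettings));
    }
}
{code}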
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)