hive-issues mailing list archives

From "Nemon Lou (JIRA)" <>
Subject [jira] [Commented] (HIVE-12538) After set spark related config, SparkSession never get reused
Date Sat, 28 Nov 2015 10:16:10 GMT


Nemon Lou commented on HIVE-12538:

Actually, there are two bugs:
1. The isSparkConfigUpdated property of HiveConf always gets updated when setting values from the client side (beeline):
    set spark.yarn.queue=QueueA; ---> isSparkConfigUpdated = true
    set hive.execution.engine=spark; ---> isSparkConfigUpdated = false
2. SparkTask uses an operation-level conf object rather than the session-level conf. That makes "conf.setSparkConfigUpdated(false);" in SparkUtilities meaningless from the session's perspective.
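The interaction of the two bugs can be sketched as follows. This is a minimal, hypothetical model (the class, field, and method names below are stand-ins, not Hive's actual code): a spark.* setting marks the session conf dirty, the session is recreated whenever the conf is dirty, and the reset of the flag lands on an operation-level copy, so the session-level flag never clears and every query starts a fresh session.

```java
import java.util.HashMap;
import java.util.Map;

public class SparkSessionReuseSketch {

    /** Stand-in for HiveConf: tracks spark.* updates with a dirty flag. */
    static class Conf {
        final Map<String, String> props = new HashMap<>();
        boolean sparkConfigUpdated;

        void set(String key, String value) {
            props.put(key, value);
            // Bug 1: any spark.* key marks the conf dirty, and nothing
            // ever clears the flag on the session-level conf.
            if (key.startsWith("spark.")) {
                sparkConfigUpdated = true;
            }
        }

        /** Operation-level copy handed to the task, as in bug 2. */
        Conf operationCopy() {
            Conf copy = new Conf();
            copy.props.putAll(props);
            copy.sparkConfigUpdated = sparkConfigUpdated;
            return copy;
        }
    }

    static int sessionsStarted = 0;

    /** Mirrors the reuse check: recreate the session when the conf is dirty. */
    static String runQuery(Conf sessionConf, String currentSession) {
        if (currentSession == null || sessionConf.sparkConfigUpdated) {
            sessionsStarted++;
            currentSession = "session-" + sessionsStarted;
        }
        // Bug 2: the flag is cleared on an operation-level copy only,
        // so the session-level flag stays true for the next query.
        Conf opConf = sessionConf.operationCopy();
        opConf.sparkConfigUpdated = false;
        return currentSession;
    }

    public static void main(String[] args) {
        Conf conf = new Conf();
        conf.set("spark.yarn.queue", "QueueA"); // flag -> true, never cleared
        String session = null;
        session = runQuery(conf, session);
        session = runQuery(conf, session);
        session = runQuery(conf, session);
        // Three queries start three sessions instead of reusing one.
        System.out.println(sessionsStarted); // prints 3
    }
}
```

Clearing the flag on the session-level conf instead of the operation-level copy (or not setting it for spark.* keys that do not require a new session) would let the second and third queries reuse the first session.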

> After set spark related config, SparkSession never get reused
> -------------------------------------------------------------
>                 Key: HIVE-12538
>                 URL:
>             Project: Hive
>          Issue Type: Bug
>          Components: Spark
>    Affects Versions: 1.3.0
>            Reporter: Nemon Lou
> Hive on Spark, yarn-cluster mode.
> After setting "set spark.yarn.queue=QueueA;",
> run the query "select count(*) from test" 3 times and you will find 3 different yarn applications.
> Two of the yarn applications are in FINISHED & SUCCEEDED state, and one is in RUNNING & UNDEFINED state waiting for the next job.
> And if you submit one more "select count(*) from test", the third one will move to FINISHED & SUCCEEDED state and a new yarn application will start up.

This message was sent by Atlassian JIRA
