spark-user mailing list archives

From Rohit Damkondwar <>
Subject Hive Metastore open connections even after closing Spark context and session
Date Tue, 15 Aug 2017 20:53:20 GMT
Hi. I am using Spark to query Hive and then run transformations. My
Scala app creates multiple Spark applications. A new SparkContext (and
SparkSession) is created only after closing the previous SparkSession and
SparkContext.

However, after stopping sc and spark, the connections to the Hive metastore
(MySQL) are not destroyed properly. For every Spark app I can see around 5
MySQL connections being created, while the old connections are still active.
Eventually, MySQL starts rejecting new connections once 150 connections are
open. How can I force Spark to close the Hive metastore connections to
MySQL (after spark.stop() and sc.stop())?

sc = SparkContext
spark = SparkSession
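For reference, the create-then-stop loop described above looks roughly like the sketch below. This is a hedged reconstruction, not the poster's actual code: the app name, loop count, and the `SELECT 1` stand-in query are assumptions.

```scala
// Minimal sketch of the pattern described above (assumes Spark 2.x with
// Hive support on the classpath and a hive-site.xml pointing at the
// MySQL-backed metastore). Requires a Spark deployment to actually run.
import org.apache.spark.sql.SparkSession

object MetastoreLoop {
  def main(args: Array[String]): Unit = {
    for (i <- 1 to 3) {
      val spark = SparkSession.builder()
        .appName(s"hive-app-$i")   // hypothetical app name
        .enableHiveSupport()
        .getOrCreate()
      val sc = spark.sparkContext

      // Stand-in for the real Hive query plus transformations.
      spark.sql("SELECT 1").collect()

      // Stopping the session and context tears down the Spark app, but
      // the observed behaviour is that the metastore client's JDBC
      // connections to MySQL remain open afterwards.
      spark.stop()
      sc.stop()
    }
  }
}
```

One thing worth checking with this pattern: in Spark 2.x, `SparkSession` keeps active/default session state per JVM, so calling `SparkSession.clearActiveSession()` and `SparkSession.clearDefaultSession()` after `spark.stop()` may help ensure the next `getOrCreate()` builds a genuinely fresh session rather than reusing cached state.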

Rohit S Damkondwar
