From apu <apumishra...@gmail.com>
Subject Adding Hive support to existing SparkSession (or starting PySpark with Hive support)
Date Mon, 19 Dec 2016 17:58:16 GMT
This is for Spark 2.0:

If I wanted Hive support on a new SparkSession, I would build it with:

from pyspark.sql import SparkSession

spark = SparkSession \
    .builder \
    .enableHiveSupport() \
    .getOrCreate()

However, PySpark already creates a SparkSession for me, which appears to
lack Hive support. How can I either:

(a) Add Hive support to an existing SparkSession (rough sketch of what I
mean below),

or

(b) Configure PySpark so that the SparkSession it creates at startup has
Hive support enabled?
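
For (a), the closest I can think of is stopping the shell's session and
rebuilding it. A rough sketch, assuming getOrCreate() really returns a
brand-new session once the old one is stopped:

from pyspark.sql import SparkSession

# Stop the SparkSession (and its underlying SparkContext) that the
# pyspark shell created at startup.
spark.stop()

# Build a replacement with Hive support enabled. I'm assuming
# getOrCreate() creates a fresh session here rather than handing
# back the stopped one.
spark = SparkSession \
    .builder \
    .enableHiveSupport() \
    .getOrCreate()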
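
For (b), my guess is that the spark.sql.catalogImplementation setting is
the relevant knob, i.e. launching the shell with:

pyspark --conf spark.sql.catalogImplementation=hive

(I'm not certain that's the right config, or whether a hive-site.xml on
the classpath is needed as well.)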

Thanks!

Apu
