spark-dev mailing list archives

From "Haopu Wang" <>
Subject HiveContext cannot be serialized
Date Mon, 16 Feb 2015 12:27:36 GMT
While investigating the issue described at the end of this email, I took a
look at HiveContext's code and found this change:


-  @transient protected[hive] lazy val hiveconf = new
-  @transient protected[hive] lazy val sessionState = {
-    val ss = new SessionState(hiveconf)
-    setConf(hiveconf.getAllProperties)  // Have SQLConf pick up the initial set of HiveConf.
-    ss
-  }
+  @transient protected[hive] lazy val (hiveconf, sessionState) =
+    Option(SessionState.get())
+      .orElse {

With this change, the Scala compiler always generates a Tuple2 field in
HiveContext, as the decompiled fields below show:


    private Tuple2 x$3;
    private transient OutputStream outputBuffer;
    private transient HiveConf hiveconf;
    private transient SessionState sessionState;
    private transient HiveMetastoreCatalog catalog;


That "x$3" field holds the HiveConf object as its first element, and
HiveConf cannot be serialized. Can you suggest how to resolve this issue?
Thank you very much!
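For anyone reproducing this outside Spark, here is a minimal sketch (class names are made up) of how the tuple pattern defeats @transient: the annotation lands on the two named vals, but the compiler also emits a synthetic field holding the whole Tuple2 (x$3 in the decompiled output above), and that field is not marked transient, so Java serialization still walks into the non-serializable value once the lazy val has been initialized:

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// Conf stands in for HiveConf: it is NOT Serializable.
class Conf

class Holder extends Serializable {
  // @transient annotates the named vals `conf` and `name`, but the compiler
  // also generates a synthetic Tuple2 field holding the pair, and that field
  // is not transient (as the decompiled HiveContext fields show).
  @transient lazy val (conf, name) = (new Conf, "session")
}

object TupleFieldDemo {
  def isSerializable(obj: AnyRef): Boolean =
    try {
      new ObjectOutputStream(new ByteArrayOutputStream).writeObject(obj)
      true
    } catch {
      case _: NotSerializableException => false
    }

  def main(args: Array[String]): Unit = {
    val h = new Holder
    h.name // force the lazy val so the synthetic tuple field is populated
    println(s"serializable after init = ${isSerializable(h)}")
    // the synthetic Tuple2 field is visible via reflection
    classOf[Holder].getDeclaredFields.foreach(println)
  }
}
```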




I have a streaming application which registers a temp table on a
HiveContext for each batch duration.

The application runs well in Spark 1.1.0, but now I get the error below.
Do you have any suggestions to resolve it? Thank you!

org.apache.hadoop.hive.conf.HiveConf

    - field (class "scala.Tuple2", name: "_1", type: "class
    - object (class "scala.Tuple2", (Configuration: core-default.xml, core-site.xml, mapred-default.xml, mapred-site.xml, yarn-default.xml, yarn-site.xml, hdfs-default.xml, hdfs-site.xml,
    - field (class "org.apache.spark.sql.hive.HiveContext", name: "x$3", type: "class scala.Tuple2")
    - object (class "org.apache.spark.sql.hive.HiveContext",
    - field (class "example.BaseQueryableDStream$$anonfun$registerTempTable$2", name: "sqlContext$1", type: "class org.apache.spark.sql.SQLContext")
    - object (class
    - field (class
      name: "foreachFunc$1", type: "interface scala.Function1")
    - object (class
    - field (class "org.apache.spark.streaming.dstream.ForEachDStream", name: "org$apache$spark$streaming$dstream$ForEachDStream$$foreachFunc", type: "interface scala.Function2")
    - object (class "org.apache.spark.streaming.dstream.ForEachDStream",
    - element of array (index: 0)
    - array (class "[Ljava.lang.Object;", size: 16)
    - field (class "scala.collection.mutable.ArrayBuffer", name: "array", type: "class [Ljava.lang.Object;")
    - object (class "scala.collection.mutable.ArrayBuffer",
    - field (class "org.apache.spark.streaming.DStreamGraph", name: "outputStreams", type: "class scala.collection.mutable.ArrayBuffer")
    - custom writeObject data (class
    - object (class "org.apache.spark.streaming.DStreamGraph",
    - field (class "org.apache.spark.streaming.Checkpoint", name: "graph", type: "class org.apache.spark.streaming.DStreamGraph")
    - root object (class "org.apache.spark.streaming.Checkpoint",
    at Source)
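The trace shows the checkpointed DStream graph reaching the HiveContext through the closure captured in registerTempTable. A common workaround for this shape of problem (a sketch of the general pattern, not Spark's eventual fix; all names here are hypothetical) is to stop capturing the context in the serialized closure and instead look it up through a singleton at execution time. Stripped of the Spark dependencies, the pattern looks like this:

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream,
  NotSerializableException, ObjectInputStream, ObjectOutputStream}

// Context stands in for HiveContext: it is NOT Serializable.
class Context

// Lazily created once per JVM; nothing serialized refers to the instance.
object ContextSingleton {
  lazy val instance = new Context
}

// Captures the context directly: serializing this fails, because the
// ctx field drags a non-serializable Context along.
class CapturingTask(ctx: Context) extends Serializable {
  def run(): String = ctx.getClass.getSimpleName
}

// Resolves the context at run time instead: the serialized form carries
// no Context reference, so checkpointing it is safe.
class SingletonTask extends Serializable {
  def run(): String = ContextSingleton.instance.getClass.getSimpleName
}

object WorkaroundDemo {
  def roundTrip[T <: AnyRef](obj: T): T = {
    val out = new ByteArrayOutputStream
    new ObjectOutputStream(out).writeObject(obj)
    new ObjectInputStream(new ByteArrayInputStream(out.toByteArray))
      .readObject().asInstanceOf[T]
  }

  def main(args: Array[String]): Unit = {
    println(roundTrip(new SingletonTask).run()) // survives serialization
    try roundTrip(new CapturingTask(new Context))
    catch {
      case e: NotSerializableException => println(s"capturing fails: $e")
    }
  }
}
```

In the streaming application this corresponds to obtaining the HiveContext inside the per-batch function (for example via a getOrCreate-style singleton) rather than closing over one created on the driver.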



