spark-user mailing list archives

From thbeh <th...@thbeh.com>
Subject Spark SQL 1.6.1 issue
Date Thu, 18 Aug 2016 04:05:10 GMT
Running the query below, I keep hitting a "local class incompatible" exception. Does anyone know the cause?

val rdd = csc.cassandraSql("""
    select *, concat('Q', d_qoy) as qoy
    from store_sales
    join date_dim on ss_sold_date_sk = d_date_sk
    join item on ss_item_sk = i_item_sk
  """)
  .groupBy("i_category")
  .pivot("qoy")
  .agg(round(sum("ss_sales_price") / 1000000, 2))

The source data is the TPC-DS test data set, and I am running the query from Zeppelin.


INFO [2016-08-18 03:15:58,429] ({task-result-getter-2}
Logging.scala[logInfo]:58) - Lost task 3.0 in stage 3.0 (TID 52) on executor
ceph5.example.my: java.io.InvalidClassException
(org.apache.spark.sql.catalyst.expressions.Literal; local class
incompatible: stream classdesc serialVersionUID = 3305180847846277455, local
class serialVersionUID = -4259705229845269663) [duplicate 1]
INFO [2016-08-18 03:15:58,429] ({task-result-getter-3}
Logging.scala[logInfo]:58) - Lost task 2.0 in stage 3.0 (TID 51) on executor
ceph5.example.my: java.io.InvalidClassException
(org.apache.spark.sql.catalyst.expressions.Literal; local class
incompatible: stream classdesc serialVersionUID = 3305180847846277455, local
class serialVersionUID = -4259705229845269663) [duplicate 2]
INFO [2016-08-18 03:15:58,430] ({task-result-getter-3}
Logging.scala[logInfo]:58) - Lost task 6.0 in stage 3.0 (TID 55) on executor
ceph5.example.my: java.io.InvalidClassException
(org.apache.spark.sql.catalyst.expressions.Literal; local class
incompatible: stream classdesc serialVersionUID = 3305180847846277455, local
class serialVersionUID = -4259705229845269663) [duplicate 3]
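An InvalidClassException with "local class incompatible" means the serialVersionUID baked into the serialized stream differs from the one in the class loaded on the receiving JVM; in practice this usually indicates the driver and the executors are running different builds of the same library (here, spark-catalyst). A minimal sketch of the underlying Java serialization rule, using a hypothetical class rather than Spark's Literal:

```scala
import java.io._

// Hypothetical serializable class standing in for
// org.apache.spark.sql.catalyst.expressions.Literal
@SerialVersionUID(3305180847846277455L)
class Payload(val value: Int) extends Serializable

// Round-trip through Java serialization. Deserialization succeeds only
// because the reading JVM loads a Payload class with the SAME
// serialVersionUID; if the UID differed (a different build of the jar),
// readObject would throw java.io.InvalidClassException.
val bytes = {
  val buf = new ByteArrayOutputStream()
  val out = new ObjectOutputStream(buf)
  out.writeObject(new Payload(42))
  out.close()
  buf.toByteArray
}

val restored = new ObjectInputStream(new ByteArrayInputStream(bytes))
  .readObject()
  .asInstanceOf[Payload]

println(restored.value)
```

If this diagnosis applies here, the fix would be to make sure Zeppelin and every executor load the same Spark 1.6.1 assembly rather than a mix of builds.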

Thanks





