spark-user mailing list archives

From Daniel Haviv <daniel.ha...@veracity-group.com>
Subject error: Unable to find encoder for type stored in a Dataset. when trying to map through a DataFrame
Date Wed, 02 Nov 2016 16:57:38 GMT
Hi,
I have the following scenario:

scala> val df = spark.sql("select * from danieltest3")
df: org.apache.spark.sql.DataFrame = [iid: string, activity: string ... 34
more fields]

Now when I try to map over the rows, I get:
scala> df.map(r=>r.toSeq)
<console>:32: error: Unable to find encoder for type stored in a Dataset.
Primitive types (Int, String, etc) and Product types (case classes) are
supported by importing spark.implicits._  Support for serializing other
types will be added in future releases.
       df.map(r=>r.toSeq)


What am I missing here?

Thank you,
Daniel
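
[Editor's note: the error concerns the result type of the map — `Seq[Any]` has no implicit `Encoder`, while primitives (via `spark.implicits._`, auto-imported in spark-shell) and `Row` (with an explicit encoder built from the schema) do. A minimal sketch of two possible workarounds in the same spark-shell session, assuming Spark 2.0's API:]

```
scala> // Map to a type spark.implicits._ can encode, e.g. a String per row:
scala> df.map(r => r.mkString(","))

scala> // Or keep the Rows by supplying an explicit encoder from the schema:
scala> import org.apache.spark.sql.catalyst.encoders.RowEncoder
scala> implicit val rowEnc = RowEncoder(df.schema)
scala> df.map(r => r)
```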
