spark-user mailing list archives

From Michael Armbrust <mich...@databricks.com>
Subject Re: error: Unable to find encoder for type stored in a Dataset. when trying to map through a DataFrame
Date Wed, 02 Nov 2016 17:59:28 GMT
Spark doesn't know how to turn a Seq[Any] back into a row. You would need
to create a case class (or some other type whose schema Spark can figure
out). What are you trying to do?

If you don't care about specific fields and you just want to serialize the
whole value as an opaque blob, you can use Kryo:

import org.apache.spark.sql.{Encoder, Encoders}
implicit val anyEncoder: Encoder[Seq[Any]] = Encoders.kryo[Seq[Any]]
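As a minimal sketch of the case-class route, assuming a spark-shell session where `spark` is predefined and picking just two of the 36 columns from the quoted schema below (`iid` and `activity`; the case class name and the `toUpperCase` transformation are purely illustrative):

```scala
// Runs inside spark-shell, where a SparkSession named `spark` already exists.
import spark.implicits._  // provides encoders for primitives and case classes

// Mirror the columns you care about in a case class so Spark can
// derive an encoder (and hence a schema) for the mapped Dataset.
case class Activity(iid: String, activity: String)

// .as[Activity] converts the untyped DataFrame into a typed Dataset[Activity].
val ds = spark.sql("select iid, activity from danieltest3").as[Activity]

// map is now typed, so no custom encoder is needed:
val upper = ds.map(a => a.copy(activity = a.activity.toUpperCase))
```

Unlike the Kryo encoder, this keeps the columnar schema, so the result is still queryable with SQL and benefits from Tungsten's optimized serialization.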

On Wed, Nov 2, 2016 at 9:57 AM, Daniel Haviv <
daniel.haviv@veracity-group.com> wrote:

> Hi,
> I have the following scenario:
>
> scala> val df = spark.sql("select * from danieltest3")
> df: org.apache.spark.sql.DataFrame = [iid: string, activity: string ... 34
> more fields]
>
> Now I'm trying to map through the rows I'm getting:
> scala> df.map(r=>r.toSeq)
> <console>:32: error: Unable to find encoder for type stored in a Dataset.
> Primitive types (Int, String, etc) and Product types (case classes) are
> supported by importing spark.implicits._  Support for serializing other
> types will be added in future releases.
>        df.map(r=>r.toSeq)
>
>
> What am I missing here?
>
> Thank you,
> Daniel
>
