spark-user mailing list archives

From Nikhil Goyal <nownik...@gmail.com>
Subject Re: Class cast exception while using Data Frames
Date Mon, 26 Mar 2018 21:55:58 GMT
 |-- myMap: map (nullable = true)
 |    |-- key: struct
 |    |    |-- _1: string (nullable = true)
 |    |    |-- _2: string (nullable = true)
 |    |-- value: double (valueContainsNull = true)
 |-- count: long (nullable = true)
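
The key shows up as a struct, which Spark hands back at runtime as a Row rather than a scala.Tuple2, hence the cast failure. A minimal sketch of one possible workaround, assuming the column index 0 and the row value `input` from the original snippet, reads the map with Row keys and rebuilds the tuples afterwards:

    import org.apache.spark.sql.Row

    // Struct-typed map keys are returned as Row objects, so read the column
    // as Map[Row, Double] first and rebuild the (String, String) keys by hand.
    val myMap: Map[(String, String), Double] =
      input.getAs[Map[Row, Double]](0).map {
        case (key, value) => ((key.getString(0), key.getString(1)), value)
      }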

On Mon, Mar 26, 2018 at 1:41 PM, Gauthier Feuillen <gauthier@dataroots.io>
wrote:

> Can you give the output of “printSchema”?
>
>
> On 26 Mar 2018, at 22:39, Nikhil Goyal <nownikhil@gmail.com> wrote:
>
> Hi guys,
>
> I have a Map[(String, String), Double] as one of my columns. Using
>
> input.getAs[Map[(String, String), Double]](0)
>
> throws exception: Caused by: java.lang.ClassCastException:
> org.apache.spark.sql.catalyst.expressions.GenericRowWithSchema cannot be
> cast to scala.Tuple2
>
> Even the schema says that key is of type struct of (string, string).
>
> Any idea why this is happening?
>
>
> Thanks
>
> Nikhil
>
>
>
