spark-user mailing list archives

From Shmuel Blitz <shmuel.bl...@similarweb.com>
Subject Re: Class cast exception while using Data Frames
Date Tue, 27 Mar 2018 06:51:20 GMT
Hi Nikhil,

Can you please post a code snippet that reproduces the issue?

Shmuel
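
For context: Spark represents struct-typed values, including struct map keys, as org.apache.spark.sql.Row at runtime rather than as Scala tuples, which is why the getAs call below fails with a ClassCastException. A minimal sketch of the usual workaround, reading the map with Row keys and rebuilding the tuples by position (the object name and sample data here are illustrative assumptions, not from the thread):

```scala
import org.apache.spark.sql.{Row, SparkSession}

object MapKeyWorkaround {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("struct-map-key-demo")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical sample data mirroring the schema in the thread:
    // a map<struct<_1:string,_2:string>, double> column plus a long count.
    val input = Seq(
      (Map(("us", "mobile") -> 1.5, ("uk", "web") -> 2.0), 10L)
    ).toDF("myMap", "count")

    val converted = input.rdd.map { row =>
      // Struct keys materialize as Row, not Tuple2, so read the map
      // with Row keys and reconstruct each tuple key by position.
      row.getAs[Map[Row, Double]](0).map {
        case (k, v) => (k.getString(0), k.getString(1)) -> v
      }
    }.collect()

    converted.foreach(println)
    spark.stop()
  }
}
```

Reading the column as Map[Row, Double] avoids the cast entirely; the conversion back to tuple keys only happens on the driver or inside the lambda, where plain Scala types are fine.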

On Tue, Mar 27, 2018 at 12:55 AM, Nikhil Goyal <nownikhil@gmail.com> wrote:

>  |-- myMap: map (nullable = true)
>  |    |-- key: struct
>  |    |    |-- _1: string (nullable = true)
>  |    |    |-- _2: string (nullable = true)
>  |    |-- value: double (valueContainsNull = true)
>  |-- count: long (nullable = true)
>
> On Mon, Mar 26, 2018 at 1:41 PM, Gauthier Feuillen <gauthier@dataroots.io>
> wrote:
>
>> Can you give the output of “printSchema” ?
>>
>>
>> On 26 Mar 2018, at 22:39, Nikhil Goyal <nownikhil@gmail.com> wrote:
>>
>> Hi guys,
>>
>> I have a Map[(String, String), Double] as one of my columns. Using
>>
>> input.getAs[Map[(String, String), Double]](0)
>>
>> throws an exception:
>>
>> Caused by: java.lang.ClassCastException:
>> org.apache.spark.sql.catalyst.expressions.GenericRowWithSchema cannot be
>> cast to scala.Tuple2
>>
>> Even the schema says that key is of type struct of (string, string).
>>
>> Any idea why this is happening?
>>
>>
>> Thanks
>>
>> Nikhil
>>
>>
>>
>


-- 
Shmuel Blitz
Big Data Developer
Email: shmuel.blitz@similarweb.com
www.similarweb.com
