spark-issues mailing list archives

From "Sun Rui (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-14751) SparkR fails on Cassandra map with numeric key
Date Thu, 12 May 2016 13:30:13 GMT

    [ https://issues.apache.org/jira/browse/SPARK-14751?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15281516#comment-15281516 ]

Sun Rui commented on SPARK-14751:
---------------------------------

If the key type is integer or long, toString works.
[~shivaram] how about adding this workaround?
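
The workaround suggested above, converting integer/long map keys with toString before the row reaches R's deserializer, can be sketched generically. This is illustrative Python, not SparkR's actual Scala SerDe code, and the helper name is hypothetical:

```python
# Generic sketch of the proposed workaround: before a row is handed to R,
# convert non-string map keys to their string form (the moral equivalent
# of calling toString on each key in the Scala SerDe).
# `stringify_map_keys` is a hypothetical helper, not part of SparkR.

def stringify_map_keys(m):
    """Return a copy of m with every key converted to a string."""
    return {str(k): v for k, v in m.items()}

# A map value like the one inserted above, {0: 12}, becomes {"0": 12},
# which can then be deserialized on the R side as a named list.
row_map = {0: 12}
print(stringify_map_keys(row_map))  # {'0': 12}
```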

> SparkR fails on Cassandra map with numeric key
> ----------------------------------------------
>
>                 Key: SPARK-14751
>                 URL: https://issues.apache.org/jira/browse/SPARK-14751
>             Project: Spark
>          Issue Type: Bug
>          Components: SparkR
>    Affects Versions: 1.6.1
>            Reporter: Michał Matłoka
>
> Hi,
> I have created an issue for the Spark Cassandra Connector ( https://datastax-oss.atlassian.net/projects/SPARKC/issues/SPARKC-366 ) but after a bit of digging it seems this is a better place for this issue:
> {code}
> CREATE TABLE test.map (
>     id text,
>     somemap map<tinyint, decimal>,
>     PRIMARY KEY (id)
> );
> insert into test.map(id, somemap) values ('a', { 0 : 12 }); 
> {code}
> {code}
>   sqlContext <- sparkRSQL.init(sc)
>   test <- read.df(sqlContext, source = "org.apache.spark.sql.cassandra", keyspace = "test", table = "map")
>   head(test)
> {code}
> Results in:
> {code}
> 16/04/19 14:47:02 ERROR RBackendHandler: dfToCols on org.apache.spark.sql.api.r.SQLUtils failed
> Error in readBin(con, raw(), stringLen, endian = "big") :
>   invalid 'n' argument
> {code}
> The problem occurs even for an int key. For a text key it works. Every scenario works under Scala & Python.
>  



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

