spark-issues mailing list archives

From "Shivaram Venkataraman (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-15294) Add pivot functionality to SparkR
Date Mon, 23 May 2016 21:42:12 GMT

    [ https://issues.apache.org/jira/browse/SPARK-15294?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15297146#comment-15297146 ]

Shivaram Venkataraman commented on SPARK-15294:
-----------------------------------------------

[~mhnatiuk] I think the code diff looks pretty good and you can go ahead and open a PR for
this. Opening a PR should be straightforward if you follow the instructions at https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark#ContributingtoSpark-ContributingCodeChanges
(see the section titled 'Pull Request' in particular).

Regarding whether it should be `sum(df$earnings)`: I'd like to think of it as a pointer to
the column that should be summed. Ideally we'd get it to work with just `earnings` (i.e. without
the need for `df$`), but that has some complications we haven't figured out yet.
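
To make the intended usage concrete, here's a rough sketch of how the call from the diff could
look once it lands in SparkR. Treat it as illustrative only: the column names (year, course,
earnings) and the toy data are made up, pivot() itself is exactly what this issue proposes to
add, and the final signature may still change during review; the shape is modelled on the
existing Scala/Python pivot API.

library(SparkR)
sparkR.session()

# Toy data, purely for illustration.
df <- createDataFrame(data.frame(
  year     = c(2015, 2015, 2016, 2016),
  course   = c("dotNET", "Java", "dotNET", "Java"),
  earnings = c(10000, 20000, 5000, 30000)
))

# The reshape2 pattern R users know, applied to a local data.frame, would be roughly:
#   dcast(local_df, year ~ course, value.var = "earnings", fun.aggregate = sum)

# Proposed SparkR equivalent: group by year, pivot the distinct values of
# "course" into columns, and aggregate with sum. Here sum(df$earnings) is the
# pointer to the column being summed, as described above.
pivoted <- agg(pivot(groupBy(df, "year"), "course"), sum(df$earnings))
showDF(pivoted)

With the toy data this should give one row per year with a column per course, the same shape
dcast produces locally.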

> Add pivot functionality to SparkR
> ---------------------------------
>
>                 Key: SPARK-15294
>                 URL: https://issues.apache.org/jira/browse/SPARK-15294
>             Project: Spark
>          Issue Type: Improvement
>          Components: SparkR
>            Reporter: Mikołaj Hnatiuk
>            Priority: Minor
>              Labels: pivot
>
> R users are very used to transforming data using functions such as dcast (pkg:reshape2).
> https://github.com/apache/spark/pull/7841 introduces such functionality to the Scala and Python
> APIs. I'd like to suggest adding this functionality to the SparkR API to pivot DataFrames.
> I'd love to do this; however, my knowledge of Scala is still limited, but with proper
> guidance I can give it a try.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

