https://issues.apache.org/jira/browse/SPARK-17774?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15547429#comment-15547429
Oscar D. Lara Yejas commented on SPARK-17774:
---------------------------------------------
[~shivaram]: I concur with Shivaram. Besides, I have already implemented method head() in my PR
11336:
https://github.com/apache/spark/pull/11336
If you wanted to implement method head() alone, you would still need to make all the changes I made
for PR 11336, except for the five lines of code in method collect(). If that's the case, I'd
suggest merging PR 11336 instead.
[~falaki]: In the corner cases where there's no parent DataFrame, we can return an empty value
instead of throwing an error. This behavior is already implemented in PR 11336. Also, even though
R doesn't have a collect() method, I think it's still useful to be able to turn a Column into an
R vector. Perhaps a function called as.vector()?
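For illustration, here's a minimal sketch of such a conversion, written purely in R. It is not the
PR 11336 code: it assumes the Column keeps a reference to its parent SparkDataFrame in a
hypothetical parent slot, whereas SparkR's actual Column class only stores the Java column
reference jc.
{code:r}
library(SparkR)

# Hypothetical sketch: turn a SparkR Column into a plain R vector.
# Assumes a hypothetical `parent` slot holding the Column's parent
# SparkDataFrame (not part of SparkR's real Column class).
columnToVector <- function(col) {
  if (is.null(col@parent)) {
    # Corner case: no parent DataFrame -- return an empty value
    # rather than throwing an error.
    return(vector())
  }
  # Project the single column, collect() it as a one-column local
  # data.frame, and unwrap it into a plain R vector.
  collect(select(col@parent, col))[[1]]
}
{code}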
Thanks folks!
> Add support for head on DataFrame Column
> ----------------------------------------
>
> Key: SPARK-17774
> URL: https://issues.apache.org/jira/browse/SPARK-17774
> Project: Spark
> Issue Type: Sub-task
> Components: SparkR
> Affects Versions: 2.0.0
> Reporter: Hossein Falaki
>
> There was a lot of discussion on SPARK-9325. To summarize the conversation on that ticket
> regarding {{collect}}:
> * Pro: Ease of use and maximum compatibility with the existing R API
> * Con: We do not want to increase maintenance cost by opening up an arbitrary API. With Spark's
> DataFrame API, {{collect}} does not work on {{Column}}, and there is no need for it to work
> in R.
> This ticket is strictly about {{head}}. I propose supporting {{head}} on {{Column}} because:
> 1. R users are already used to calling {{head(iris$Sepal.Length)}}. When they do that
> on a SparkDataFrame column, they get an error, which is not a good experience.
> 2. Adding support for it does not require any change to the backend. It can be trivially
> done in R code, along the lines of the sketch below.
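As a rough illustration of that last point, {{head}} on {{Column}} could be sketched entirely in
R. Like the sketch above, this assumes a hypothetical parent slot giving access to the Column's
parent SparkDataFrame, which SparkR's current Column class does not expose:
{code:r}
library(SparkR)

# Hypothetical sketch of head() on a Column, written purely in R with
# no backend changes. The `parent` slot is assumed, not real SparkR API.
setMethod("head", signature(x = "Column"), function(x, num = 6L) {
  # Reuse the existing SparkDataFrame path: project the single column,
  # take the first `num` rows locally, and unwrap to an R vector --
  # matching what head(iris$Sepal.Length) returns in base R.
  head(select(x@parent, x), num)[[1]]
})
{code}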