[ https://issues.apache.org/jira/browse/SPARK-15920?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15454634#comment-15454634 ]
Piotr Milanowski commented on SPARK-15920:
------------------------------------------
That is why I listed PySpark under Components. The problem is present in the Python implementation.
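For reference, a minimal sketch (assuming a local branch-2.0 PySpark session; the example DataFrame is only illustrative) showing that the alias is gone on the Python side:

{code:python}
from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[1]").appName("SPARK-15920").getOrCreate()
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

# DataFrame.map delegated to the underlying RDD in 1.6;
# in 2.0 the attribute is no longer defined on DataFrame.
try:
    df.map(lambda row: row.id)
except AttributeError as e:
    print(e)
{code}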
> Using map on DataFrame
> ----------------------
>
> Key: SPARK-15920
> URL: https://issues.apache.org/jira/browse/SPARK-15920
> Project: Spark
> Issue Type: Bug
> Components: PySpark
> Affects Versions: 2.0.0
> Environment: branch-2.0
> Reporter: Piotr Milanowski
>
> In Spark 1.6 there was a method {{DataFrame.map}} as an alias for {{DataFrame.rdd.map}}. In Spark 2.0 this functionality no longer exists.
> Is there a preferred way of doing a map on a DataFrame without explicitly calling {{DataFrame.rdd.map}}? Maybe this functionality should be kept, if only for backward compatibility purposes?
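As a side note, a minimal sketch of the explicit workaround in 2.0 (again assuming a local PySpark session; the DataFrame contents and column names are illustrative only):

{code:python}
from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[1]").appName("SPARK-15920").getOrCreate()
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

# Spark 1.6:  df.map(lambda row: row.id)
# Spark 2.0:  go through the underlying RDD explicitly
ids = df.rdd.map(lambda row: row.id).collect()
print(ids)  # [1, 2]
{code}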