spark-dev mailing list archives

From Reynold Xin <r...@databricks.com>
Subject Re: About implicit rddToPairRDDFunctions
Date Thu, 13 Nov 2014 19:57:33 GMT
Do people usually import o.a.spark.rdd._ ?

Also, in order to maintain source and binary compatibility, we would need to
keep both, right?


On Thu, Nov 6, 2014 at 3:12 AM, Shixiong Zhu <zsxwing@gmail.com> wrote:

> I saw many people ask how to convert an RDD to a PairRDDFunctions. I would
> like to ask a question about it. Why not put the following implicit into
> "package object rdd" or "object rdd"?
>
>   implicit def rddToPairRDDFunctions[K, V](rdd: RDD[(K, V)])
>       (implicit kt: ClassTag[K], vt: ClassTag[V], ord: Ordering[K] = null) = {
>     new PairRDDFunctions(rdd)
>   }
>
> If so, the conversion will be automatic, with no need to
> import org.apache.spark.SparkContext._
>
> I tried to search some discussion but found nothing.
>
> Best Regards,
> Shixiong Zhu
>
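The mechanism behind Shixiong's proposal is Scala's implicit scope: when the compiler looks for a conversion from `RDD[(K, V)]`, it also searches the companion object of `RDD`, so a conversion defined there is found with no import at the call site. A minimal sketch of that mechanism, using a hypothetical `Box`/`PairBoxOps` pair standing in for `RDD`/`PairRDDFunctions` (not actual Spark code):

```scala
// A simple container standing in for RDD.
class Box[T](val value: T)

object Box {
  // Implicit conversion defined in the companion object: the compiler
  // finds it via implicit scope, so callers need no import. This is
  // the analogue of putting rddToPairRDDFunctions in "object rdd".
  implicit def boxToPairBoxOps[K, V](b: Box[(K, V)]): PairBoxOps[K, V] =
    new PairBoxOps(b)
}

// Extra operations available only on Box[(K, V)],
// analogous to PairRDDFunctions.
class PairBoxOps[K, V](b: Box[(K, V)]) {
  def swapKV: Box[(V, K)] = new Box(b.value.swap)
}

object Demo extends App {
  val b = new Box(("a", 1))
  // swapKV resolves without importing any conversion explicitly.
  println(b.swapKV.value)  // prints (1,a)
}
```

Keeping the old conversion in `SparkContext` alongside a new one in the companion object would preserve source compatibility for code that already does `import org.apache.spark.SparkContext._`, which addresses Reynold's concern above.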
