spark-user mailing list archives

From Tenghuan He <tenghua...@gmail.com>
Subject Custom RDD in spark, cannot find custom method
Date Sun, 27 Mar 2016 16:22:41 GMT
Hi everyone,

    I am creating a custom RDD that extends RDD and adds a custom method;
however, the custom method cannot be found.
    The custom RDD looks like this:

class MyRDD[K, V](
    var base: RDD[(K, V)],
    part: Partitioner
  ) extends RDD[(K, V)](base.context, Nil) {

  def customMethod(bulk: ArrayBuffer[(K, (V, Int))]): MyRDD[K, V] = {
    // ... custom code here
  }

  override def compute(split: Partition, context: TaskContext): Iterator[(K, V)] = {
    // ... custom code here
  }

  override protected def getPartitions: Array[Partition] = {
    // ... custom code here
  }

  override protected def getDependencies: Seq[Dependency[_]] = {
    // ... custom code here
  }
}

In spark-shell, the overridden methods work well, but calling
myrdd.customMethod(bulk) throws:

<console>:33: error: value customMethod is not a member of org.apache.spark.rdd.RDD[(Int, String)]

Can anyone tell me why the custom method cannot be found?
Or do I have to add customMethod to the abstract RDD and then override
it in the custom RDD?
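The error message says the value's static type is org.apache.spark.rdd.RDD[(Int, String)], not MyRDD. A minimal, Spark-free sketch of that symptom, with illustrative class names (Base/Mine stand in for RDD/MyRDD; this is an assumption about the cause, not a confirmed diagnosis):

```scala
// A subclass method is only visible when the *static* type of the value
// is the subclass; upcasting to the parent hides it at compile time.
class Base[T]

class Mine[T] extends Base[T] {
  def customMethod: String = "found" // defined only on the subclass
}

object Demo extends App {
  val asMine: Mine[Int] = new Mine[Int]
  val asBase: Base[Int] = asMine // same object, parent static type

  println(asMine.customMethod) // compiles: static type is Mine

  // asBase.customMethod       // would NOT compile:
  //                           // "value customMethod is not a member of Base[Int]"

  // A downcast restores access (only safe if the object really is a Mine):
  println(asBase.asInstanceOf[Mine[Int]].customMethod)
}
```

If an RDD transformation (e.g. partitionBy or map) produced the value, its static type is plain RDD even when the runtime object is a MyRDD, and the custom method disappears from the compiler's view.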

PS: spark-version: 1.5.1

Thanks & Best regards

Tenghuan
