spark-user mailing list archives

From Antony Mayi <antonym...@yahoo.com.INVALID>
Subject custom python converter from HBase Result to tuple
Date Mon, 22 Dec 2014 19:02:39 GMT
Hi,

can anyone please give me some help with writing a custom converter of HBase data to (for example)
tuples of ((family, qualifier, value), ) for PySpark?

I was trying something like this (here converting to tuples of ("family:qualifier:value", )):


import org.apache.hadoop.hbase.CellUtil
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.util.Bytes
import org.apache.spark.api.python.Converter

class HBaseResultToTupleConverter extends Converter[Any, List[String]] {
  override def convert(obj: Any): List[String] = {
    val result = obj.asInstanceOf[Result]
    // build one "family:qualifier:value" string per cell in the Result
    result.rawCells().map(cell => List(
      Bytes.toString(CellUtil.cloneFamily(cell)),
      Bytes.toString(CellUtil.cloneQualifier(cell)),
      Bytes.toString(CellUtil.cloneValue(cell))).mkString(":")
    ).toList
  }
}


but then I get an error:

14/12/22 16:27:40 WARN python.SerDeUtil: Failed to pickle Java object as value: $colon$colon, falling back to 'toString'. Error: couldn't introspect javabean: java.lang.IllegalArgumentException: wrong number of arguments


does anyone have a hint?
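
(In case it helps frame the question: the $colon$colon in the warning is Scala's List cons class, which the pickler apparently cannot introspect. One possible direction, just a minimal sketch assuming Pyrolite can pickle java.util collections into plain Python lists, would be to return a java.util.List instead of a Scala List:

import java.util.{ArrayList => JArrayList, List => JList}

import org.apache.hadoop.hbase.CellUtil
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.util.Bytes
import org.apache.spark.api.python.Converter

// Sketch only: return a java.util.List so the Python-side pickler sees a Java
// collection rather than Scala's :: cons cells (the class name is made up).
class HBaseResultToStringListConverter extends Converter[Any, JList[String]] {
  override def convert(obj: Any): JList[String] = {
    val result = obj.asInstanceOf[Result]
    val out = new JArrayList[String]()
    result.rawCells().foreach { cell =>
      out.add(List(
        Bytes.toString(CellUtil.cloneFamily(cell)),
        Bytes.toString(CellUtil.cloneQualifier(cell)),
        Bytes.toString(CellUtil.cloneValue(cell))).mkString(":"))
    }
    out
  }
}

Whether something like this actually round-trips cleanly into PySpark is exactly what I'd like confirmed.)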

Thanks,
Antony.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

