spark-user mailing list archives

From Subshiri S <subsh...@gmail.com>
Subject Lambda serialization
Date Wed, 29 Jul 2015 10:05:49 GMT
Hi, I have tried to use a lambda expression in a Spark task, and it throws a
"java.lang.IllegalArgumentException: Invalid lambda deserialization"
exception. The exception is thrown when I use code like
"transform(pRDD -> pRDD.map(t -> t._2))". The code snippets are below.


> JavaPairDStream<String, Integer> aggregate = pairRDD.reduceByKey((x, y) -> x + y);
> JavaDStream<Integer> con = aggregate.transform(
>     (Function<JavaPairRDD<String, Integer>, JavaRDD<Integer>>) pRDD ->
>         pRDD.map((Function<Tuple2<String, Integer>, Integer>) t -> t._2));
>
> JavaPairDStream<String, Integer> aggregate = pairRDD.reduceByKey((x, y) -> x + y);
> JavaDStream<Integer> con = aggregate.transform(
>     (Function<JavaPairRDD<String, Integer>, JavaRDD<Integer>> & Serializable) pRDD ->
>         pRDD.map((Function<Tuple2<String, Integer>, Integer> & Serializable) t -> t._2));


Neither of the above two options worked. Whereas if I pass the below object "f"
as the argument instead of the lambda expression "t -> t._2", it works.

> Function f = new Function<Tuple2<String, Integer>, Integer>() {
>     @Override
>     public Integer call(Tuple2<String, Integer> paramT1) throws Exception {
>         return paramT1._2;
>     }
> };


May I know the right format to express these functions as lambda expressions?
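For what it's worth, outside of Spark a lambda targeting a Serializable functional interface does serialize and deserialize fine within one JVM, which suggests the problem is specific to how the closure is shipped. Here is a minimal JDK-only sketch of that mechanism (the names LambdaSerDemo, SerFn, and roundTrip are my own, not Spark APIs):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class LambdaSerDemo {

    // A lambda is only serializable if its target functional interface
    // extends Serializable; Spark's org.apache.spark.api.java.function.Function
    // is declared Serializable for the same reason.
    interface SerFn<T, R> extends Serializable {
        R call(T t);
    }

    // Serialize an object to bytes and read it back.
    @SuppressWarnings("unchecked")
    static <T extends Serializable> T roundTrip(T obj) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(obj);
        }
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            return (T) ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        SerFn<Integer, Integer> inc = x -> x + 1;
        // Deserializing a lambda goes through a $deserializeLambda$ method
        // that javac generates in the class that declared the lambda; the
        // "Invalid lambda deserialization" error is raised when that lookup
        // fails on the receiving side.
        SerFn<Integer, Integer> copy = roundTrip(inc);
        System.out.println(copy.call(41)); // prints 42
    }
}
```

This works locally because the declaring class is present in the same JVM that deserializes the lambda; on a Spark executor the same lookup must succeed against the classes shipped in the application jar.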

-Subshiri
