spark-user mailing list archives

From Adrian Mocanu <amoc...@verticalscope.com>
Subject RE: "overloaded method value updateStateByKey ... cannot be applied to ..." when Key is a Tuple2
Date Wed, 12 Nov 2014 19:43:00 GMT
My understanding is that the reason you have an Option is so that you can filter out tuples when
None is returned. That way your state data won't grow forever.
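To make that concrete, here is a minimal sketch of that pruning semantics outside Spark, using a plain Map. `applyUpdates` is a hypothetical helper written for this example, not a Spark API: it folds each batch's values into the state and drops any key whose update function returns None.

```scala
// Stand-in for updateStateByKey's state-pruning behavior (not Spark's API).
object StatePruneSketch {
  // Apply `update` to every key seen in either the old state or the new batch;
  // keys for which `update` returns None are removed from the resulting state.
  def applyUpdates[K, V, S](
      state: Map[K, S],
      batch: Map[K, Seq[V]],
      update: (Seq[V], Option[S]) => Option[S]): Map[K, S] = {
    val keys = state.keySet ++ batch.keySet
    keys.flatMap { k =>
      update(batch.getOrElse(k, Seq.empty), state.get(k)).map(k -> _)
    }.toMap
  }

  def main(args: Array[String]): Unit = {
    // Count per key; return None (i.e. drop the key) once the count reaches 0.
    val update = (values: Seq[Int], state: Option[Int]) => {
      val total = state.getOrElse(0) + values.sum
      if (total <= 0) None else Some(total)
    }
    val s1 = applyUpdates(Map.empty[String, Int], Map("a" -> Seq(2), "b" -> Seq(1)), update)
    val s2 = applyUpdates(s1, Map("b" -> Seq(-1)), update)
    println(s2.contains("b")) // false: "b" was pruned when its update returned None
    println(s2("a"))          // 2
  }
}
```

Returning Some keeps the key's state; returning None is the only way a key ever leaves the state table.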

-----Original Message-----
From: spr [mailto:spr@yarcdata.com] 
Sent: November-12-14 2:25 PM
To: user@spark.incubator.apache.org
Subject: Re: "overloaded method value updateStateByKey ... cannot be applied to ..." when
Key is a Tuple2

After comparing with previous code, I got it to work by making the return a Some instead of a Tuple2.
 Perhaps some day I will understand this.


spr wrote
> ------code--------
> 
>     val updateDnsCount = (values: Seq[(Int, Time)], state: Option[(Int, Time)]) => {
>       val currentCount = if (values.isEmpty) 0 else values.map( x => x._1).sum
>       val newMinTime = if (values.isEmpty) Time(Long.MaxValue) else values.map( x => x._2).min
> 
>       val (previousCount, minTime) = state.getOrElse((0, Time(System.currentTimeMillis)))
> 
>       //  (currentCount + previousCount, Seq(minTime, newMinTime).min)    // <== old
>       Some(currentCount + previousCount, Seq(minTime, newMinTime).min)    // <== new
>     }





--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/overloaded-method-value-updateStateByKey-cannot-be-applied-to-when-Key-is-a-Tuple2-tp18644p18750.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org



