spark-user mailing list archives

From Jacek Laskowski <ja...@japila.pl>
Subject Why does sortByKey() transformation trigger a job in spark-shell?
Date Mon, 02 Nov 2015 13:34:29 GMT
Hi Sparkians,

I use the latest Spark 1.6.0-SNAPSHOT in spark-shell with the default
local[*] master.

I created an RDD of pairs using the following snippet:

val rdd = sc.parallelize(0 to 5).map(n => (n, util.Random.nextBoolean))

It's all fine so far. The map transformation causes no computation.
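
Just to double-check that nothing has run yet, the lineage can be
inspected without triggering anything (toDebugString only walks the
RDD's dependencies):

rdd.toDebugString   // prints the lineage (map over parallelize); no job runs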

I thought all transformations were lazy and triggered no job until an
action is called. It seems I was wrong about sortByKey()! When I called
`rdd.sortByKey()`, it started a job: sortByKey at <console>:27 (!)
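
For reference, here is a minimal sketch of how the job can be observed
from the shell. It registers a SparkListener (via sc.addSparkListener,
a DeveloperApi) to count submitted jobs; listener events arrive
asynchronously, hence the short sleep. The web UI at
http://localhost:4040 shows the same job.

import org.apache.spark.scheduler.{SparkListener, SparkListenerJobStart}

var jobsStarted = 0
sc.addSparkListener(new SparkListener {
  override def onJobStart(jobStart: SparkListenerJobStart): Unit =
    jobsStarted += 1
})

// same RDD as above
val rdd = sc.parallelize(0 to 5).map(n => (n, util.Random.nextBoolean))
println(jobsStarted)   // 0: map is lazy, nothing has run yet

val sorted = rdd.sortByKey()
Thread.sleep(1000)     // give the listener bus a moment to deliver events
println(jobsStarted)   // non-zero: a job ran even though no action was called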

Can anyone explain why sortByKey behaves differently, given that it is
a transformation and hence should be lazy? Is it a special kind of
transformation?

Pozdrawiam,
Jacek

--
Jacek Laskowski | http://blog.japila.pl | http://blog.jaceklaskowski.pl
Follow me at https://twitter.com/jaceklaskowski
Upvote at http://stackoverflow.com/users/1305344/jacek-laskowski

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

