spark-user mailing list archives

From: Marcelo Vanzin <van...@cloudera.com>
Subject: Re: Using Spark Context as an attribute of a class cannot be used
Date: Mon, 24 Nov 2014 23:21:23 GMT
That's an interesting question for which I do not know the answer.
Probably a question for someone with more knowledge of the internals
of the shell interpreter...

On Mon, Nov 24, 2014 at 2:19 PM, aecc <alessandroaecc@gmail.com> wrote:
> Ok, great, I'm gonna do it that way, thanks :). However, I still don't
> understand why this object should be serialized and shipped.
>
> aaa.s and sc are the same object: org.apache.spark.SparkContext@1f222881
>
> However, this:
>
> aaa.s.parallelize(1 to 10).filter(_ == myNumber).count
>
> needs to be serialized, while this:
>
> sc.parallelize(1 to 10).filter(_ == myNumber).count
>
> does not.


-- 
Marcelo
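
For reference, a minimal standalone sketch of the scenario discussed above. The class Aaa, its field s, and the copy-to-a-local-val workaround are assumptions based on the question, not something confirmed in this thread:

import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical stand-in for the class from the question: it holds the
// SparkContext as a field and is not itself Serializable.
class Aaa(val s: SparkContext)

object ClosureSketch {
  def main(args: Array[String]): Unit = {
    val sc  = new SparkContext(
      new SparkConf().setAppName("closure-sketch").setMaster("local[*]"))
    val aaa = new Aaa(sc)
    val myNumber = 5

    // Referencing the plain `sc` variable: the closure `_ == myNumber`
    // only captures the local Int, so nothing unexpected is serialized.
    println(sc.parallelize(1 to 10).filter(_ == myNumber).count())

    // Going through `aaa.s` in the shell is what reportedly triggers
    // serialization of `aaa`; copying the reference into a local val
    // first keeps `aaa` out of any closure that gets shipped to executors.
    val s = aaa.s
    println(s.parallelize(1 to 10).filter(_ == myNumber).count())

    sc.stop()
  }
}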


