spark-user mailing list archives

From Andrew Ash <and...@andrewash.com>
Subject Re: Is their a way to Create SparkContext object?
Date Wed, 14 May 2014 02:26:41 GMT
SparkContext is not serializable, so you can't ship it across the cluster,
which is what rdd.map(t => compute(sc, t._2)) would require.
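To see why, here is a minimal sketch that doesn't need a running cluster. It uses a hypothetical FakeContext class as a stand-in for SparkContext (like the real one, it does not implement java.io.Serializable) and runs plain Java serialization on a closure, which is essentially what Spark does to every closure before shipping it to executors:

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// Stand-in for SparkContext: like the real one, NOT java.io.Serializable.
class FakeContext {
  def compute(x: Int): Int = x * 2
}

object ClosureDemo {
  // Returns true when obj survives Java serialization.
  def serializes(obj: AnyRef): Boolean =
    try {
      val out = new ObjectOutputStream(new ByteArrayOutputStream())
      out.writeObject(obj)
      out.close()
      true
    } catch {
      case _: NotSerializableException => false
    }

  // Analogous to rdd.map(t => compute(sc, t._2)): the closure captures
  // the context, so the context gets dragged into serialization.
  def badClosure(ctx: FakeContext): Int => Int = x => ctx.compute(x)

  // Restructured: capture only the plain, serializable data the
  // computation actually needs, not the context itself.
  def goodClosure(factor: Int): Int => Int = x => x * factor

  def main(args: Array[String]): Unit = {
    println(serializes(badClosure(new FakeContext))) // false
    println(serializes(goodClosure(2)))              // true
  }
}
```

The usual fixes follow the same pattern as goodClosure: pull the values the computation needs out of the driver before the map, or broadcast them, so the closure captures data rather than the context.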

There is likely a way to express what you're trying to do with an algorithm
that doesn't require serializing SparkContext.  Can you tell us more about
your goals?

Andrew


On Tue, May 13, 2014 at 2:14 AM, yh18190 <yh18190@gmail.com> wrote:

> Thanks Matei Zaharia. Can I pass it as a parameter as part of a closure?
> For example:
> RDD.map(t => compute(sc, t._2))
>
> Can I use sc inside the map function? Please let me know.
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Is-their-a-way-to-Create-SparkContext-object-tp5612p5647.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
