Hey lordjoe,

Apologies for the late reply. 

I followed your ThreadLocal approach and it worked fine; a sketch of how I used it is below. I will update the thread if I learn more on this.
(I don't know how Spark does it in Scala, but what I wanted to achieve in Java is quite common in many Spark/Scala GitHub gists.)
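In case it helps anyone else reading this thread, here is roughly the pattern I ended up with. This is a minimal sketch against the Spark 1.x Java API; the input path, the class name, and the length computation are just placeholders, not code from your project:

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;

public class ThreadLocalContextExample {
    public static void main(String[] args) {
        SparkUtilities.setAppName("ThreadLocalContextExample");
        // On the driver this builds the context the first time it is needed.
        JavaSparkContext ctx = SparkUtilities.getCurrentContext();
        JavaRDD<String> lines = ctx.textFile("input.txt");

        JavaRDD<Integer> lengths = lines.map(new Function<String, Integer>() {
            @Override
            public Integer call(String line) {
                // Inside call() on a worker, nothing was serialized from the
                // driver: getCurrentContext() lazily builds (or reuses) a
                // thread-local context in the worker JVM.
                JavaSparkContext workerCtx = SparkUtilities.getCurrentContext();
                return line.length();
            }
        });

        System.out.println(lengths.count());
    }
}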

Thanks.


On Thu, Oct 23, 2014 at 3:08 PM, lordjoe <lordjoe2000@gmail.com> wrote:
 What I have been doing is building a JavaSparkContext the first time it is
needed and keeping it in a ThreadLocal; all my code uses
SparkUtilities.getCurrentContext(). On a slave machine you build a new
context and don't have to serialize it.
The code is in a large project at
https://code.google.com/p/distributed-tools/ - a work in progress, but the
Spark aficionados on this list will say whether the approach is kosher.

import java.io.Serializable;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkUtilities implements Serializable {

    // One JavaSparkContext per thread; a SparkContext must never be serialized.
    private static transient ThreadLocal<JavaSparkContext> threadContext;
    private static String appName = "Anonymous";

    public static String getAppName() {
        return appName;
    }

    public static void setAppName(final String pAppName) {
        appName = pAppName;
    }

    /**
     * Create a JavaSparkContext for the current thread if none exists.
     *
     * @return the thread-local JavaSparkContext
     */
    public static synchronized JavaSparkContext getCurrentContext() {
        if (threadContext == null)
            threadContext = new ThreadLocal<JavaSparkContext>();
        JavaSparkContext ret = threadContext.get();
        if (ret != null)
            return ret;
        SparkConf sparkConf = new SparkConf().setAppName(getAppName());

        // Here do any operations you would do to initialize a context
        ret = new JavaSparkContext(sparkConf);

        threadContext.set(ret);
        return ret;
    }
}








--
--Unilocal