spark-user mailing list archives

From moon soo Lee <>
Subject SparkContext and multi threads
Date Thu, 11 Sep 2014 21:23:05 GMT

I'm trying to make Spark work in a multithreaded Java application.
What I'm trying to do is:

- Create a single SparkContext
- Create multiple SparkILoop and SparkIMain instances
- Inject the created SparkContext into each SparkIMain interpreter

A thread is created for every user request; it takes a SparkILoop and
interprets some code.

My problem is:
 - If a thread takes the first SparkILoop instance, everything works fine.
 - If a thread takes any other SparkILoop instance, Spark cannot find the
closures / case classes that I defined inside the interpreter.

I read some previous threads and I think it's related to SparkEnv and
ClosureCleaner. I tried SparkEnv.set(env) with the env I captured right after
the SparkContext was created, but I still get the class-not-found exception.
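For what it's worth, in the Spark versions I've looked at, SparkEnv is held in a ThreadLocal, which would explain why an env set in the thread that created the SparkContext is not automatically visible in a freshly spawned request thread. A minimal sketch of that effect in plain Java (the `ENV` holder below is a stand-in for SparkEnv's internal ThreadLocal, not Spark code):

```java
// Demonstrates why a value stored in a ThreadLocal by one thread
// (e.g. the thread that created the SparkContext) is invisible to a
// new request thread unless it is set again inside that thread.
public class ThreadLocalDemo {
    // Stand-in for SparkEnv's internal ThreadLocal holder.
    static final ThreadLocal<String> ENV = new ThreadLocal<>();

    public static void main(String[] args) throws InterruptedException {
        ENV.set("env-from-main");          // like SparkEnv.set(env) in the creating thread
        System.out.println("main sees: " + ENV.get());

        Thread request = new Thread(() -> {
            // A plain ThreadLocal is not inherited: this prints null.
            System.out.println("request thread sees: " + ENV.get());
            ENV.set("env-from-main");      // re-setting inside the thread fixes it
            System.out.println("after set: " + ENV.get());
        });
        request.start();
        request.join();
    }
}
```

So if SparkEnv.set(env) is only called once, near SparkContext creation, each request thread would still start with an empty env; calling it at the top of every request thread (before interpreting code) is the pattern this sketch suggests.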

Can anyone give me some ideas?
