spark-user mailing list archives

From Sean Owen <>
Subject Re: SPARK-2243 Support multiple SparkContexts in the same JVM
Date Thu, 18 Dec 2014 10:04:00 GMT
Yes, although once you have multiple ClassLoaders, you are operating
as if in multiple JVMs for most intents and purposes. I think the
request for this kind of functionality comes from use cases where
multiple ClassLoaders wouldn't work, such as wanting one app (in
one ClassLoader) to manage multiple contexts.

On Thu, Dec 18, 2014 at 2:23 AM, Anton Brazhnyk
<> wrote:
> Greetings,
> The first comment on the issue says the reason multiple contexts
> are not supported is:
> “There are numerous assumptions in the code base that uses a shared cache or
> thread local variables or some global identifiers
> which prevent us from using multiple SparkContext's.”
> Could this be worked around by creating those contexts in separate
> classloaders, each with its own copy of the Spark classes?
> Thanks,
> Anton
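The mechanism being discussed can be demonstrated without Spark at all: static fields (and by extension shared caches and global identifiers) belong to a Class object, and a class loaded by two sibling ClassLoaders yields two distinct Class objects with independent static state. A minimal, self-contained sketch (plain JDK, no Spark; the `Holder` class and temp-dir setup are illustrative, not anything from Spark's code base):

```java
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;
import java.lang.reflect.Field;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;

public class ClassLoaderIsolationDemo {
    public static void main(String[] args) throws Exception {
        // Compile a tiny class with mutable static state into a temp dir,
        // standing in for a library with global state (like SparkContext).
        Path dir = Files.createTempDirectory("cl-demo");
        Path src = dir.resolve("Holder.java");
        Files.write(src,
            "public class Holder { public static int counter = 0; }".getBytes("UTF-8"));
        JavaCompiler javac = ToolProvider.getSystemJavaCompiler();
        if (javac.run(null, null, null, src.toString()) != 0)
            throw new IllegalStateException("compilation failed");

        URL[] cp = { dir.toUri().toURL() };
        // Two sibling loaders; the parent (platform loader) cannot see
        // Holder, so each loader defines its own copy of the class.
        ClassLoader parent = ClassLoader.getPlatformClassLoader();
        try (URLClassLoader a = new URLClassLoader(cp, parent);
             URLClassLoader b = new URLClassLoader(cp, parent)) {
            Class<?> ha = Class.forName("Holder", true, a);
            Class<?> hb = Class.forName("Holder", true, b);

            Field fa = ha.getField("counter");
            fa.setInt(null, 42);               // mutate static state via loader A

            System.out.println(ha == hb);      // false: two distinct Class objects
            System.out.println(hb.getField("counter").getInt(null)); // 0: B unaffected
        }
    }
}
```

This is exactly why isolated classloaders behave "as if in multiple JVMs": each copy of the Spark classes would carry its own caches and globals, but a Class from one loader is not assignment-compatible with the "same" class from another, so a single app cannot pass such contexts around as ordinary typed objects.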

