spark-user mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: multiple spark context in same driver program
Date Fri, 07 Nov 2014 10:10:00 GMT
My bad. I just fired up a spark-shell, created a new SparkContext, and it
was working fine: I did a parallelize and collect with both SparkContexts.
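
For reference, here is a minimal sketch of that experiment as run from
spark-shell (which already provides a context bound to sc). The master and
app name for the second context are placeholders, and whether a second
context is actually supported depends on the Spark version, as discussed
below.

    import org.apache.spark.{SparkConf, SparkContext}

    // spark-shell already provides sc; create a second context by hand.
    // (Placeholder master/app name; adjust for your setup.)
    val sc2 = new SparkContext(
      new SparkConf().setMaster("local[*]").setAppName("second-context"))

    // parallelize and collect with both contexts
    val fromSc  = sc.parallelize(1 to 10).collect()
    val fromSc2 = sc2.parallelize(1 to 10).collect()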

Thanks
Best Regards

On Fri, Nov 7, 2014 at 3:17 PM, Tobias Pfeiffer <tgp@preferred.jp> wrote:

> Hi,
>
> On Fri, Nov 7, 2014 at 4:58 PM, Akhil Das <akhil@sigmoidanalytics.com>
> wrote:
>>
>> That doc was created during the initial days (Spark 0.8.0); you can of
>> course create multiple SparkContexts in the same driver program now.
>>
>
> You sure about that? According to
> http://apache-spark-user-list.1001560.n3.nabble.com/Is-spark-context-in-local-mode-thread-safe-td7275.html
> (June 2014), "you currently can’t have multiple SparkContext objects in the
> same JVM".
>
> Tobias
>
