My bad, I just fired up a spark-shell and created a new SparkContext, and it was working fine. I basically did a parallelize and collect with both SparkContexts.
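For reference, the experiment described above could be sketched roughly as below. Note that later Spark releases check for an existing active context by default and refuse to create a second one unless `spark.driver.allowMultipleContexts` is set; the app names and local master here are just placeholders, and this is a sketch rather than a recommended pattern (sharing one context per JVM is the usual advice):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// First context, as spark-shell would create it (placeholder app name / local master).
val conf1 = new SparkConf().setAppName("ctx1").setMaster("local[*]")
val sc1 = new SparkContext(conf1)

// Second context in the same JVM. On newer Spark versions this throws unless
// the multiple-contexts check is explicitly relaxed (use with care).
val conf2 = new SparkConf()
  .setAppName("ctx2")
  .setMaster("local[*]")
  .set("spark.driver.allowMultipleContexts", "true")
val sc2 = new SparkContext(conf2)

// Parallelize and collect with both contexts.
println(sc1.parallelize(1 to 5).collect().mkString(","))
println(sc2.parallelize(1 to 5).collect().mkString(","))

sc1.stop()
sc2.stop()
```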

Best Regards

On Fri, Nov 7, 2014 at 3:17 PM, Tobias Pfeiffer <> wrote:

On Fri, Nov 7, 2014 at 4:58 PM, Akhil Das <> wrote:
That doc was created during the initial days (Spark 0.8.0); you can of course create multiple SparkContexts in the same driver program now.

Are you sure about that? According to (June 2014), "you currently can’t have multiple SparkContext objects in the same JVM".