spark-user mailing list archives

From Hao Ren <>
Subject SparkContext and JavaSparkContext
Date Mon, 29 Jun 2015 09:15:03 GMT

I am working on a legacy project using Spark Java code.

I have a function that takes a SQLContext as an argument; however, inside that
function I need a JavaSparkContext.

It seems that sqlContext.sparkContext() returns a Scala SparkContext.

I did not find any API for converting a Scala SparkContext to a Java one except

new JavaSparkContext(sqlContext.sparkContext())

I think this will create a new SparkContext, so there would be multiple
SparkContexts at runtime.
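For context, here is a minimal sketch of what I am doing (the class and method names are just illustrations, not from the real project):

```java
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SQLContext;

public class ContextExample {

    // Hypothetical helper: the function only receives a SQLContext,
    // but the legacy Java code below it needs a JavaSparkContext.
    static JavaSparkContext toJavaContext(SQLContext sqlContext) {
        // sqlContext.sparkContext() returns the underlying Scala SparkContext.
        SparkContext sc = sqlContext.sparkContext();
        // Wrapping it in a JavaSparkContext -- I am unsure whether this
        // constructor creates a second context or merely wraps the existing one.
        return new JavaSparkContext(sc);
    }
}
```

(This snippet requires Spark on the classpath and a live SQLContext, so it is not runnable in isolation.)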

According to some posts, there are limitations on running multiple
SparkContexts, but I have not encountered them myself.


What is the best way to convert a Scala SparkContext to a Java one?
What problems can multiple SparkContexts cause?

Thank you. =)

