spark-user mailing list archives

From Hao Ren <inv...@gmail.com>
Subject SparkContext and JavaSparkContext
Date Mon, 29 Jun 2015 09:15:03 GMT
Hi, 

I am working on a legacy project that uses the Spark Java API.

I have a function that takes an SQLContext as an argument; however, I need a
JavaSparkContext inside that function.

It seems that sqlContext.sparkContext() returns a Scala SparkContext.

The only API I found for converting a Scala SparkContext to a Java one is:

new JavaSparkContext(sqlContext.sparkContext())

I think this will create a new SparkContext, so there would be multiple
SparkContexts at run time.
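
For reference, here is a minimal sketch of how I am using it (the class and
method names are hypothetical; only the Spark API calls are real):

    import java.util.Arrays;

    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.sql.SQLContext;

    public class LegacyJob {
        // Hypothetical legacy method: it receives an SQLContext but needs a
        // JavaSparkContext to call Java-API methods such as parallelize().
        public static void run(SQLContext sqlContext) {
            // Wrap the underlying Scala SparkContext in a JavaSparkContext.
            JavaSparkContext jsc =
                new JavaSparkContext(sqlContext.sparkContext());
            long count = jsc.parallelize(Arrays.asList(1, 2, 3)).count();
            System.out.println("count = " + count);
        }
    }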

According to some posts, there are limitations on having multiple
SparkContexts, but I have not encountered any problems so far.

Questions:

What is the best way to convert a Scala SparkContext to a Java one?
What problems can multiple SparkContexts cause?

Thank you. =)

Hao



