spark-user mailing list archives

From Sun Rui <sunrise_...@163.com>
Subject Re: Spark 2.0 SparkSession, SparkConf, SparkContext
Date Wed, 27 Jul 2016 13:53:21 GMT
If you want to keep using RDD API, then you still need to create SparkContext first.

If you want to use just Dataset/DataFrame/SQL API, then you can directly create a SparkSession.
Generally the SparkContext is hidden: it is created internally and held within the
SparkSession. Whenever you need the SparkContext, you can get it via SparkSession.sparkContext.
While a SparkConf is still accepted when creating a SparkSession, the recommended way to set/get
configuration for a SparkSession is through SparkSession.conf.set()/get().
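
The pattern above can be sketched as follows (Spark 2.x, Scala). This is a minimal illustration, assuming spark-sql is on the classpath; the master URL "local[*]" and the app name are placeholders, not anything prescribed in this thread:

```scala
import org.apache.spark.sql.SparkSession

object SessionExample {
  def main(args: Array[String]): Unit = {
    // SparkSession.builder creates (or reuses) a session; the SparkContext
    // is created internally and held by the session.
    val spark = SparkSession.builder()
      .master("local[*]")          // placeholder master for local testing
      .appName("SessionExample")   // hypothetical app name
      .getOrCreate()

    // Runtime configuration goes through spark.conf rather than SparkConf.
    spark.conf.set("spark.sql.shuffle.partitions", "8")
    println(spark.conf.get("spark.sql.shuffle.partitions"))

    // The underlying SparkContext is still reachable for RDD work.
    val rdd = spark.sparkContext.parallelize(1 to 10)
    println(rdd.sum())

    spark.stop()
  }
}
```

Note that getOrCreate() returns an existing session if one is already active in the JVM, which is why builder settings may be ignored on a second call.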
> On Jul 27, 2016, at 21:02, Jestin Ma <jestinwith.an.e@gmail.com> wrote:
> 
> I know that SparkSession is replacing the SQLContext and HiveContext, but what about SparkConf
and SparkContext? Are those still relevant in our programs?
> 
> Thank you!
> Jestin



---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org

