spark-user mailing list archives

From R Nair (रविशंकर नायर) <ravishankar.n...@gmail.com>
Subject sqlContext vs spark.
Date Fri, 03 Feb 2017 18:48:25 GMT
All,

In Spark 1.6.0, we used

val jdbcDF = sqlContext.read.format(-----)

for creating a DataFrame over JDBC.

In Spark 2.1.x, we have seen this written instead as

val jdbcDF = spark.read.format(-----)
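
For context, a fleshed-out version of that Spark 2.x call might look like the sketch below (the JDBC URL, table name, and credentials are placeholder assumptions, not from this thread):

import org.apache.spark.sql.SparkSession

// Build or reuse a SparkSession; spark-shell 2.x provides one as `spark`.
val spark = SparkSession.builder()
  .appName("JdbcReadExample")
  .getOrCreate()

// In 2.x the SparkSession is the entry point for reads.
val jdbcDF = spark.read
  .format("jdbc")
  .option("url", "jdbc:postgresql://host:5432/db")  // placeholder URL
  .option("dbtable", "schema.table")                // placeholder table
  .option("user", "username")                       // placeholder credentials
  .option("password", "password")
  .load()

// The older handle is still reachable through the session if needed:
val sqlContext = spark.sqlContext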

Does that mean we should not be using sqlContext going forward? We also
see that sqlContext is not auto-initialized when running spark-shell.
Please advise, thanks.

Best, Ravion
