In Spark 1.6.0, we used

val jdbcDF = sqlContext.read.format("jdbc").options(connOpts).load()

for creating a DataFrame over JDBC.

In Spark 2.1.x, we have seen this is now:

val jdbcDF = spark.read.format("jdbc").options(connOpts).load()

Does that mean we should not be using sqlContext going forward? Also, we see that sqlContext is not auto-initialized when running spark-shell. Please advise, thanks.
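For context, here is a minimal sketch of the Spark 2.x SparkSession-based JDBC read being asked about. The URL, table name, and credentials are placeholders, not values from this thread:

```scala
import org.apache.spark.sql.SparkSession

// Spark 2.x: SparkSession replaces SQLContext as the entry point.
// In spark-shell it is pre-created as `spark`; in an application, build it:
val spark = SparkSession.builder()
  .appName("JdbcReadExample")
  .getOrCreate()

// All option values below are placeholders for your own connection details.
val jdbcDF = spark.read
  .format("jdbc")
  .option("url", "jdbc:postgresql://dbhost:5432/mydb") // placeholder URL
  .option("dbtable", "myschema.mytable")               // placeholder table
  .option("user", "username")                          // placeholder credentials
  .option("password", "password")
  .load()

// For code that still expects a SQLContext, one is reachable from the session:
val sqlContext = spark.sqlContext
```

SQLContext is kept in Spark 2.x for backwards compatibility, so old code compiles, but SparkSession is the documented entry point going forward.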

Best, Ravion