Hi All,

I started using Spark 2.2.0 very recently, and now I can't even get the JSON data from Kafka out to the console. I have no clue what's happening; this was working for me when I was using 2.1.1.

Here is my code:

StreamingQuery query = sparkSession.readStream()
        .format("kafka")
        .option("kafka.bootstrap.servers", "localhost:9092")
        .option("subscribe", "hello")
        .option("startingOffsets", "earliest")
        .load()
        .writeStream()
        .format("console")
        .start();

query.awaitTermination();
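In case it's relevant, the Kafka source returns `key` and `value` as binary columns, so the console sink prints byte arrays rather than readable JSON unless they are cast first. Here is a minimal sketch of the same query with that cast added (assuming the same `hello` topic and local broker as above; it needs a running Kafka broker, so treat it as illustrative, not verified):

```java
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;
import org.apache.spark.sql.streaming.StreamingQueryException;

public class KafkaConsoleSketch {
    public static void main(String[] args) throws StreamingQueryException {
        SparkSession sparkSession = SparkSession.builder()
                .master("local[*]")
                .appName("kafka-console-sketch")
                .getOrCreate();

        StreamingQuery query = sparkSession.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "localhost:9092")
                .option("subscribe", "hello")
                .option("startingOffsets", "earliest")
                .load()
                // key/value arrive as binary; cast so the console sink shows text
                .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
                .writeStream()
                .format("console")
                .option("truncate", "false") // don't clip long JSON payloads
                .start();

        query.awaitTermination();
    }
}
```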

I am using Kafka 0.11.

And here are my dependencies:

def sparkVersion = '2.2.0'
compile group: 'org.apache.spark', name: 'spark-core_2.11', version: sparkVersion
compile group: 'org.apache.spark', name: 'spark-streaming_2.11', version: sparkVersion
compile group: 'org.apache.spark', name: 'spark-sql_2.11', version: sparkVersion
compile group: 'com.datastax.spark', name: 'spark-cassandra-connector_2.11', version: '2.0.0-M3'
compile group: 'org.apache.spark', name: 'spark-streaming-kafka-0-10_2.11', version: sparkVersion
compile group: 'org.apache.spark', name: 'spark-sql-kafka-0-10_2.11', version: sparkVersion
compile group: 'org.apache.kafka', name: 'kafka-clients', version: '0.10.0.1'
compile group: 'org.mongodb.spark', name: 'mongo-spark-connector_2.11', version: sparkVersion
compile 'org.mongodb:mongodb-driver:3.0.4'

Attached are the INFO and TRACE logs.
Any help or pointers would be great!
Thanks