spark-user mailing list archives

From Likith_Kailas <>
Subject SparkStreaming connection exception
Date Thu, 24 Aug 2017 13:05:17 GMT
I have written a unit test that uses multithreading to start and stop a
Spark Streaming job and a Kafka producer. All the dependencies are declared
in the Maven pom.xml file.

When I run the test, once all the Kafka messages have been read and the
threads are stopped, I keep getting the exception below:

 2017-08-19 17:08:16,783 INFO  [Executor task launch worker-0-
 SendThread(] zookeeper.ClientCnxn 
( - Opening socket connection to 
server Will not attempt to authenticate using 
SASL (unknown error)
2017-08-19 17:08:17,786 WARN  [Executor task launch worker-0-
SendThread(] zookeeper.ClientCnxn
( - Session 0x15dfb08227f0001 for server null,
unexpected error, closing socket connection and attempting reconnect Connection refused: no further information
at Method)
at org.apache.zookeeper.ClientCnxn$

The code is as follows (the thread constructors and loop bodies were cut off
in the original message; they are reconstructed here from the surrounding
comments):

    public void someKafkaTest() {

        try {
            // Thread controlling the Spark streaming job
            Thread sparkStreamerThread = new Thread(
                    new SparkStreamingJSonJob(new String[] { zookeeperConnect,
                            "my-consumer-group", "test", "1" }));
            sparkStreamerThread.start();

            // Thread to start the producer
            Thread producerThread = new Thread(new KafkaJSonProducer());
            producerThread.start();

            // Current kafkaTest thread sleeps for 1 second
            Thread.sleep(1000);

            int sparkAccVal = SparkStreamingJSonJob.getAccumulator().intValue();
            System.out.println("Spark Throughput value : " + sparkAccVal / 60);

            // Wait for the streaming thread to finish
            while (sparkStreamerThread.isAlive()) {
                Thread.sleep(100);
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

I suspect that the streaming job continues to run even after the wrapper
thread is stopped. Please help me with this.
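That suspicion fits the log: Spark's receiver keeps its own ZooKeeper client thread alive, so letting the wrapper `Thread` finish does not shut the job down; the streaming context has to be stopped explicitly (in Spark's Java API, `JavaStreamingContext.stop(boolean, boolean)` does this). The general pattern is cooperative shutdown, sketched below with a plain-Java stand-in for the job (the `StoppableJob` class and `shutdown()` method are illustrative, not part of the original code):

```java
// A cooperatively stoppable job: the run loop checks a volatile flag
// instead of relying on the wrapper Thread being abandoned. In the real
// test, the equivalent step is telling the streaming job to stop its
// StreamingContext before the test exits.
public class StoppableJob implements Runnable {
    private volatile boolean running = true;

    public void shutdown() {
        running = false; // request a clean stop
    }

    @Override
    public void run() {
        while (running) {
            try {
                Thread.sleep(10); // stand-in for processing one batch
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return;
            }
        }
        System.out.println("job stopped cleanly");
    }

    public static void main(String[] args) throws InterruptedException {
        StoppableJob job = new StoppableJob();
        Thread t = new Thread(job);
        t.start();
        Thread.sleep(50);   // let the job run briefly
        job.shutdown();     // signal it to stop
        t.join(2000);       // wait for it to exit
        System.out.println("thread alive: " + t.isAlive());
    }
}
```

Without such a signal, the job's internal threads (like the ZooKeeper `SendThread` in the log) keep retrying their connections after the test has torn Kafka down, which is exactly the repeated "attempting reconnect" you see.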
