spark-user mailing list archives

From "Qiao, Richard" <Richard.Q...@capitalone.com>
Subject Re: unable to connect to cluster 2.2.0
Date Wed, 06 Dec 2017 08:35:02 GMT
Are you now building your app against Spark 2.2 or 2.1?
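
If you are not sure what the jar actually pulls in, a quick check like the sketch below (class name is hypothetical) prints the Spark version on the driver classpath by running against a local master, with no cluster involved. A 2.1.x client talking to a 2.2.0 standalone master would produce exactly the "invalid stream header" RPC error in your log.

    import org.apache.spark.SparkConf;
    import org.apache.spark.SparkContext;

    // Sketch: start a throwaway local context and print the version of the
    // Spark jars the driver classpath actually resolved.
    public class SparkVersionCheck {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                    .setMaster("local")           // no cluster involved
                    .setAppName("version-check");
            SparkContext sc = new SparkContext(conf);
            System.out.println("Client Spark version: " + sc.version());
            sc.stop();
        }
    }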

Best Regards
Richard


From: Imran Rajjad <rajjad@gmail.com>
Date: Wednesday, December 6, 2017 at 2:45 AM
To: "user @spark" <user@spark.apache.org>
Subject: unable to connect to cluster 2.2.0

Hi,

I recently upgraded from 2.1.1 to 2.2.0, and my streaming job seems to have broken: the submitted
application is unable to connect to the cluster, even though everything is up and running.

Below is my stack trace:
Spark Master: spark://192.168.10.207:7077
Job Arguments:
-appName orange_watch -directory /u01/watch/stream/
Spark Configuration:
spark.executor.memory: 6g
spark.driver.memory: 4g
spark.app.name: orange_watch
spark.executor.cores: 2
Spark Arguments:
--packages: graphframes:graphframes:0.5.0-spark2.1-s_2.11
Using properties file: /home/my_user/spark-2.2.0-bin-hadoop2.7/conf/spark-defaults.conf
Adding default property: spark.jars.packages=graphframes:graphframes:0.5.0-spark2.1-s_2.11
Parsed arguments:
  master                  spark://192.168.10.207:7077
  deployMode              null
  executorMemory          6g
  executorCores           2
  totalExecutorCores      null
  propertiesFile          /home/my_user/spark-2.2.0-bin-hadoop2.7/conf/spark-defaults.conf
  driverMemory            4g
  driverCores             null
  driverExtraClassPath    null
  driverExtraLibraryPath  null
  driverExtraJavaOptions  null
  supervise               false
  queue                   null
  numExecutors            null
  files                   null
  pyFiles                 null
  archives                null
  mainClass               com.my_user.MainClassWatch
  primaryResource         file:/home/my_user/cluster-testing/job.jar
  name                    orange_watch
  childArgs               [-watchId 3199 -appName orange_watch -directory /u01/watch/stream/]
  jars                    null
  packages                graphframes:graphframes:0.5.0-spark2.1-s_2.11
  packagesExclusions      null
  repositories            null
  verbose                 true
Spark properties used, including those specified through
 --conf and those from the properties file /home/my_user/spark-2.2.0-bin-hadoop2.7/conf/spark-defaults.conf:
  (spark.driver.memory,4g)
  (spark.executor.memory,6g)
  (spark.jars.packages,graphframes:graphframes:0.5.0-spark2.1-s_2.11)
  (spark.app.name,orange_watch)
  (spark.executor.cores,2)

Ivy Default Cache set to: /home/my_user/.ivy2/cache
The jars for the packages stored in: /home/my_user/.ivy2/jars
:: loading settings :: url = jar:file:/home/my_user/spark-2.2.0-bin-hadoop2.7/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
graphframes#graphframes added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
        confs: [default]
        found graphframes#graphframes;0.5.0-spark2.1-s_2.11 in spark-list
        found com.typesafe.scala-logging#scala-logging-api_2.11;2.1.2 in central
        found com.typesafe.scala-logging#scala-logging-slf4j_2.11;2.1.2 in central
        found org.scala-lang#scala-reflect;2.11.0 in central
        found org.slf4j#slf4j-api;1.7.7 in spark-list
:: resolution report :: resolve 191ms :: artifacts dl 5ms
        :: modules in use:
        com.typesafe.scala-logging#scala-logging-api_2.11;2.1.2 from central in [default]
        com.typesafe.scala-logging#scala-logging-slf4j_2.11;2.1.2 from central in [default]
        graphframes#graphframes;0.5.0-spark2.1-s_2.11 from spark-list in [default]
        org.scala-lang#scala-reflect;2.11.0 from central in [default]
        org.slf4j#slf4j-api;1.7.7 from spark-list in [default]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   5   |   0   |   0   |   0   ||   5   |   0   |
        ---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent
        confs: [default]
        0 artifacts copied, 5 already retrieved (0kB/7ms)
Main class:
com.my_user.MainClassWatch
Arguments:
-watchId
3199
-appName
orange_watch
-directory
/u01/watch/stream/
System properties:
(spark.executor.memory,6g)
(spark.driver.memory,4g)
(SPARK_SUBMIT,true)
(spark.jars.packages,graphframes:graphframes:0.5.0-spark2.1-s_2.11)
(spark.app.name,orange_watch)
(spark.jars,file:/home/my_user/.ivy2/jars/graphframes_graphframes-0.5.0-spark2.1-s_2.11.jar,file:/home/my_user/.ivy2/jars/com.typesafe.scala-logging_scala-logging-api_2.11-2.1.2.jar,file:/home/my_user/.ivy2/jars/com.typesafe.scala-logging_scala-logging-slf4j_2.11-2.1.2.jar,file:/home/my_user/.ivy2/jars/org.scala-lang_scala-reflect-2.11.0.jar,file:/home/my_user/.ivy2/jars/org.slf4j_slf4j-api-1.7.7.jar,file:/home/my_user/cluster-testing/job.jar)
(spark.submit.deployMode,client)
(spark.master,spark://192.168.10.207:7077)
(spark.executor.cores,2)
Classpath elements:
file:/home/my_user/cluster-testing/job.jar
/home/my_user/.ivy2/jars/graphframes_graphframes-0.5.0-spark2.1-s_2.11.jar
/home/my_user/.ivy2/jars/com.typesafe.scala-logging_scala-logging-api_2.11-2.1.2.jar
/home/my_user/.ivy2/jars/com.typesafe.scala-logging_scala-logging-slf4j_2.11-2.1.2.jar
/home/my_user/.ivy2/jars/org.scala-lang_scala-reflect-2.11.0.jar
/home/my_user/.ivy2/jars/org.slf4j_slf4j-api-1.7.7.jar

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/12/06 12:36:50 INFO SparkContext: Running Spark version 2.2.0
17/12/06 12:36:50 INFO SparkContext: Submitted application: orange_watch
17/12/06 12:36:50 INFO SecurityManager: Changing view acls to: my_user
17/12/06 12:36:50 INFO SecurityManager: Changing modify acls to: my_user
17/12/06 12:36:50 INFO SecurityManager: Changing view acls groups to:
17/12/06 12:36:50 INFO SecurityManager: Changing modify acls groups to:
17/12/06 12:36:50 INFO SecurityManager: SecurityManager: authentication disabled; ui acls
disabled; users  with view permissions: Set(my_user); groups with view permissions: Set();
users  with modify permissions: Set(my_user); groups with modify permissions: Set()
17/12/06 12:36:50 INFO Utils: Successfully started service 'sparkDriver' on port 37329.
17/12/06 12:36:50 INFO SparkEnv: Registering MapOutputTracker
17/12/06 12:36:50 INFO SparkEnv: Registering BlockManagerMaster
17/12/06 12:36:50 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper
for getting topology information
17/12/06 12:36:50 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/12/06 12:36:50 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-f441dcc1-71c8-437e-ad7d-1057ab2b0f87
17/12/06 12:36:50 INFO MemoryStore: MemoryStore started with capacity 2004.6 MB
17/12/06 12:36:50 INFO SparkEnv: Registering OutputCommitCoordinator
17/12/06 12:36:50 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/12/06 12:36:50 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.10.207:4040
17/12/06 12:36:50 INFO SparkContext: Added JAR file:/home/my_user/.ivy2/jars/graphframes_graphframes-0.5.0-spark2.1-s_2.11.jar
at spark://192.168.10.207:37329/jars/graphframes_graphframes-0.5.0-spark2.1-s_2.11.jar
with timestamp 1512545810823
17/12/06 12:36:50 INFO SparkContext: Added JAR file:/home/my_user/.ivy2/jars/com.typesafe.scala-logging_scala-logging-api_2.11-2.1.2.jar
at spark://192.168.10.207:37329/jars/com.typesafe.scala-logging_scala-logging-api_2.11-2.1.2.jar
with timestamp 1512545810824
17/12/06 12:36:50 INFO SparkContext: Added JAR file:/home/my_user/.ivy2/jars/com.typesafe.scala-logging_scala-logging-slf4j_2.11-2.1.2.jar
at spark://192.168.10.207:37329/jars/com.typesafe.scala-logging_scala-logging-slf4j_2.11-2.1.2.jar
with timestamp 1512545810824
17/12/06 12:36:50 INFO SparkContext: Added JAR file:/home/my_user/.ivy2/jars/org.scala-lang_scala-reflect-2.11.0.jar
at spark://192.168.10.207:37329/jars/org.scala-lang_scala-reflect-2.11.0.jar
with timestamp 1512545810824
17/12/06 12:36:50 INFO SparkContext: Added JAR file:/home/my_user/.ivy2/jars/org.slf4j_slf4j-api-1.7.7.jar
at spark://192.168.10.207:37329/jars/org.slf4j_slf4j-api-1.7.7.jar
with timestamp 1512545810824
17/12/06 12:36:50 INFO SparkContext: Added JAR file:/home/my_user/cluster-testing/job.jar
at spark://192.168.10.207:37329/jars/job.jar
with timestamp 1512545810824
17/12/06 12:36:50 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://192.168.10.207:7077...
17/12/06 12:36:50 INFO TransportClientFactory: Successfully created connection to /192.168.10.207:7077
after 16 ms (0 ms spent in bootstraps)
17/12/06 12:36:50 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master 192.168.10.207:7077
org.apache.spark.SparkException: Exception thrown in awaitResult:
        at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
        at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
        at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:100)
        at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:108)
        at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:106)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: java.io.StreamCorruptedException: invalid stream header:
01000E31
        at java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:857)
        at java.io.ObjectInputStream.<init>(ObjectInputStream.java:349)
        at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.<init>(JavaSerializer.scala:63)
        at org.apache.spark.serializer.JavaDeserializationStream.<init>(JavaSerializer.scala:63)
        at org.apache.spark.serializer.JavaSerializerInstance.deserializeStream(JavaSerializer.scala:122)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:107)
        at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:259)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
        at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:308)
        at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:258)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
        at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:257)
        at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:577)
        at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:562)
        at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:159)
        at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:107)
        at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
        at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
        at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
        at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:652)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:575)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:489)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:451)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
        at java.lang.Thread.run(Thread.java:748)
        at org.apache.spark.network.client.TransportResponseHandler.handle(TransportResponseHandler.java:207)
        at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:120)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
        at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
        at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
        at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
        ... 1 more
17/12/06 12:37:10 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://192.168.10.207:7077...
17/12/06 12:37:10 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master 192.168.10.207:7077
org.apache.spark.SparkException: Exception thrown in awaitResult:
        [stack trace identical to the first attempt above]
17/12/06 12:37:30 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://192.168.10.207:7077...
17/12/06 12:37:30 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master 192.168.10.207:7077
org.apache.spark.SparkException: Exception thrown in awaitResult:
        [stack trace identical to the first attempt above]
17/12/06 12:37:50 ERROR StandaloneSchedulerBackend: Application has been killed. Reason: All
masters are unresponsive! Giving up.
17/12/06 12:37:50 WARN StandaloneSchedulerBackend: Application ID is not initialized yet.
17/12/06 12:37:50 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService'
on port 42644.
17/12/06 12:37:50 INFO NettyBlockTransferService: Server created on 192.168.10.207:42644
17/12/06 12:37:50 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy
for block replication policy
17/12/06 12:37:50 INFO SparkUI: Stopped Spark web UI at http://192.168.10.207:4040
17/12/06 12:37:50 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver,
192.168.10.207, 42644, None)
17/12/06 12:37:50 INFO StandaloneSchedulerBackend: Shutting down all executors
17/12/06 12:37:50 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.10.207:42644
with 2004.6 MB RAM, BlockManagerId(driver, 192.168.10.207, 42644, None)
17/12/06 12:37:50 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking each executor
to shut down
17/12/06 12:37:50 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver,
192.168.10.207, 42644, None)
17/12/06 12:37:50 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.10.207,
42644, None)
17/12/06 12:37:50 WARN StandaloneAppClient$ClientEndpoint: Drop UnregisterApplication(null)
because has not yet connected to master
17/12/06 12:37:50 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/12/06 12:37:50 INFO MemoryStore: MemoryStore cleared
17/12/06 12:37:50 INFO BlockManager: BlockManager stopped
17/12/06 12:37:50 INFO BlockManagerMaster: BlockManagerMaster stopped
17/12/06 12:37:50 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator
stopped!
17/12/06 12:37:50 ERROR SparkContext: Error initializing SparkContext.
java.lang.NullPointerException
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:567)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2509)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:909)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:901)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:901)
        at com.my_user.MainClass.setCluster(MainClass.java:150)
        at com.my_user.MainClass.initSpark(MainClass.java:69)
        at com.my_user.MainClassWatch.main(MainClassWatch.java:25)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/12/06 12:37:50 INFO SparkContext: SparkContext already stopped.
17/12/06 12:37:50 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" java.lang.NullPointerException
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:567)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2509)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:909)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:901)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:901)
        at com.my_user.MainClass.setCluster(MainClass.java:150)
        at com.my_user.MainClass.initSpark(MainClass.java:69)
        at com.my_user.MainClassWatch.main(MainClassWatch.java:25)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/12/06 12:37:50 INFO ShutdownHookManager: Shutdown hook called
17/12/06 12:37:50 INFO ShutdownHookManager: Deleting directory /tmp/spark-f5329cbc-7de3-40ab-a4a2-9942e7b33815
Finished! Exit code:1

Reverting back to 2.1.1 removes the issue.
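
In case it helps anyone hitting the same thing, here is a rough sketch (hypothetical class name) to confirm which spark-core jar the driver actually loaded, since a 2.1.x jar on the client side against a 2.2.0 master matches the "invalid stream header" failure above:

    // Sketch: print where the SparkContext class was loaded from; the jar
    // path/name shows which Spark version the driver is really running.
    public class WhereIsSparkCore {
        public static void main(String[] args) {
            System.out.println(org.apache.spark.SparkContext.class
                    .getProtectionDomain().getCodeSource().getLocation());
        }
    }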

--
I.R