Egor Pahomov created SPARK-15409:
------------------------------------
Summary: Can't run the Spark ODBC server on YARN the way I did on 1.6.1
Key: SPARK-15409
URL: https://issues.apache.org/jira/browse/SPARK-15409
Project: Spark
Issue Type: Bug
Components: YARN
Affects Versions: 2.0.0
Reporter: Egor Pahomov
Priority: Minor
I'm getting the following error while Spark tries to run the application in YARN:
{code}
16/05/19 10:33:20 INFO yarn.Client: Application report for application_1463075121059_12094 (state: ACCEPTED)
16/05/19 10:33:21 INFO yarn.Client: Application report for application_1463075121059_12094 (state: ACCEPTED)
16/05/19 10:33:22 INFO yarn.Client: Application report for application_1463075121059_12094 (state: ACCEPTED)
16/05/19 10:33:23 WARN server.TransportChannelHandler: Exception in connection from nod5-1-hadoop.anchorfree.net/192.168.12.128:56327
java.lang.NoSuchMethodError: java.util.concurrent.ConcurrentHashMap.keySet()Ljava/util/concurrent/ConcurrentHashMap$KeySetView;
at org.apache.spark.rpc.netty.Dispatcher.postToAll(Dispatcher.scala:107)
at org.apache.spark.rpc.netty.NettyRpcHandler.channelActive(NettyRpcEnv.scala:618)
at org.apache.spark.network.server.TransportRequestHandler.channelActive(TransportRequestHandler.java:86)
at org.apache.spark.network.server.TransportChannelHandler.channelActive(TransportChannelHandler.java:89)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelActive(AbstractChannelHandlerContext.java:183)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelActive(AbstractChannelHandlerContext.java:169)
at io.netty.channel.ChannelInboundHandlerAdapter.channelActive(ChannelInboundHandlerAdapter.java:64)
at io.netty.handler.timeout.IdleStateHandler.channelActive(IdleStateHandler.java:251)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelActive(AbstractChannelHandlerContext.java:183)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelActive(AbstractChannelHandlerContext.java:169)
at io.netty.channel.ChannelInboundHandlerAdapter.channelActive(ChannelInboundHandlerAdapter.java:64)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelActive(AbstractChannelHandlerContext.java:183)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelActive(AbstractChannelHandlerContext.java:169)
at io.netty.channel.ChannelInboundHandlerAdapter.channelActive(ChannelInboundHandlerAdapter.java:64)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelActive(AbstractChannelHandlerContext.java:183)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelActive(AbstractChannelHandlerContext.java:169)
at io.netty.channel.DefaultChannelPipeline.fireChannelActive(DefaultChannelPipeline.java:817)
at io.netty.channel.AbstractChannel$AbstractUnsafe.register0(AbstractChannel.java:454)
at io.netty.channel.AbstractChannel$AbstractUnsafe.access$100(AbstractChannel.java:378)
at io.netty.channel.AbstractChannel$AbstractUnsafe$1.run(AbstractChannel.java:424)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
at java.lang.Thread.run(Thread.java:745)
16/05/19 10:33:23 WARN channel.DefaultChannelPipeline: An exception was thrown by a user handler's exceptionCaught() method while handling the following exception:
java.lang.NoSuchMethodError: java.util.concurrent.ConcurrentHashMap.keySet()Ljava/util/concurrent/ConcurrentHashMap$KeySetView;
at org.apache.spark.rpc.netty.Dispatcher.postToAll(Dispatcher.scala:107)
at org.apache.spark.rpc.netty.NettyRpcHandler.channelActive(NettyRpcEnv.scala:618)
at org.apache.spark.network.server.TransportRequestHandler.channelActive(TransportRequestHandler.java:86)
at org.apache.spark.network.server.TransportChannelHandler.channelActive(TransportChannelHandler.java:89)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelActive(AbstractChannelHandlerContext.java:183)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelActive(AbstractChannelHandlerContext.java:169)
at io.netty.channel.ChannelInboundHandlerAdapter.channelActive(ChannelInboundHandlerAdapter.java:64)
at io.netty.handler.timeout.IdleStateHandler.channelActive(IdleStateHandler.java:251)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelActive(AbstractChannelHandlerContext.java:183)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelActive(AbstractChannelHandlerContext.java:169)
at io.netty.channel.ChannelInboundHandlerAdapter.channelActive(ChannelInboundHandlerAdapter.java:64)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelActive(AbstractChannelHandlerContext.java:183)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelActive(AbstractChannelHandlerContext.java:169)
at io.netty.channel.ChannelInboundHandlerAdapter.channelActive(ChannelInboundHandlerAdapter.java:64)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelActive(AbstractChannelHandlerContext.java:183)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelActive(AbstractChannelHandlerContext.java:169)
at io.netty.channel.DefaultChannelPipeline.fireChannelActive(DefaultChannelPipeline.java:817)
at io.netty.channel.AbstractChannel$AbstractUnsafe.register0(AbstractChannel.java:454)
at io.netty.channel.AbstractChannel$AbstractUnsafe.access$100(AbstractChannel.java:378)
at io.netty.channel.AbstractChannel$AbstractUnsafe$1.run(AbstractChannel.java:424)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
at java.lang.Thread.run(Thread.java:745)
{code}
My config:
{code}
# Default system properties included when running spark-submit.
# This is useful for setting default environmental settings.
# Example:
# spark.master spark://master:7077
# spark.eventLog.enabled true
# spark.eventLog.dir hdfs://namenode:8021/directory
spark.serializer org.apache.spark.serializer.KryoSerializer
# spark.driver.memory 5g
# spark.executor.extraJavaOptions -XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"
spark.master yarn-client
spark.app.name spark-odbc-server-vip
spark.yarn.queue spark.client.usual
spark.executor.memory 14g
spark.yarn.executor.memoryOverhead 2000
spark.executor.cores 4
spark.driver.memory 12g
spark.yarn.driver.memoryOverhead 3000
spark.sql.autoBroadcastJoinThreshold 200485760
spark.network.timeout 400s
spark.driver.maxResultSize 3g
spark.driver.cores 2
spark.yarn.am.cores 2
spark.akka.frameSize 500
spark.akka.askTimeout 300
spark.kryoserializer.buffer.max 1200m
spark.scheduler.mode FAIR
spark.sql.broadcastTimeout 20000
{code}
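If it helps triage: the NoSuchMethodError above ({{ConcurrentHashMap.keySet()Ljava/util/concurrent/ConcurrentHashMap$KeySetView;}}) is the classic symptom of jars compiled with JDK 8 being loaded by a Java 7 JRE inside the YARN containers. One workaround sketch, assuming the nodes actually have a Java 8 install at the same path used for the build, is to point the AM and executors at it explicitly via the standard env-forwarding properties:
{code}
spark.yarn.appMasterEnv.JAVA_HOME /usr/lib/jvm/java-8-oracle
spark.executorEnv.JAVA_HOME /usr/lib/jvm/java-8-oracle
{code}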
The way I start my Spark ODBC server:
{code}
export HIVE_SERVER2_THRIFT_BIND_HOST=192.168.12.26; \
export HIVE_SERVER2_THRIFT_PORT=40504; \
export HADOOP_CONF_DIR=/home/egor/hadoop/conf; \
nohup ./sbin/start-thriftserver.sh;
{code}
The way I build Spark:
{code}
cd spark
export JAVA_HOME="/usr/lib/jvm/java-8-oracle"
export MAVEN_OPTS="-Xmx4g -XX:MaxPermSize=2g -XX:ReservedCodeCacheSize=2g"
build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -DskipTests clean package
{code}
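One way to check what the build actually produced (a standalone diagnostic sketch, not Spark code) is to read the bytecode major version from a .class file extracted from the assembly jar: 52 means the class targets Java 8, 51 means Java 7. If the classes are version 52 and any YARN node runs a Java 7 JRE, that mismatch alone would explain the error.
{code}
import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

// Reads the bytecode "major version" field from a .class file header:
// 52 = compiled for Java 8, 51 = Java 7, 50 = Java 6.
public class ClassVersion {
    public static int majorVersion(String path) throws IOException {
        try (DataInputStream in = new DataInputStream(new FileInputStream(path))) {
            if (in.readInt() != 0xCAFEBABE) { // class file magic number
                throw new IOException("not a class file: " + path);
            }
            in.readUnsignedShort();        // minor version
            return in.readUnsignedShort(); // major version
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(majorVersion(args[0]));
    }
}
{code}
{{javap -verbose}} prints the same "major version" line if javap is on the PATH.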
What bothers me the most: the same workflow worked fine on Spark 1.6.1.
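My current suspicion is the JDK 8 vs JRE 7 binary incompatibility around {{ConcurrentHashMap.keySet()}}. A minimal, hypothetical repro (not Spark code) of why the descriptor in the stack trace appears:
{code}
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

// Illustrates the binary incompatibility behind the NoSuchMethodError.
public class KeySetBinding {
    public static void main(String[] args) {
        ConcurrentHashMap<String, Integer> chm = new ConcurrentHashMap<>();
        chm.put("a", 1);

        // Compiled with JDK 8, this call is emitted against the descriptor
        // ()Ljava/util/concurrent/ConcurrentHashMap$KeySetView; because JDK 8
        // narrowed the return type of ConcurrentHashMap.keySet(). A Java 7
        // runtime has no method with that descriptor -> NoSuchMethodError.
        Set<String> direct = chm.keySet();

        // Calling through the Map interface keeps the Java-7-compatible
        // descriptor ()Ljava/util/Set; and links fine on either runtime.
        Map<String, Integer> asMap = chm;
        Set<String> viaInterface = asMap.keySet();

        System.out.println(direct.contains("a") && viaInterface.contains("a"));
    }
}
{code}
So either Dispatcher.scala needs the interface-typed call, or the cluster JRE needs to match the build JDK.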
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)