spark-issues mailing list archives

From "Hyukjin Kwon (Jira)" <j...@apache.org>
Subject [jira] [Resolved] (SPARK-25057) Unable to start spark on master URL
Date Tue, 08 Oct 2019 05:42:13 GMT

     [ https://issues.apache.org/jira/browse/SPARK-25057?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-25057.
----------------------------------
    Resolution: Incomplete

> Unable to start spark on master URL
> -----------------------------------
>
>                 Key: SPARK-25057
>                 URL: https://issues.apache.org/jira/browse/SPARK-25057
>             Project: Spark
>          Issue Type: Question
>          Components: Java API
>    Affects Versions: 2.2.2
>         Environment: Spring-boot, Spark 2.2.2, Cassandra 3.5.1
>            Reporter: Shivam Gupta
>            Priority: Major
>              Labels: bulk-closed
>
> I am building a REST microservice with Spark and Cassandra. When I set the Spark master to local, it runs fine.
> But when I set the Spark master URL to " spark://ip:7077 ", the REST service fails at startup with the following error:
> {code:java}
> Caused by: java.io.IOException: Failed to send RPC 8950209836630764258 to /98.8.150.125:7077: java.lang.AbstractMethodError: org.apache.spark.network.protocol.MessageWithHeader.touch(Ljava/lang/Object;)Lio/netty/util/ReferenceCounted;
> at org.apache.spark.network.client.TransportClient.lambda$sendRpc$2(TransportClient.java:237) ~[spark-network-common_2.11-2.2.2.jar!/:2.2.2]
> at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:507) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.util.concurrent.DefaultPromise.notifyListeners0(DefaultPromise.java:500) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:479) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:420) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:122) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.util.internal.PromiseNotificationUtil.tryFailure(PromiseNotificationUtil.java:64) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.channel.AbstractChannelHandlerContext.notifyOutboundHandlerException(AbstractChannelHandlerContext.java:837) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:740) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:730) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:816) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:723) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.handler.timeout.IdleStateHandler.write(IdleStateHandler.java:305) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:738) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:730) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.channel.AbstractChannelHandlerContext.access$1900(AbstractChannelHandlerContext.java:38) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.write(AbstractChannelHandlerContext.java:1089) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.channel.AbstractChannelHandlerContext$WriteAndFlushTask.write(AbstractChannelHandlerContext.java:1136) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.run(AbstractChannelHandlerContext.java:1078) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:462) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> ... 1 common frames omitted
> Caused by: java.lang.AbstractMethodError: org.apache.spark.network.protocol.MessageWithHeader.touch(Ljava/lang/Object;)Lio/netty/util/ReferenceCounted;
> at io.netty.util.ReferenceCountUtil.touch(ReferenceCountUtil.java:73) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.channel.DefaultChannelPipeline.touch(DefaultChannelPipeline.java:107) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:810) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:723) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:111) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:738) ~[netty-all-4.1.10.Final.jar!/:4.1.10.Final]
> ... 16 common frames omitted
> {code}
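As a quick syntax sanity check: a standalone master URL has the form spark://&lt;host&gt;:&lt;port&gt; (7077 by default), and plain java.net.URI can pick it apart. Note the failure above happens later, at the RPC layer, so a well-formed URL alone does not rule this error out (illustrative host below, matching the trace):

```java
import java.net.URI;

// Illustrative only: verifying the spark://<host>:<port> shape with the JDK
// before handing the URL to SparkConf. The AbstractMethodError in this
// report is unrelated to URL syntax.
public class MasterUrlCheck {
    public static void main(String[] args) {
        URI master = URI.create("spark://98.8.150.125:7077");
        System.out.println(master.getScheme()); // spark
        System.out.println(master.getHost());   // 98.8.150.125
        System.out.println(master.getPort());   // 7077
    }
}
```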
> I am using the following Spark and Cassandra dependencies in my REST service:
> {code:java}
> <dependency>
>     <groupId>org.apache.spark</groupId>
>     <artifactId>spark-core_2.11</artifactId>
>     <version>${spark.version}</version>
> </dependency>
> <dependency>
>     <groupId>org.apache.spark</groupId>
>     <artifactId>spark-sql_2.11</artifactId>
>     <version>${spark.version}</version>
> </dependency>
> <dependency>
>     <groupId>com.datastax.spark</groupId>
>     <artifactId>spark-cassandra-connector_2.11</artifactId>
>     <version>${spark.cassandra.connector.version}</version>
> </dependency>
> <dependency>
>     <groupId>com.datastax.spark</groupId>
>     <artifactId>spark-cassandra-connector-java_2.11</artifactId>
>     <version>${spark.cassandra.connector.java.version}</version>
> </dependency>
> {code}
> I also tried setting the Spark master URL in spark-env.sh in the Spark conf directory, but it made no difference. Has anyone faced a similar issue before? Any help is appreciated.
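For what it's worth, spark-env.sh configures the standalone daemons themselves rather than an application's master URL, which may be why editing it had no effect here. The relevant variables in that file look like this (illustrative values, assuming the master runs on the host from the trace):

```shell
# conf/spark-env.sh on the machine running the standalone master (illustrative)
SPARK_MASTER_HOST=98.8.150.125   # address the master binds to and advertises
SPARK_MASTER_PORT=7077           # default RPC port for spark:// URLs
```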



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

