[ https://issues.apache.org/jira/browse/SPARK-25983?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sean R. Owen resolved SPARK-25983.
----------------------------------
Resolution: Duplicate
We have long since updated the Kafka client, which is now at 2.3.0, so this will be a duplicate of one of the earlier client-upgrade issues.
> spark-sql-kafka-0-10 no longer works with Kafka 0.10.0
> ------------------------------------------------------
>
> Key: SPARK-25983
> URL: https://issues.apache.org/jira/browse/SPARK-25983
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.4.0
> Reporter: Alexander Bessonov
> Priority: Minor
>
> The package {{org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.0}} is no longer compatible
with {{org.apache.kafka:kafka_2.11:0.10.0.1}}.
> When both packages are used in the same project, the following exception occurs:
> {code:java}
> java.lang.NoClassDefFoundError: org/apache/kafka/common/protocol/SecurityProtocol
> at kafka.server.Defaults$.<init>(KafkaConfig.scala:125)
> at kafka.server.Defaults$.<clinit>(KafkaConfig.scala)
> at kafka.log.Defaults$.<init>(LogConfig.scala:33)
> at kafka.log.Defaults$.<clinit>(LogConfig.scala)
> at kafka.log.LogConfig$.<init>(LogConfig.scala:152)
> at kafka.log.LogConfig$.<clinit>(LogConfig.scala)
> at kafka.server.KafkaConfig$.<init>(KafkaConfig.scala:265)
> at kafka.server.KafkaConfig$.<clinit>(KafkaConfig.scala)
> at kafka.server.KafkaConfig.<init>(KafkaConfig.scala:759)
> at kafka.server.KafkaConfig.<init>(KafkaConfig.scala:761)
> {code}
>
> This exception is caused by an incompatible transitive dependency pulled in by Spark: {{org.apache.kafka:kafka-clients:2.0.0}}.
>
> The following sbt workaround resolved the problem in my project:
> {code:scala}
> dependencyOverrides += "org.apache.kafka" % "kafka-clients" % "0.10.0.1"
> {code}
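> For context, a minimal build.sbt sketch showing where such an override sits; a build-definition fragment, not runnable on its own. The versions are those from this report, and the exact dependency lines are assumed from the coordinates mentioned above:
> {code:scala}
> // build.sbt — sketch only; coordinates/versions taken from this report
> scalaVersion := "2.11.12"
>
> libraryDependencies ++= Seq(
>   "org.apache.spark" %% "spark-sql-kafka-0-10" % "2.4.0",
>   "org.apache.kafka" %% "kafka" % "0.10.0.1"
> )
>
> // Force the transitive kafka-clients back to the broker-compatible version
> dependencyOverrides += "org.apache.kafka" % "kafka-clients" % "0.10.0.1"
> {code}
> Note that {{dependencyOverrides}} only pins the version when the artifact is already on the classpath transitively; it does not add a new dependency.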
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org