spark-user mailing list archives

From Gabor Somogyi <gabor.g.somo...@gmail.com>
Subject Re: How to disable 'spark.security.credentials.${service}.enabled' in Structured streaming while connecting to a kafka cluster
Date Fri, 10 Jan 2020 11:00:54 GMT
Hi,

Please open a JIRA, attach the Spark application with its configuration and
the logs you have, and pull me in.
I'm going to check it...

All in all, this happens when "sasl.jaas.config" is set on a consumer.
Presumably Spark somehow obtained a token and set the mentioned property.
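
For reference, a minimal sketch of the setup under discussion (broker
addresses, topic, group id, and credentials below are placeholders, not
values from this thread): delegation tokens disabled and the JAAS config
passed straight through to the Kafka consumer.

```scala
// Hypothetical sketch only; placeholder names throughout.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("kafka-no-delegation-token")
  // Disable Spark's Kafka delegation token provider.
  .config("spark.security.credentials.kafka.enabled", "false")
  .getOrCreate()

val df = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "broker1:9093")
  .option("subscribe", "my-topic")
  // Custom consumer group id (supported as of 3.0.0-preview).
  .option("kafka.group.id", "my-group")
  // JAAS config handed directly to the consumer; if Spark has already
  // obtained a delegation token, it sets this property itself, which is
  // the situation described above.
  .option("kafka.security.protocol", "SASL_SSL")
  .option("kafka.sasl.mechanism", "PLAIN")
  .option("kafka.sasl.jaas.config",
    """org.apache.kafka.common.security.plain.PlainLoginModule required username="user" password="secret";""")
  .load()
```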

BR,
G


On Wed, Jan 8, 2020 at 3:37 PM act_coder <accthon@gmail.com> wrote:

> I am trying to read data from a secured Kafka cluster using Spark
> Structured Streaming. I am using the library
> "spark-sql-kafka-0-10_2.12":"3.0.0-preview" to read the data, since it
> supports specifying a custom group id (instead of Spark setting its own
> group id).
>
> Dependency used in code:
>
>     <dependency>
>         <groupId>org.apache.spark</groupId>
>         <artifactId>spark-sql-kafka-0-10_2.12</artifactId>
>         <version>3.0.0-preview</version>
>     </dependency>
>
> I am getting the below error - even after specifying the required JAAS
> configuration in spark options.
>
> Caused by: java.lang.IllegalArgumentException: requirement failed:
> Delegation token must exist for this connector.
>     at scala.Predef$.require(Predef.scala:281)
>     at org.apache.spark.kafka010.KafkaTokenUtil$.isConnectorUsingCurrentToken(KafkaTokenUtil.scala:299)
>     at org.apache.spark.sql.kafka010.KafkaDataConsumer.getOrRetrieveConsumer(KafkaDataConsumer.scala:533)
>     at org.apache.spark.sql.kafka010.KafkaDataConsumer.$anonfun$get$1(KafkaDataConsumer.scala:275)
>
>
> The following documentation states that obtaining a delegation token can
> be disabled -
>
> https://spark.apache.org/docs/3.0.0-preview/structured-streaming-kafka-integration.html
>
> I tried setting the property spark.security.credentials.kafka.enabled to
> false in the Spark config, but it still fails with the same error.
>
> --
> Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscribe@spark.apache.org
>
>
