kafka-users mailing list archives

From Hugo Reinwald <hugo.reinw...@gmail.com>
Subject Spark Executor - jaas.conf with useTicketCache=true
Date Tue, 19 Sep 2017 06:45:06 GMT
Hi All,

I am connecting to a secured Kafka cluster from Spark. My jaas.conf looks
like this -
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useTicketCache=true
  keyTab="./user.keytab"
  principal="user@EXAMPLE.COM";
};
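For comparison, a ticket-cache-only KafkaClient section would look roughly
like the sketch below. It assumes a ticket has already been obtained with
kinit; doNotPrompt=true stops the login module from asking for a password.
(Note also that Krb5LoginModule only reads the keyTab option when
useKeyTab=true is set, which the config above does not have.)

```
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useTicketCache=true
  doNotPrompt=true;
};
```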

export KAFKA_OPTS="-Djava.security.auth.login.config=/home/user/jaas.conf"

I tested connectivity using kafka-console-consumer and was able to read
data from the Kafka topic. However, when I used the same configuration in
spark-submit with the options below, I got a Kerberos error -

spark-submit .... --files jaas.conf#jaas.conf \
  --driver-java-options "-Djava.security.auth.login.config=./jaas.conf" \
  --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=./jaas.conf" ....

*Could not login: the client is being asked for a password, but the Kafka
client code does not currently support obtaining a password from the user.
not available to garner authentication information from the user*
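If the keytab route is required, one variant I could try is the sketch
below - it assumes jaas.conf also has useKeyTab=true set, and ships the
keytab itself via --files so the relative ./user.keytab path resolves in
each executor's working directory:

```shell
# Sketch: ship both jaas.conf and the keytab so the relative
# ./user.keytab path in jaas.conf resolves on every executor.
spark-submit \
  --files /home/user/jaas.conf,/home/user/user.keytab \
  --driver-java-options "-Djava.security.auth.login.config=./jaas.conf" \
  --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=./jaas.conf" \
  ...
```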

My question - can we not use the Spark executor's ticket cache (Spark is
running the job as "user")? Do we always need to ship the keytab file via
--files as well? I also tested with --principal user@EXAMPLE.COM --keytab
<file>, but still got the same error. Is there any way to use the ticket
cache from the Spark executors for Kafka?

PS - I read this link -
https://docs.confluent.io/2.0.0/kafka/sasl.html#kerberos
- which says *"For command-line utilities like kafka-console-consumer or
kafka-console-producer, kinit can be used along with useTicketCache=true"*
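The console-tool workflow that doc describes is roughly this (a sketch -
the broker address and topic name are placeholders, and the consumer also
needs security.protocol set for SASL, e.g. via --consumer.config):

```shell
# Sketch: obtain a ticket, point the tools at jaas.conf, then consume.
kinit user@EXAMPLE.COM
export KAFKA_OPTS="-Djava.security.auth.login.config=/home/user/jaas.conf"
kafka-console-consumer --bootstrap-server broker1.example.com:9092 --topic test
```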

Not sure if this is by design or if I am missing something.

Thanks,
Hugo
