spark-issues mailing list archives

From "Dongjoon Hyun (Jira)" <j...@apache.org>
Subject [jira] [Updated] (SPARK-29957) Reset MiniKDC's default enctypes to fit jdk8/jdk11
Date Fri, 06 Dec 2019 07:15:00 GMT

     [ https://issues.apache.org/jira/browse/SPARK-29957?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-29957:
----------------------------------
    Description: 
MiniKdc versions shipped with Hadoop releases older than 3.0 do not work well on JDK 11.
New encryption types, aes128-cts-hmac-sha256-128 and aes256-cts-hmac-sha384-192 (for Kerberos 5),
were added in Java 11 and are enabled by default, while the pre-3.0.0 MiniKdc used by Spark does
not support them and does not work well when these encryption types are enabled, which results in
authentication failure.

-----
Hadoop JIRA: https://issues.apache.org/jira/browse/HADOOP-12911
In that JIRA, the author proposes replacing the original Apache Directory project, which is no
longer maintained (though the JIRA does not claim it fails on JDK 11), with Apache Kerby, a Java
Kerberos implementation that keeps pace with newer Java versions.

In Flink, apache/flink#9622 explains why hadoop-2.7.2's MiniKdc fails with JDK 11:
the new encryption types aes128-cts-hmac-sha256-128 and aes256-cts-hmac-sha384-192 (for
Kerberos 5), enabled by default, were added in Java 11.
Spark with hadoop-2.7's MiniKdc does not support these encryption types and does not work well
when they are enabled, which results in authentication failure.

When I tested hadoop-2.7.2's MiniKdc locally, the Kerberos debug output showed the error
"read message stream failed, message can't match".
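
For reference, a hedged note on how that debug message can be surfaced: the JDK's Kerberos debug
logging can be switched on before the test touches Kerberos, which is equivalent to passing
-Dsun.security.krb5.debug=true to the JVM.

{code:scala}
// Enable JDK Kerberos (and JGSS) debug output so failures such as
// "read message stream failed, message can't match" are printed to stdout.
System.setProperty("sun.security.krb5.debug", "true")
System.setProperty("sun.security.jgss.debug", "true")
{code}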

  was:
MiniKdc versions shipped with Hadoop releases older than 3.0 do not work well on JDK 11.
New encryption types, aes128-cts-hmac-sha256-128 and aes256-cts-hmac-sha384-192 (for Kerberos 5),
were added in Java 11 and are enabled by default, while the pre-3.0.0 MiniKdc used by Spark does
not support them and does not work well when these encryption types are enabled, which results in
authentication failure.


> Reset MiniKDC's default enctypes to fit jdk8/jdk11
> --------------------------------------------------
>
>                 Key: SPARK-29957
>                 URL: https://issues.apache.org/jira/browse/SPARK-29957
>             Project: Spark
>          Issue Type: Improvement
>          Components: Tests
>    Affects Versions: 3.0.0
>            Reporter: angerszhu
>            Assignee: angerszhu
>            Priority: Major
>             Fix For: 3.0.0
>
>
> MiniKdc versions shipped with Hadoop releases older than 3.0 do not work well on JDK 11.
> New encryption types, aes128-cts-hmac-sha256-128 and aes256-cts-hmac-sha384-192 (for Kerberos 5),
> were added in Java 11 and are enabled by default, while the pre-3.0.0 MiniKdc used by Spark does
> not support them and does not work well when these encryption types are enabled, which results
> in authentication failure.
> -----
> Hadoop JIRA: https://issues.apache.org/jira/browse/HADOOP-12911
> In that JIRA, the author proposes replacing the original Apache Directory project, which is no
> longer maintained (though the JIRA does not claim it fails on JDK 11), with Apache Kerby, a Java
> Kerberos implementation that keeps pace with newer Java versions.
> In Flink, apache/flink#9622 explains why hadoop-2.7.2's MiniKdc fails with JDK 11:
> the new encryption types aes128-cts-hmac-sha256-128 and aes256-cts-hmac-sha384-192 (for
> Kerberos 5), enabled by default, were added in Java 11.
> Spark with hadoop-2.7's MiniKdc does not support these encryption types and does not work well
> when they are enabled, which results in authentication failure.
> When I tested hadoop-2.7.2's MiniKdc locally, the Kerberos debug output showed the error
> "read message stream failed, message can't match".



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

