spark-issues mailing list archives

From "M. Le Bihan (JIRA)" <j...@apache.org>
Subject [jira] [Comment Edited] (SPARK-24417) Build and Run Spark on JDK11
Date Fri, 22 Feb 2019 09:59:00 GMT

    [ https://issues.apache.org/jira/browse/SPARK-24417?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16774954#comment-16774954 ]

M. Le Bihan edited comment on SPARK-24417 at 2/22/19 9:58 AM:
--------------------------------------------------------------

It is becoming really troublesome to see Java 12 arrive in a few weeks while _Spark_, otherwise an impressive piece of technology, is still held to a JVM from 2014. I have three questions, please:

1) Which version of Spark will become compatible with Java 11? 2.4.1, 2.4.2 or 3.0.0?

2) If Java 11 compatibility is postponed to Spark 3.0.0, when is Spark 3.0.0 planned to be released?

3) Will Spark then become fully compatible with standard, plain Java, or will it keep the kind of low-level system programming that puts it at risk? In a word: will it run into the same troubles with Java 12, 13 and 14?

Since the arrival of Java 9, then Java 11, and now with Java 12 at the door, 18 months have passed. Could we have a date when Java 11 (and Java 12) compatibility will be available, please?
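Part of the version confusion here stems from the JVM's own version reporting having changed in JDK 9 (JDK 8 reports "1.8", while JDK 9 and later report "9", "11", and so on). As a purely illustrative sketch, not part of this ticket, a build or launcher script might normalise the version like this (class and method names are mine):

```java
// Hypothetical sketch: detect the JVM major version before enabling
// version-specific behaviour. Not from the SPARK-24417 work itself.
public class JvmCheck {
    // JDK 8 reports "1.8"; JDK 9+ report "9", "10", "11", ...
    static int parseMajor(String specVersion) {
        return specVersion.startsWith("1.")
                ? Integer.parseInt(specVersion.substring(2))
                : Integer.parseInt(specVersion);
    }

    public static void main(String[] args) {
        int major = parseMajor(System.getProperty("java.specification.version"));
        System.out.println("Running on Java " + major);
    }
}
```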

 



> Build and Run Spark on JDK11
> ----------------------------
>
>                 Key: SPARK-24417
>                 URL: https://issues.apache.org/jira/browse/SPARK-24417
>             Project: Spark
>          Issue Type: New Feature
>          Components: Build
>    Affects Versions: 2.3.0
>            Reporter: DB Tsai
>            Priority: Major
>
> This is an umbrella JIRA for Apache Spark to support JDK11.
> As JDK8 is reaching EOL, and JDK9 and 10 are already end of life, per community discussion, we will skip JDK9 and 10 to support JDK11 directly.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

