spark-user mailing list archives

From: <em...@yeikel.com>
Subject: What is the compatibility between releases?
Date: Wed, 12 Jun 2019 04:25:01 GMT
Dear Community,

 

From what I understand, Spark uses a variation of Semantic Versioning [1],
but this information is not enough for me to determine whether code written
against one version is compatible with another.

 

For example, if my cluster is running Spark 2.3.1, can I develop using API
additions in Spark 2.4 (higher-order functions, to give an example)? What
about the other way around?
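
To make the first case concrete, here is a minimal sketch of the kind of
Spark 2.4 addition I mean (assuming a Scala build against spark-sql 2.4; the
object and app names are made up). The transform higher-order function was
added to Spark SQL in 2.4, so my understanding is that this would run on a
2.4 cluster but fail to resolve on 2.3.1:

  import org.apache.spark.sql.SparkSession

  object HofCompatCheck {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder()
        .appName("hof-compat-check")
        .getOrCreate()
      import spark.implicits._

      val df = Seq(Seq(1, 2, 3), Seq(4, 5)).toDF("values")

      // transform(...) is a SQL higher-order function introduced in 2.4;
      // on a 2.3.1 cluster this expression fails analysis instead of running.
      df.selectExpr("transform(values, x -> x + 1) AS incremented").show()

      spark.stop()
    }
  }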

 

Typically, I assume that a job created in Spark 1.x will fail in Spark 2.x,
but that is also something I would like to have confirmed.
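
To illustrate the kind of 1.x-to-2.x break I have in mind, here is a
hypothetical 1.x-style job (the names are made up). As I understand it, the
org.apache.spark.Logging trait was public in 1.x but was made private in
Spark 2.0, so a job like this compiles against 1.x yet fails to compile (or
link, if shipped as a jar built against 1.x) on a 2.x cluster:

  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.Logging // public in 1.x, private since 2.0

  object LegacyJob extends Logging {
    def main(args: Array[String]): Unit = {
      val sc = new SparkContext(new SparkConf().setAppName("legacy-1x-job"))
      logInfo("counting") // provided by the 1.x Logging trait
      println(sc.parallelize(1 to 100).sum())
      sc.stop()
    }
  }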

 

Thank you for your help!

 

[1] https://spark.apache.org/versioning-policy.html 

