spark-issues mailing list archives

From "Adam Roberts (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (SPARK-19552) Upgrade Netty version to 4.1.8 final
Date Mon, 13 Feb 2017 11:37:42 GMT

     [ https://issues.apache.org/jira/browse/SPARK-19552?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Adam Roberts updated SPARK-19552:
---------------------------------
    Description: 
Netty 4.1.8 was recently released but isn't API compatible with previous major versions (like Netty 4.0.x); see http://netty.io/news/2017/01/30/4-0-44-Final-4-1-8-Final.html for details.

This version includes a fix for a security concern, though not one we'd be exposed to with Spark "out of the box". Let's upgrade the version we use to be on the safe side, as the security fix I'm especially interested in is not available in the 4.0.x release line.

We should move up anyway to take on a bunch of other bug fixes cited in the release notes (and if anyone were to use Spark with Netty and tcnative, they shouldn't be exposed to the security problem); we should be good citizens and make this change.

As this 4.1 version involves API changes, we'll need to implement a few methods and possibly adjust the SASL tests. I'd also like to know the purpose of the additional netty dependency (without "all" in the artifact name) in our pom, which is at version 3.9.9.

This JIRA and the associated pull request start the process, which I'll work on, and any help would be much appreciated! Here's what I know so far:

{code}
@Override
public void write(ChannelHandlerContext ctx, Object msg, ChannelPromise promise)
    throws Exception {
  if (!foundEncryptionHandler) {
    // this returns false and causes test failures
    foundEncryptionHandler =
      ctx.channel().pipeline().get(encryptHandlerName) != null;
  }
  ctx.write(msg, promise);
}
{code}
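
To illustrate the failing check in isolation: ChannelPipeline.get(name) only returns a handler that was registered under exactly that name, so if the encryption handler is added under a different name (or not at all), the lookup yields null. A minimal, self-contained sketch using Netty's EmbeddedChannel (the handler name here is made up for illustration):

{code}
import io.netty.channel.ChannelDuplexHandler;
import io.netty.channel.embedded.EmbeddedChannel;

public class PipelineLookupDemo {
  public static void main(String[] args) {
    EmbeddedChannel channel = new EmbeddedChannel();
    // register a no-op handler under an illustrative name
    channel.pipeline().addLast("encryptHandler", new ChannelDuplexHandler());

    // get(name) returns the handler registered under exactly that name, else null
    System.out.println(channel.pipeline().get("encryptHandler") != null); // true
    System.out.println(channel.pipeline().get("someOtherName") != null);  // false

    channel.finish();
  }
}
{code}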


Here are the changes that will be required (at least):

{code}common/network-common/src/main/java/org/apache/spark/network/crypto/TransportCipher.java{code}
requires the touch, retain, and transferred methods (see the sketch after this list)

{code}common/network-common/src/main/java/org/apache/spark/network/sasl/SaslEncryption.java{code}
requires the above methods too

{code}common/network-common/src/test/java/org/apache/spark/network/protocol/MessageWithHeaderSuite.java{code}
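
For illustration, here's roughly the shape of those overrides on a FileRegion implementation (a sketch, not Spark's actual code; the class and field names are assumed). In Netty 4.1, FileRegion overrides the reference-counting methods to return FileRegion, and transferred() replaces the now-deprecated transfered(), so implementations need new overrides:

{code}
import io.netty.channel.FileRegion;
import io.netty.util.AbstractReferenceCounted;

// Sketch of a FileRegion wrapper updated for the Netty 4.1 API.
abstract class FileRegionSketch extends AbstractReferenceCounted implements FileRegion {

  private long transferred;  // bytes transferred so far (assumed field)

  @Override
  public long transferred() {  // added in 4.1; replaces the deprecated transfered()
    return transferred;
  }

  @Override
  public FileRegionSketch touch() {  // leak-diagnostics hook; nothing to record here
    return this;
  }

  @Override
  public FileRegionSketch touch(Object hint) {
    return this;
  }

  @Override
  public FileRegionSketch retain() {  // covariant return type required by FileRegion
    super.retain();
    return this;
  }

  @Override
  public FileRegionSketch retain(int increment) {
    super.retain(increment);
    return this;
  }
}
{code}

The real classes would still provide position(), count(), transferTo() and deallocate() as they do today; only the reference-counting surface changes.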

With "dummy" implementations so we can at least compile and test, we'll see five new test
failures to address.

These are
{code}
org.apache.spark.network.sasl.SparkSaslSuite.testFileRegionEncryption
org.apache.spark.network.sasl.SparkSaslSuite.testSaslEncryption
org.apache.spark.network.shuffle.ExternalShuffleSecuritySuite.testEncryption
org.apache.spark.rpc.netty.NettyRpcEnvSuite.send with SASL encryption
org.apache.spark.rpc.netty.NettyRpcEnvSuite.ask with SASL encryption
{code}

  was:
Netty 4.1.8 was recently released but isn't API compatible with previous major versions (like Netty 4.0.x); see http://netty.io/news/2017/01/30/4-0-44-Final-4-1-8-Final.html for details.

This version includes a fix for a security concern; I don't know if Spark can be used as an attack vector, so let's upgrade the version we use to be on the safe side. The security fix I'm especially interested in is not available in the 4.0.x release line.

As this 4.1 version involves API changes, we'll need to implement a few methods and possibly adjust the SASL tests. I'd also like to know the purpose of the additional netty dependency (without "all" in the artifact name) in our pom, which is at version 3.9.9.

This JIRA and the associated pull request start the process, which I'll work on, and any help would be much appreciated! Here's what I know so far:

{code}
@Override
public void write(ChannelHandlerContext ctx, Object msg, ChannelPromise promise)
    throws Exception {
  if (!foundEncryptionHandler) {
    // this returns false and causes test failures
    foundEncryptionHandler =
      ctx.channel().pipeline().get(encryptHandlerName) != null;
  }
  ctx.write(msg, promise);
}
{code}


Here are the changes that will be required (at least):

{code}common/network-common/src/main/java/org/apache/spark/network/crypto/TransportCipher.java{code}
requires the touch, retain, and transferred methods

{code}common/network-common/src/main/java/org/apache/spark/network/sasl/SaslEncryption.java{code}
requires the above methods too

{code}common/network-common/src/test/java/org/apache/spark/network/protocol/MessageWithHeaderSuite.java{code}

With "dummy" implementations so we can at least compile and test, we'll see five new test
failures to address.

These are
{code}
org.apache.spark.network.sasl.SparkSaslSuite.testFileRegionEncryption
org.apache.spark.network.sasl.SparkSaslSuite.testSaslEncryption
org.apache.spark.network.shuffle.ExternalShuffleSecuritySuite.testEncryption
org.apache.spark.rpc.netty.NettyRpcEnvSuite.send with SASL encryption
org.apache.spark.rpc.netty.NettyRpcEnvSuite.ask with SASL encryption
{code}


> Upgrade Netty version to 4.1.8 final
> ------------------------------------
>
>                 Key: SPARK-19552
>                 URL: https://issues.apache.org/jira/browse/SPARK-19552
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>    Affects Versions: 2.1.0
>            Reporter: Adam Roberts
>            Priority: Minor


