spark-issues mailing list archives

From "Sean Owen (JIRA)" <>
Subject [jira] [Commented] (SPARK-19552) Upgrade Netty version to 4.1.8 final
Date Fri, 10 Feb 2017 16:53:42 GMT


Sean Owen commented on SPARK-19552:

The question you should focus on before proceeding is what the implications of updating are
for users. Yes, it requires Spark changes, and that change in Netty 4 leaks into the user classpath
by default, I think. Are there behavior changes? We've had problems along these lines in the past.

Yes, the other JIRA answers the question about the existence of 3.9.x.

> Upgrade Netty version to 4.1.8 final
> ------------------------------------
>                 Key: SPARK-19552
>                 URL:
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>    Affects Versions: 2.1.0
>            Reporter: Adam Roberts
>            Priority: Minor
> Netty 4.1.8 was recently released but isn't API compatible with previous major versions
(like Netty 4.0.x).
> This version does include a fix for a security concern. I don't know if Spark can be used
as an attack vector, so let's upgrade the version we use to be on the safe side. The security
fix I'm especially interested in is not available in the 4.0.x release line.
> As this 4.1 version involves API changes, we'll need to implement a few methods and possibly
adjust the SASL tests. I'd also like to know the purpose of the additional netty dependency (without
"all" in the artifact name) in our pom that's at version 3.9.9.
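For context, the two Netty dependencies referred to above would look roughly like this in a
Maven pom. This is a hedged sketch: the coordinates are the usual Netty ones, but the exact
versions and placement in Spark's build files are illustrative, not taken from the tree.

```xml
<!-- Hedged sketch; versions and comments are illustrative, not verified against Spark's pom. -->
<dependency>
  <groupId>io.netty</groupId>
  <artifactId>netty-all</artifactId>  <!-- Netty 4.x, the main network-layer dependency -->
  <version>4.1.8.Final</version>
</dependency>
<dependency>
  <groupId>io.netty</groupId>
  <artifactId>netty</artifactId>      <!-- the legacy 3.9.x artifact being asked about -->
  <version>3.9.9.Final</version>
</dependency>
```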
> This JIRA and associated pull request starts the process which I'll work on - and any
help would be much appreciated! Currently I know:
> {code}
> @Override
> public void write(ChannelHandlerContext ctx, Object msg, ChannelPromise promise)
>     throws Exception {
>   if (!foundEncryptionHandler) {
>     foundEncryptionHandler =  != null; // <-- this returns false and causes test failures
>   }
>   ctx.write(msg, promise);
> }
> {code}
> Here's what changes will be required (at least):
> {code}common/network-common/src/main/java/org/apache/spark/network/crypto/{code} requires
touch, retain and transferred methods
> {code}common/network-common/src/main/java/org/apache/spark/network/sasl/{code} requires the
above methods too
> {code}common/network-common/src/test/java/org/apache/spark/network/protocol/{code}
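The touch, retain and transferred methods mentioned above come from Netty's ReferenceCounted
contract (in 4.1, ReferenceCounted grew a touch() method, and FileRegion's transfered() accessor
was renamed transferred()). Here is a self-contained sketch of what "dummy" implementations
amount to; the MiniReferenceCounted interface below is a simplified local stand-in for Netty's
real io.netty.util.ReferenceCounted so the sketch compiles without Netty on the classpath, and
all class and field names are illustrative, not Spark's.

```java
import java.util.concurrent.atomic.AtomicInteger;

public class RefCountSketch {
  // Simplified local stand-in for Netty 4.1's ReferenceCounted contract; not the real interface.
  interface MiniReferenceCounted {
    int refCnt();
    MiniReferenceCounted retain();
    MiniReferenceCounted retain(int increment);
    MiniReferenceCounted touch(Object hint);
    boolean release();
  }

  // Sketch of a FileRegion-like message class gaining the 4.1-era methods with minimal bodies.
  static class MiniMessage implements MiniReferenceCounted {
    private final AtomicInteger refCnt = new AtomicInteger(1);
    private long transferred;

    public int refCnt() { return refCnt.get(); }

    public MiniMessage retain() { return retain(1); }

    public MiniMessage retain(int increment) {
      refCnt.addAndGet(increment);
      return this;
    }

    // touch() exists for leak tracking; a dummy implementation just returns this.
    public MiniMessage touch(Object hint) { return this; }

    public boolean release() {
      return refCnt.decrementAndGet() == 0;  // true once the count reaches zero
    }

    // 4.1 renamed FileRegion.transfered() to transferred(); this is the new-style accessor.
    public long transferred() { return transferred; }
  }

  public static void main(String[] args) {
    MiniMessage msg = new MiniMessage();
    msg.retain(2).touch("write");
    System.out.println(msg.refCnt());   // prints 3
    msg.release();
    msg.release();
    System.out.println(msg.release()); // prints true: count reached zero
  }
}
```

Stubbing these out this way is enough to compile, which matches the plan above of getting to a
buildable state first and then chasing the resulting test failures.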
> With "dummy" implementations so we can at least compile and test, we'll see five new
test failures to address.
> These include:
> {code}
> org.apache.spark.rpc.netty.NettyRpcEnvSuite.send with SASL encryption
> org.apache.spark.rpc.netty.NettyRpcEnvSuite.ask with SASL encryption
> {code}

This message was sent by Atlassian JIRA
