spark-user mailing list archives

From "Kelly, Jonathan" <>
Subject Re: Spark v1.2.1 failing under BigTop build in External Flume Sink (due to missing Netty library)
Date Thu, 05 Mar 2015 21:34:44 GMT
That's probably a good thing to have, so I'll add it, but unfortunately it
did not help this issue.  It looks like the hadoop-2.4 profile only sets
these properties, which don't seem like they would affect anything related
to Netty:
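(For context: Hadoop version profiles in Spark's pom.xml of that era were generally just blocks of version-property overrides, nothing that touches dependency exclusions. A hypothetical sketch of the shape only -- the property names and versions here are illustrative, not the actual contents of Spark's pom.xml:

```xml
<!-- Illustrative only: a Hadoop version profile that just pins
     version properties, without adding or excluding dependencies. -->
<profile>
  <id>hadoop-2.4</id>
  <properties>
    <hadoop.version>2.4.0</hadoop.version>
    <protobuf.version>2.5.0</protobuf.version>
    <jets3t.version>0.9.0</jets3t.version>
  </properties>
</profile>
```

Since a profile like this only swaps version numbers, it would not by itself explain a missing Netty class at compile time.)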


Jonathan Kelly

On 3/5/15, 1:09 PM, "Patrick Wendell" <> wrote:

>You may need to add the -Phadoop-2.4 profile. When building or release
>packages for Hadoop 2.4 we use the following flags:
>-Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn
>- Patrick
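(Combining the flags Patrick lists into a single invocation would look roughly like the following; this is a sketch -- the extra -D properties and the choice of goal depend on the environment, and the Hadoop version shown is illustrative:

```shell
# Sketch: building Spark for Hadoop 2.4 with the profiles Patrick lists.
mvn -Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn \
    -Dhadoop.version=2.4.0 -DskipTests clean package
```
)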
>On Thu, Mar 5, 2015 at 12:47 PM, Kelly, Jonathan <> wrote:
>> I confirmed that this has nothing to do with BigTop by running the same
>> command directly in a fresh clone of the Spark package at the v1.2.1
>> tag.  I got the same exact error.
>> Jonathan Kelly
>> Elastic MapReduce - SDE
>> Port 99 (SEA35) 08.220.C2
>> From: <Kelly>, Jonathan Kelly <>
>> Date: Thursday, March 5, 2015 at 10:39 AM
>> To: "" <>
>> Subject: Spark v1.2.1 failing under BigTop build in External Flume Sink
>> to missing Netty library)
>> I'm running into an issue building Spark v1.2.1 (as well as the latest
>> branch-1.2 and v1.3.0-rc2 and the latest in branch-1.3) with BigTop (which
>> is not quite released yet).  The build fails in the External Flume
>> Sink subproject with the following error:
>> [INFO] Compiling 5 Scala sources and 3 Java sources to
>> [WARNING] Class not found -
>> continuing with a stub.
>> [ERROR] error while loading NettyServer, class file
>> is broken
>> (class java.lang.NullPointerException/null)
>> [WARNING] one warning found
>> [ERROR] one error found
>> It seems like what is happening is that the Netty library is missing at
>> build time, which happens because it is explicitly excluded in the
>> external/flume-sink/pom.xml (see
>> I attempted removing the exclusions and the explicit re-add for the test
>> scope on lines 77-88, and that allowed the build to succeed, though I don't
>> know if that will cause problems at runtime.  I don't have any experience
>> with the Flume Sink, so I don't really know how to test it.  (And, to be
>> clear, I'm not necessarily trying to get the Flume Sink to work -- I just
>> want the project to build successfully, though of course I'd still want the
>> Flume Sink to work for whoever does need it.)
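(For reference, the exclude-then-re-add-for-test pattern described above looks roughly like this in a Maven POM. This is a hedged sketch of the general pattern, not the exact contents of external/flume-sink/pom.xml; the artifact coordinates are illustrative:

```xml
<!-- Illustrative pattern: exclude Netty from a transitive dependency,
     then re-add it with test scope so tests still have it available. -->
<dependency>
  <groupId>org.apache.flume</groupId>
  <artifactId>flume-ng-sdk</artifactId>
  <exclusions>
    <exclusion>
      <groupId>io.netty</groupId>
      <artifactId>netty</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>io.netty</groupId>
  <artifactId>netty</artifactId>
  <scope>test</scope>
</dependency>
```

With this pattern, Netty is absent from the compile classpath, which is consistent with scalac failing when a compiled class references it.)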
>> Does anybody have any idea what's going on here?  Here is the command
>> BigTop is running to build Spark:
>> mvn -Pbigtop-dist -Pyarn -Phive -Phive-thriftserver -Pkinesis-asl
>> -Divy.home=/home/ec2-user/.ivy2 -Dsbt.ivy.home=/home/ec2-user/.ivy2
>> -Duser.home=/home/ec2-user
>> -Dreactor.repo=file:///home/ec2-user/.m2/repository
>> -Dhadoop.version=2.4.0-amzn-3-SNAPSHOT
>> -Dprotobuf.version=2.5.0 -Dscala.version=2.10.3
>> -DskipTests -DrecompileMode=all install
>> As I mentioned above, if I switch to the latest in branch-1.2, to
>> v1.3.0-rc2, or to the latest in branch-1.3, I get the same exact error.  I
>> was not getting the error with Spark v1.1.0, though there weren't any
>> changes to the external/flume-sink/pom.xml between v1.1.0 and v1.2.1.
>> ~ Jonathan Kelly
