spark-user mailing list archives

From Matei Zaharia <>
Subject Re: Building spark with native library support
Date Thu, 06 Mar 2014 17:44:15 GMT
Is it an error, or just a warning? In either case, you need to get those libraries from a build
of Hadoop for your platform. Then add their directory to the SPARK_LIBRARY_PATH environment
variable in conf/, or to -Djava.library.path if launching an application separately.
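For example, a minimal sketch of both options — the path below is a placeholder, and the conf/spark-env.sh filename assumes the standard Spark conf layout:

```shell
# Placeholder: point this at wherever your platform's Hadoop native
# libraries (libhadoop.so and friends) were actually built.
HADOOP_NATIVE=/path/to/hadoop/lib/native

# Option 1: set it cluster-wide in conf/spark-env.sh
export SPARK_LIBRARY_PATH="$HADOOP_NATIVE"

# Option 2: pass it per application on the JVM command line
# java -Djava.library.path="$HADOOP_NATIVE" ... YourApp
```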

These libraries just speed up some compression codecs, by the way, so it should be fine to run
without them too.


On Mar 6, 2014, at 9:04 AM, Alan Burlison <> wrote:

> Hi,
>
> I've successfully built 0.9.0-incubating on Solaris using sbt, following the instructions
> at and it seems to work OK. However, when I start it up I get an error about missing
> Hadoop native libraries. I can't find any mention of how to build the native components
> in the instructions. How is that done?
> Thanks,
> -- 
> Alan Burlison
> --
