Please ignore this error - I found the issue.

Thanks!

On Mon, Jan 20, 2014 at 3:14 PM, Manoj Samel <> wrote:

I deployed Spark 0.8.1 on a standalone cluster per

When I start a spark-shell, I get the following error:

I thought Mesos should not be required for a standalone cluster. Do I have to change any parameters in the build I used to create the Spark distribution for this cluster? I left everything at the defaults (and noticed that the default Hadoop version is 1.0.4, which is not my Hadoop version, but I am not using Hadoop here).

Creating SparkContext...
Failed to load native Mesos library from
java.lang.UnsatisfiedLinkError: no mesos in java.library.path
at java.lang.ClassLoader.loadLibrary(
at java.lang.Runtime.loadLibrary0(
at java.lang.System.loadLibrary(
at org.apache.mesos.MesosNativeLibrary.load(
at org.apache.mesos.MesosNativeLibrary.load(
at org.apache.spark.SparkContext.<init>(SparkContext.scala:260)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:862)
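[Editor's note, for archive readers hitting the same trace: the poster does not say what the actual fix was, but a common trigger is a MASTER value that Spark 0.8.x interprets as a Mesos master, since an unrecognized master string falls through to the Mesos code path that loads the native library. A standalone cluster is normally addressed with a spark:// URL; the host and port below are placeholders.]

```shell
# Sketch of the usual standalone launch (placeholder host/port):
# a spark:// master URL keeps SparkContext on the standalone code path,
# so libmesos is never loaded.
export MASTER=spark://master-host:7077
./spark-shell   # run from the root of the Spark distribution
```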