spark-user mailing list archives

From Gerard Maas <gerard.m...@gmail.com>
Subject Re: is Mesos falling out of favor?
Date Thu, 15 May 2014 20:14:00 GMT
Looking at your config, I think there's something wrong with your setup.
One of the key ideas of Mesos is that you are abstracted from where your
tasks actually execute. SPARK_EXECUTOR_URI tells Mesos where to find the
'framework' (in Mesos jargon) required to execute a job. (More precisely,
the Spark driver passes that URI to Mesos, and the slaves download the
executor package from there.)
Your config looks like you are running some mix of a standalone Spark
cluster and Mesos.
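To make that concrete, here's a minimal sketch of the driver-side setting
(the HDFS host, port, and tarball path are hypothetical placeholders):

```shell
# Sketch with hypothetical values: the driver reads this URI and hands it
# to Mesos; each slave then downloads and unpacks the tarball before
# launching Spark executor tasks.
export SPARK_EXECUTOR_URI="hdfs://namenode.example.com:8020/spark/spark-0.9.0.1-bin.tar.gz"
echo "$SPARK_EXECUTOR_URI"
```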

This is an example of a Spark job to run on Mesos:

Driver:

ADD_JARS=/.../job-jar-with-dependencies.jar SPARK_LOCAL_IP=<IP> java -cp
/.../spark-assembly.jar:/.../job-jar-with-dependencies.jar
-Dconfig.file=job-config.conf com.example.jobs.SparkJob

Config: job-config.conf contains this Mesos-related section (note that the
executor URI is constructed from this config):
# ------------------------------------------------------------
# Mesos configuration
# ------------------------------------------------------------
mesos {
    zookeeper = {zookeeper.ip}
    executorUri  =
"hdfs://"${hdfs.nameNode.host}":"${hdfs.nameNode.port}"/spark/spark-0.9.0.1-bin.tar.gz"
    master       {
        host = {mesos-ip}
        port = 5050
    }
}
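For illustration, this is roughly how the HOCON substitutions above resolve
into a single URI (the namenode host and port are hypothetical); the
resolved value is what ends up as the spark.executor.uri setting on the
driver:

```shell
# Hypothetical values standing in for ${hdfs.nameNode.host} and
# ${hdfs.nameNode.port} in the config above.
HDFS_HOST="namenode.example.com"
HDFS_PORT="8020"
EXECUTOR_URI="hdfs://${HDFS_HOST}:${HDFS_PORT}/spark/spark-0.9.0.1-bin.tar.gz"
# What the resolved setting would look like as a driver system property:
echo "-Dspark.executor.uri=${EXECUTOR_URI}"
```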

This can probably still be improved, as it's the result of some
trial-and-error, but it's working for us.

-greetz, Gerard



On Wed, May 7, 2014 at 7:43 PM, deric <barton.tomas@gmail.com> wrote:

> I'm running the 1.0.0 branch, and I've finally managed to make it work. I'm
> using a Debian package which is distributed to all slave nodes. So I've
> removed `SPARK_EXECUTOR_URI` and it works; spark-env.sh looks like this:
>
> export MESOS_NATIVE_LIBRARY="/usr/local/lib/libmesos.so"
> export SCALA_HOME="/usr"
> export SCALA_LIBRARY_PATH="/usr/share/java"
> export MASTER="mesos://zk://192.168.1.1:2181/mesos"
> export SPARK_HOME="/usr/share/spark"
> export SPARK_LOCAL_IP="192.168.1.2"
> export SPARK_PRINT_LAUNCH_COMMAND="1"
> export CLASSPATH=$CLASSPATH:$SPARK_HOME/lib/
>
> scripts for Debian package are here (I'll try to add some documentation):
> https://github.com/deric/spark-deb-packaging
>
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/is-Mesos-falling-out-of-favor-tp5444p5484.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
