spark-dev mailing list archives

From: Takeshi Yamamuro <>
Subject: Re: Using local-cluster mode for testing Spark-related projects
Date: Sun, 17 Apr 2016 12:37:23 GMT
Is it a bad idea to create a `SparkContext` in `local-cluster` mode yourself, like the sketch below?
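A rough sketch against the 1.5/1.6 API (the jar path below is only a placeholder for your own assembly output, and SPARK_HOME still has to point at a real distro):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// local-cluster[numWorkers,coresPerWorker,memoryPerWorkerMB] launches
// real worker JVMs, unlike plain local[*] mode.
val conf = new SparkConf()
  .setMaster("local-cluster[2,1,1024]")
  .setAppName("local-cluster-smoke-test")
  // Executors run in separate JVMs, so dependencies must be shipped
  // as jars; a directory of .class files will not be picked up.
  .setJars(Seq("target/scala-2.10/my-deps-assembly.jar")) // placeholder

val sc = new SparkContext(conf)
try {
  // Smoke test: this job actually runs on the two worker processes.
  assert(sc.parallelize(1 to 100, 4).sum() == 5050)
} finally {
  sc.stop()
}
```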

// maropu

On Sun, Apr 17, 2016 at 9:47 AM, Evan Chan <> wrote:

> Hey folks,
> I'd like to use local-cluster mode in my Spark-related projects to
> test Spark functionality in an automated way on a simulated local
> cluster. The idea is to test multi-process behavior in a much easier
> fashion than setting up a real cluster. However, getting this up and
> running in a separate project (I'm using Scala 2.10 and ScalaTest) is
> nontrivial. Does anyone have any suggestions for getting up and running?
> This is what I've observed so far (I'm testing against 1.5.1, but
> suspect this would apply equally to 1.6.x):
> - One needs to have a real Spark distro and point to it using SPARK_HOME
> - SPARK_SCALA_VERSION needs to be set
> - One needs to manually inject jar paths, otherwise dependencies go
> missing; for example, build an assembly jar of all your deps. Java
> class-directory hierarchies don't seem to work with setJars(...).
> How do Spark's internal scripts make it possible to run
> local-cluster mode and set up all the class paths correctly? And is
> it possible to mimic this setup for external Spark projects?
> thanks,
> Evan
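On the jar-injection point, one way to wire this into ScalaTest is a shared harness along the lines of the sketch below. This is only a sketch: the trait name and assembly-jar path are made up, and it assumes SPARK_HOME and SPARK_SCALA_VERSION are exported in the test environment:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, FunSuite}

// Hypothetical base trait; adjust the jar path to your own assembly.
trait LocalClusterSpec extends BeforeAndAfterAll { this: FunSuite =>

  @transient var sc: SparkContext = _

  override def beforeAll(): Unit = {
    super.beforeAll()
    // Fail fast with a clear message instead of an obscure launcher error.
    require(sys.env.contains("SPARK_HOME"),
      "local-cluster mode needs SPARK_HOME pointing at a real Spark distro")
    val conf = new SparkConf()
      .setMaster("local-cluster[2,1,1024]")
      .setAppName(suiteName)
      .setJars(Seq("target/scala-2.10/my-project-assembly.jar")) // placeholder
    sc = new SparkContext(conf)
  }

  override def afterAll(): Unit = {
    if (sc != null) sc.stop()
    super.afterAll()
  }
}

class LocalClusterSmokeSuite extends FunSuite with LocalClusterSpec {
  test("runs across separate worker JVMs") {
    assert(sc.parallelize(1 to 10).map(_ * 2).sum() == 110)
  }
}
```

Forking the test JVM (e.g. `fork in Test := true` in sbt) also tends to keep the launcher environment cleaner.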

Takeshi Yamamuro
