spark-dev mailing list archives

From Jon Maurer <tri...@gmail.com>
Subject Re: Using local-cluster mode for testing Spark-related projects
Date Sun, 17 Apr 2016 16:51:01 GMT
Take a look at spark-testing-base:
https://github.com/holdenk/spark-testing-base/blob/master/README.md
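
A minimal sketch in the style of its README (assuming ScalaTest's
FunSuite; the suite name and data here are made up):

  import com.holdenkarau.spark.testing.SharedSparkContext
  import org.scalatest.FunSuite

  class WordCountSpec extends FunSuite with SharedSparkContext {
    test("counts words on the shared local SparkContext") {
      // `sc` is created and torn down for you by SharedSparkContext
      val counts = sc.parallelize(Seq("a", "b", "a")).countByValue()
      assert(counts === Map("a" -> 2L, "b" -> 1L))
    }
  }

(I believe it defaults to plain local mode rather than local-cluster,
though.)
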
On Apr 17, 2016 10:28 AM, "Evan Chan" <velvia.github@gmail.com> wrote:

> What I want to find out is how to run tests against local-cluster, just
> like that suite does, but in your own projects. Has anyone done this?
>
> On Sun, Apr 17, 2016 at 5:37 AM, Takeshi Yamamuro <linguin.m.s@gmail.com>
> wrote:
> > Hi,
> > Would it be a bad idea to create a `SparkContext` with a `local-cluster`
> > master yourself, as in
> > https://github.com/apache/spark/blob/master/core/src/test/scala/org/apache/spark/ShuffleSuite.scala#L55 ?
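> >
> > A minimal sketch of what I mean (assuming Spark 1.5/1.6; the app name
> > and the assembly jar path are placeholders):
> >
> >   import org.apache.spark.{SparkConf, SparkContext}
> >
> >   val conf = new SparkConf()
> >     // 2 workers, 1 core and 1024 MB of memory per worker
> >     .setMaster("local-cluster[2,1,1024]")
> >     .setAppName("local-cluster-example")
> >     // ship your test classes to the simulated executors
> >     .setJars(Seq("target/scala-2.10/my-deps-assembly.jar"))
> >   val sc = new SparkContext(conf)
> >
> > (As you observed, this still expects SPARK_HOME to point at a real
> > Spark distribution so the worker processes can be launched.)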
> >
> > // maropu
> >
> > On Sun, Apr 17, 2016 at 9:47 AM, Evan Chan <velvia.github@gmail.com>
> wrote:
> >>
> >> Hey folks,
> >>
> >> I'd like to use local-cluster mode in my Spark-related projects to
> >> test Spark functionality in an automated way against a simulated
> >> local cluster. The idea is to test multi-process behavior much more
> >> easily than by setting up a real cluster. However, getting this up
> >> and running in a separate project (I'm using Scala 2.10 and
> >> ScalaTest) is nontrivial. Does anyone have suggestions for getting
> >> started?
> >>
> >> This is what I've observed so far (I'm testing against 1.5.1, but
> >> suspect this would apply equally to 1.6.x):
> >>
> >> - One needs a real Spark distro and must point to it via SPARK_HOME.
> >> - SPARK_SCALA_VERSION needs to be set.
> >> - One needs to manually inject jar paths, otherwise dependencies are
> >> missing; for example, build an assembly jar of all your deps. Java
> >> class directory hierarchies don't seem to work with setJars(...).
> >> (See the sketch below.)
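> >>
> >> Concretely, my attempted setup looks roughly like this (the suite
> >> name, paths, and assembly jar name are placeholders):
> >>
> >>   import org.apache.spark.{SparkConf, SparkContext}
> >>   import org.scalatest.{BeforeAndAfterAll, FunSuite}
> >>
> >>   // Requires, exported in the environment before running the tests:
> >>   //   export SPARK_HOME=/path/to/spark-1.5.1-bin-hadoop2.6
> >>   //   export SPARK_SCALA_VERSION=2.10
> >>   class MultiProcessSuite extends FunSuite with BeforeAndAfterAll {
> >>     var sc: SparkContext = _
> >>
> >>     override def beforeAll() {
> >>       val conf = new SparkConf()
> >>         .setMaster("local-cluster[2,1,512]")
> >>         .setAppName("my-project-tests")
> >>         // assembly jar of all deps; plain class directories don't
> >>         // seem to get shipped to the workers
> >>         .setJars(Seq("target/scala-2.10/my-project-assembly.jar"))
> >>       sc = new SparkContext(conf)
> >>     }
> >>
> >>     override def afterAll() { sc.stop() }
> >>
> >>     test("runs across multiple worker processes") {
> >>       assert(sc.parallelize(1 to 100, 4).sum() === 5050.0)
> >>     }
> >>   }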
> >>
> >> How do Spark's internal scripts make it possible to run
> >> local-cluster mode and set up all the classpaths correctly? And is
> >> it possible to mimic this setup for external Spark projects?
> >>
> >> thanks,
> >> Evan
> >>
> >
> > --
> > ---
> > Takeshi Yamamuro
>
>
