spark-user mailing list archives

From Matthew Cornell <matthewcorn...@gmail.com>
Subject getting started writing unit tests for my app
Date Tue, 20 Jan 2015 14:26:57 GMT
Hi Folks,

I'm writing a GraphX app and I need to do some test-driven development.
I've got Spark running on our little cluster and have built and run some
hello world apps, so that's all good.

I've looked through the test sources and found lots of helpful examples that
use SharedSparkContext and, even more often, LocalSparkContext, but those
classes aren't available to my app. Its library is set up against the
lib/spark-assembly-1.2.0-hadoop2.0.0-mr1-cdh4.2.0.jar that came with the
prebuilt version I downloaded. (I don't plan on using Spark with our CDH4
Hadoop, but I picked that download anyway.)
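
One thing I've considered as a fallback is just copying a minimal version of
the trait into my own test sources rather than depending on Spark's test jar.
From reading the source, I imagine it would look roughly like this -- an
untested sketch on my part, assuming ScalaTest is on my test classpath:

```scala
import org.apache.spark.SparkContext
import org.scalatest.{BeforeAndAfterEach, Suite}

// Rough stand-in for Spark's LocalSparkContext test trait: each test
// creates its own local SparkContext in `sc`, and this trait stops it
// after the test so contexts don't leak between tests.
trait LocalSparkContext extends BeforeAndAfterEach { self: Suite =>
  @transient var sc: SparkContext = _

  override def afterEach() {
    if (sc != null) {
      sc.stop()
      sc = null
    }
    // Clear the driver port so a later context doesn't try to reuse it.
    System.clearProperty("spark.driver.port")
    super.afterEach()
  }
}
```

But I'd rather use the real classes if there's a sane way to get at them.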

So I've downloaded the source code and found LocalSparkContext (two, actually:
one each in core/ and graphx/), but now I'm stuck figuring out how to build a
jar (say, lib/1.2.0-hadoop2.0.0-mr1-cdh4.2.0-TESTS.jar) that I can include in
my tests' library. I'm not very familiar with Maven or SBT, so doing the build
is frankly daunting. I could really use your advice here!

Thank you,

matt
