spark-user mailing list archives

From jay vyas <jayunit100.apa...@gmail.com>
Subject Re: Unit Testing
Date Thu, 13 Aug 2015 12:51:40 GMT
Yes, there certainly is, as long as Eclipse has the right plugins to run Scala programs.  You're really asking two questions: (1) can I use a
modern IDE to develop Spark apps, and (2) can we easily unit-test Spark
Streaming apps?

the answer is yes to both...

Regarding your IDE:

I like to use IntelliJ with the Scala plugin for Scala development.  It
lets you run everything from inside the IDE.  I've written up setup
instructions here:
http://jayunit100.blogspot.com/2014/07/set-up-spark-application-devleopment.html

Now, regarding local unit testing:

As an example, here is a unit test for confirming that Spark can write to
Cassandra.

https://github.com/jayunit100/SparkStreamingApps/blob/master/src/test/scala/TestTwitterCassandraETL.scala

The key here is to just set a local master on the SparkConf in the unit test, like so

val conf = new SparkConf().setMaster("local[2]")

local[2] guarantees two threads, so you'll have both a producer (receiver) and a
consumer (processor) and you don't get a starvation scenario.
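To make that concrete, here's a minimal sketch of a self-contained local-master streaming test. It's not the test from the repo linked above; the object and value names are illustrative, and it assumes spark-core and spark-streaming are on the test classpath. It uses queueStream to feed in-memory data instead of a live source like Twitter or Kafka:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import scala.collection.mutable

object LocalStreamingSketch {
  // Collected results, so a test harness can assert on them after the run.
  val results = mutable.Map.empty[String, Long]

  def main(args: Array[String]): Unit = {
    // local[2]: one thread for the receiver, one for processing,
    // which avoids the starvation scenario described above.
    val conf = new SparkConf()
      .setMaster("local[2]")
      .setAppName("local-streaming-test")
    val ssc = new StreamingContext(conf, Seconds(1))

    // Feed an in-memory queue of RDDs instead of a live source.
    val queue = mutable.Queue(ssc.sparkContext.parallelize(Seq("a", "b", "a")))
    val counts = ssc.queueStream(queue).countByValue()
    counts.foreachRDD { rdd =>
      rdd.collect().foreach { case (word, n) => results(word) = n }
    }

    ssc.start()
    // Let a few batches run, then shut down cleanly.
    ssc.awaitTerminationOrTimeout(5000)
    ssc.stop()
  }
}
```

The same pattern works from sbt, IntelliJ, or Eclipse, since nothing here needs a running cluster.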


On Wed, Aug 12, 2015 at 7:31 PM, Mohit Anchlia <mohitanchlia@gmail.com>
wrote:

> Is there a way to run spark streaming methods in standalone eclipse
> environment to test out the functionality?
>



-- 
jay vyas
