spark-user mailing list archives

From Shay Seng <s...@1618labs.com>
Subject Spark unit test question
Date Mon, 21 Oct 2013 17:30:46 GMT
I'm trying to write a unit test to ensure that some functions I rely on
will always serialize and run correctly on a cluster.
In one of these functions I've deliberately added a "val x: Int = 1", which
should prevent this method from being serializable, right?

In the test I've done:
   sc = new SparkContext("local[2]", "test")
   ...
   val pdata = sc.parallelize(data)
   val c = pdata.map(myFunc).collect()   // myFunc: stand-in name for the function under test

The unit tests still complete with no errors; I'm guessing that's because with
local[2] Spark doesn't need to serialize the closures? Is there some way I can
force Spark to run as it would on a real cluster?
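For reference, the kind of check I have in mind is something like the sketch
below. assertSerializable is just a helper name I'm making up here, and it
assumes Spark ships closures with plain Java serialization:

   import java.io.{ByteArrayOutputStream, ObjectOutputStream}

   // Attempt to Java-serialize a closure, roughly what would happen when a
   // task is shipped to an executor. Throws NotSerializableException if the
   // closure (or anything it captures) cannot be serialized.
   def assertSerializable(closure: AnyRef): Unit = {
     val out = new ObjectOutputStream(new ByteArrayOutputStream())
     try out.writeObject(closure) finally out.close()
   }

   // e.g. assertSerializable(myFunc) before calling pdata.map(myFunc)

I've also wondered whether a local-cluster[2,1,512]-style master URL would
force real serialization, but I don't know if that's meant to be used outside
of Spark's own test suites.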


tks
shay
