spark-user mailing list archives

From Adrian Mocanu <amoc...@verticalscope.com>
Subject RE: tests that run locally fail when run through bamboo
Date Wed, 21 May 2014 21:31:55 GMT
Just found this at the top of the log:

17:14:41.124 [pool-7-thread-3-ScalaTest-running-StreamingSpikeSpec] WARN  o.e.j.u.component.AbstractLifeCycle - FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use

Is there a way to set these connections up so that they don't all start on the same port? That's
my guess for the root cause of the issue.
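
A minimal sketch of what I have in mind, assuming each spec builds its own SparkConf/StreamingContext and that the test harness actually uses that conf (both assumptions on my part). spark.ui.port is a standard Spark setting; setting it to 0 should let Jetty pick a free ephemeral port instead of 4040, though this may vary by Spark version:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Build the conf the streaming test runs against; spark.ui.port = 0 asks
// Jetty for an ephemeral port so concurrent builds don't collide on 4040.
val conf = new SparkConf()
  .setMaster("local[2]")
  .setAppName("StreamingZigZagSpec")
  .set("spark.ui.port", "0")

val ssc = new StreamingContext(conf, Seconds(1))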

From: Adrian Mocanu [mailto:amocanu@verticalscope.com]
Sent: May-21-14 4:58 PM
To: user@spark.incubator.apache.org; user@spark.apache.org
Subject: tests that run locally fail when run through bamboo

I have a few test cases for Spark that extend TestSuiteBase from org.apache.spark.streaming.
The tests run fine on my machine, but when I commit to the repo and the tests run automatically
through Bamboo, they fail with the errors below.

How can I fix this?
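
To give an idea of the setup, here is roughly what one of these specs looks like. This is a simplified, hypothetical example: the zigzag logic is replaced by a trivial map, and it assumes the testOperation helper from Spark's own streaming test harness (the exact TestSuiteBase API may differ between Spark versions).

import org.apache.spark.streaming.TestSuiteBase
import org.apache.spark.streaming.dstream.DStream

class StreamingSpecSketch extends TestSuiteBase {

  test("compute a simple per-batch transformation") {
    // Each inner Seq is one input batch fed to the DStream under test.
    val input    = Seq(Seq(1, 2), Seq(3), Seq())
    val expected = Seq(Seq(2, 4), Seq(6), Seq())

    // Stand-in for the real zigzag indicator computation.
    val doubleIt = (stream: DStream[Int]) => stream.map(_ * 2)

    // testOperation runs the batches through a local StreamingContext and
    // compares the collected output with `expected`.
    testOperation(input, doubleIt, expected)
  }
}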


21-May-2014 16:33:09

[info] StreamingZigZagSpec:
[info] - compute zigzag indicator in stream *** FAILED ***
[info]   org.apache.spark.SparkException: Job aborted: Task 1.0:1 failed 1 times (most recent failure: Exception failure: java.io.StreamCorruptedException: invalid type code: AC)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1020)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1018)
[info]   at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
[info]   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
[info]   at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$abortStage(DAGScheduler.scala:1018)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:604)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:604)
[info]   at scala.Option.foreach(Option.scala:236)
[info]   at org.apache.spark.scheduler.DAGScheduler.processEvent(DAGScheduler.scala:604)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$start$1$$anon$2$$anonfun$receive$1.applyOrElse(DAGScheduler.scala:190)
[info]   ...
[info] - compute zigzag indicator in stream with intermittent empty RDDs *** FAILED ***
[info]   Operation timed out after 10042 ms (TestSuiteBase.scala:283)
[info] - compute zigzag indicator in stream with 3 empty RDDs *** FAILED ***
[info]   org.apache.spark.SparkException: Job aborted: Task 1.0:1 failed 1 times (most recent failure: Exception failure: java.io.FileNotFoundException: /tmp/spark-local-20140521163241-1707/0f/shuffle_1_1_1 (No such file or directory))
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1020)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1018)
[info]   at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
[info]   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
[info]   at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$abortStage(DAGScheduler.scala:1018)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:604)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:604)
[info]   at scala.Option.foreach(Option.scala:236)
[info]   at org.apache.spark.scheduler.DAGScheduler.processEvent(DAGScheduler.scala:604)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$start$1$$anon$2$$anonfun$receive$1.applyOrElse(DAGScheduler.scala:190)
[info]   ...
[info] - compute zigzag indicator in stream w notification for each change  *** FAILED ***
[info]   org.apache.spark.SparkException: Job aborted: Task 141.0:0 failed 1 times (most recent failure: Exception failure: java.io.FileNotFoundException: http://10.10.1.9:62793/broadcast_1)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1020)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1018)
[info]   at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
[info]   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
[info]   at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$abortStage(DAGScheduler.scala:1018)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:604)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:604)
[info]   at scala.Option.foreach(Option.scala:236)
[info]   at org.apache.spark.scheduler.DAGScheduler.processEvent(DAGScheduler.scala:604)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$start$1$$anon$2$$anonfun$receive$1.applyOrElse(DAGScheduler.scala:190)
[info]   ...
[info] - compute zigzag in stream where there is 1 key/RDDs but multiple key types exist
[info] - compute zigzag in stream where RDDs have more than 1 key
16:33:09.819 [spark-akka.actor.default-dispatcher-15] INFO  Remoting - Remoting shut down
16:33:09.819 [spark-akka.actor.default-dispatcher-4] INFO  a.r.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[info] Run completed in 52 seconds, 792 milliseconds.
[info] Total number of tests run: 36
[info] Suites: completed 8, aborted 0
[info] Tests: succeeded 25, failed 11, canceled 0, ignored 0, pending 0




-Adrian

