beam-commits mailing list archives

From Apache Jenkins Server <jenk...@builds.apache.org>
Subject Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #1560
Date Fri, 21 Sep 2018 21:43:33 GMT
See <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/1560/display/redirect?page=changes>

Changes:

[robbe.sneyders] Fix pipeline_test

[ankurgoenka] Add Portable PostCommit test status to PR template

[robbe.sneyders] Fix hamcrest docs dependency

------------------------------------------
[...truncated 30.16 MB...]
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4
missing tasks from ResultStage 547 (MapPartitionsRDD[2773] at map at TranslationUtils.java:129)
(first 15 tasks are for partitions Vector(0, 1, 2, 3))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding
task set 547.0 with 4 tasks
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task
0.0 in stage 547.0 (TID 463, localhost, executor driver, partition 0, PROCESS_LOCAL, 8308
bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task
1.0 in stage 547.0 (TID 464, localhost, executor driver, partition 1, PROCESS_LOCAL, 8308
bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task
2.0 in stage 547.0 (TID 465, localhost, executor driver, partition 2, PROCESS_LOCAL, 8308
bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - Starting task
3.0 in stage 547.0 (TID 466, localhost, executor driver, partition 3, PROCESS_LOCAL, 8308
bytes)
    [Executor task launch worker for task 463] INFO org.apache.spark.executor.Executor - Running
task 0.0 in stage 547.0 (TID 463)
    [Executor task launch worker for task 464] INFO org.apache.spark.executor.Executor - Running
task 1.0 in stage 547.0 (TID 464)
    [Executor task launch worker for task 466] INFO org.apache.spark.executor.Executor - Running
task 3.0 in stage 547.0 (TID 466)
    [Executor task launch worker for task 465] INFO org.apache.spark.executor.Executor - Running
task 2.0 in stage 547.0 (TID 465)
    [Executor task launch worker for task 464] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 463] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 464] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 463] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 465] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 465] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 465] INFO org.apache.spark.storage.BlockManager
- Found block rdd_2441_2 locally
    [Executor task launch worker for task 464] INFO org.apache.spark.storage.BlockManager
- Found block rdd_2441_1 locally
    [Executor task launch worker for task 463] INFO org.apache.spark.storage.BlockManager
- Found block rdd_2441_0 locally
    [Executor task launch worker for task 464] INFO org.apache.spark.storage.memory.MemoryStore
- Block rdd_2756_1 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 463] INFO org.apache.spark.storage.memory.MemoryStore
- Block rdd_2756_0 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 465] INFO org.apache.spark.storage.memory.MemoryStore
- Block rdd_2756_2 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2756_0
in memory on localhost:37925 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2756_1
in memory on localhost:37925 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2756_2
in memory on localhost:37925 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 465] INFO org.apache.spark.executor.Executor - Finished
task 2.0 in stage 547.0 (TID 465). 59881 bytes result sent to driver
    [Executor task launch worker for task 464] INFO org.apache.spark.executor.Executor - Finished
task 1.0 in stage 547.0 (TID 464). 59881 bytes result sent to driver
    [Executor task launch worker for task 463] INFO org.apache.spark.executor.Executor - Finished
task 0.0 in stage 547.0 (TID 463). 59881 bytes result sent to driver
    [Executor task launch worker for task 466] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 466] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 466] INFO org.apache.spark.storage.BlockManager
- Found block rdd_2441_3 locally
    [Executor task launch worker for task 466] INFO org.apache.spark.storage.memory.MemoryStore
- Block rdd_2756_3 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2756_3
in memory on localhost:37925 (size: 4.0 B, free: 13.5 GB)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task
2.0 in stage 547.0 (TID 465) in 18 ms on localhost (executor driver) (1/4)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task
1.0 in stage 547.0 (TID 464) in 18 ms on localhost (executor driver) (2/4)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task
0.0 in stage 547.0 (TID 463) in 19 ms on localhost (executor driver) (3/4)
    [Executor task launch worker for task 466] INFO org.apache.spark.executor.Executor - Finished
task 3.0 in stage 547.0 (TID 466). 59881 bytes result sent to driver
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task
3.0 in stage 547.0 (TID 466) in 25 ms on localhost (executor driver) (4/4)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet
547.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ResultStage
547 (foreach at UnboundedDataset.java:80) finished in 0.038 s
    [streaming-job-executor-0] INFO org.apache.spark.scheduler.DAGScheduler - Job 34 finished:
foreach at UnboundedDataset.java:80, took 0.109978 s
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Finished job streaming
job 1537565649500 ms.2 from job set of time 1537565649500 ms
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Starting job streaming
job 1537565649500 ms.3 from job set of time 1537565649500 ms
    [streaming-job-executor-0] INFO org.apache.spark.SparkContext - Starting job: foreach
at UnboundedDataset.java:80
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering
RDD 2787 (mapToPair at GroupCombineFunctions.java:54)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering
RDD 2815 (mapToPair at GroupCombineFunctions.java:54)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Got job 35 (foreach
at UnboundedDataset.java:80) with 4 output partitions
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Final stage:
ResultStage 575 (foreach at UnboundedDataset.java:80)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Parents of final
stage: List(ShuffleMapStage 574, ShuffleMapStage 571, ShuffleMapStage 572, ShuffleMapStage
568, ShuffleMapStage 562, ShuffleMapStage 569, ShuffleMapStage 573, ShuffleMapStage 570, ShuffleMapStage
567)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Missing parents:
List(ShuffleMapStage 567)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage
565 (MapPartitionsRDD[2787] at mapToPair at GroupCombineFunctions.java:54), which has no missing
parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_108
stored as values in memory (estimated size 161.2 KB, free 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_108_piece0
stored as bytes in memory (estimated size 35.2 KB, free 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_108_piece0
in memory on localhost:37925 (size: 35.2 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 108
from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4
missing tasks from ShuffleMapStage 565 (MapPartitionsRDD[2787] at mapToPair at GroupCombineFunctions.java:54)
(first 15 tasks are for partitions Vector(0, 1, 2, 3))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding
task set 565.0 with 4 tasks
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task
0.0 in stage 565.0 (TID 467, localhost, executor driver, partition 0, PROCESS_LOCAL, 8297
bytes)
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task
1.0 in stage 565.0 (TID 468, localhost, executor driver, partition 1, PROCESS_LOCAL, 8297
bytes)
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task
2.0 in stage 565.0 (TID 469, localhost, executor driver, partition 2, PROCESS_LOCAL, 8297
bytes)
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task
3.0 in stage 565.0 (TID 470, localhost, executor driver, partition 3, PROCESS_LOCAL, 8297
bytes)
    [Executor task launch worker for task 467] INFO org.apache.spark.executor.Executor - Running
task 0.0 in stage 565.0 (TID 467)
    [Executor task launch worker for task 468] INFO org.apache.spark.executor.Executor - Running
task 1.0 in stage 565.0 (TID 468)
    [Executor task launch worker for task 469] INFO org.apache.spark.executor.Executor - Running
task 2.0 in stage 565.0 (TID 469)
    [Executor task launch worker for task 470] INFO org.apache.spark.executor.Executor - Running
task 3.0 in stage 565.0 (TID 470)
    [Executor task launch worker for task 468] INFO org.apache.spark.storage.BlockManager
- Found block rdd_2559_1 locally
    [Executor task launch worker for task 470] INFO org.apache.spark.storage.BlockManager
- Found block rdd_2559_3 locally
    [Executor task launch worker for task 469] INFO org.apache.spark.storage.BlockManager
- Found block rdd_2559_2 locally
    [Executor task launch worker for task 467] INFO org.apache.spark.storage.BlockManager
- Found block rdd_2559_0 locally
    [Executor task launch worker for task 468] INFO org.apache.spark.executor.Executor - Finished
task 1.0 in stage 565.0 (TID 468). 59509 bytes result sent to driver
    [Executor task launch worker for task 470] INFO org.apache.spark.executor.Executor - Finished
task 3.0 in stage 565.0 (TID 470). 59509 bytes result sent to driver
    [Executor task launch worker for task 469] INFO org.apache.spark.executor.Executor - Finished
task 2.0 in stage 565.0 (TID 469). 59509 bytes result sent to driver
    [Executor task launch worker for task 467] INFO org.apache.spark.executor.Executor - Finished
task 0.0 in stage 565.0 (TID 467). 59509 bytes result sent to driver
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task
1.0 in stage 565.0 (TID 468) in 14 ms on localhost (executor driver) (1/4)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task
3.0 in stage 565.0 (TID 470) in 14 ms on localhost (executor driver) (2/4)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task
2.0 in stage 565.0 (TID 469) in 16 ms on localhost (executor driver) (3/4)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task
0.0 in stage 565.0 (TID 467) in 17 ms on localhost (executor driver) (4/4)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet
565.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage
565 (mapToPair at GroupCombineFunctions.java:54) finished in 0.026 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for
newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ResultStage
575, ShuffleMapStage 567)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage
567 (MapPartitionsRDD[2815] at mapToPair at GroupCombineFunctions.java:54), which has no missing
parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_109
stored as values in memory (estimated size 198.7 KB, free 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_109_piece0
stored as bytes in memory (estimated size 45.0 KB, free 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_109_piece0
in memory on localhost:37925 (size: 45.0 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 109
from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 5
missing tasks from ShuffleMapStage 567 (MapPartitionsRDD[2815] at mapToPair at GroupCombineFunctions.java:54)
(first 15 tasks are for partitions Vector(0, 1, 2, 3, 4))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding
task set 567.0 with 5 tasks
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task
0.0 in stage 567.0 (TID 471, localhost, executor driver, partition 0, PROCESS_LOCAL, 8436
bytes)
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task
1.0 in stage 567.0 (TID 472, localhost, executor driver, partition 1, PROCESS_LOCAL, 8436
bytes)
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task
2.0 in stage 567.0 (TID 473, localhost, executor driver, partition 2, PROCESS_LOCAL, 8436
bytes)
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task
3.0 in stage 567.0 (TID 474, localhost, executor driver, partition 3, PROCESS_LOCAL, 8436
bytes)
    [Executor task launch worker for task 474] INFO org.apache.spark.executor.Executor - Running
task 3.0 in stage 567.0 (TID 474)
    [Executor task launch worker for task 471] INFO org.apache.spark.executor.Executor - Running
task 0.0 in stage 567.0 (TID 471)
    [Executor task launch worker for task 472] INFO org.apache.spark.executor.Executor - Running
task 1.0 in stage 567.0 (TID 472)
    [Executor task launch worker for task 473] INFO org.apache.spark.executor.Executor - Running
task 2.0 in stage 567.0 (TID 473)
    [Executor task launch worker for task 473] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 471] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 473] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 471] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 471] INFO org.apache.spark.storage.BlockManager
- Found block rdd_2484_0 locally
    [Executor task launch worker for task 473] INFO org.apache.spark.storage.BlockManager
- Found block rdd_2484_2 locally
    [Executor task launch worker for task 472] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 474] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Getting 0 non-empty blocks out of 4 blocks
    [Executor task launch worker for task 474] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 472] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 474] INFO org.apache.spark.storage.BlockManager
- Found block rdd_2484_3 locally
    [Executor task launch worker for task 473] INFO org.apache.spark.storage.memory.MemoryStore
- Block rdd_2799_2 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 471] INFO org.apache.spark.storage.memory.MemoryStore
- Block rdd_2799_0 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 472] INFO org.apache.spark.storage.BlockManager
- Found block rdd_2484_1 locally
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_2
in memory on localhost:37925 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_0
in memory on localhost:37925 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 474] INFO org.apache.spark.storage.memory.MemoryStore
- Block rdd_2799_3 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_3
in memory on localhost:37925 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 472] INFO org.apache.spark.storage.memory.MemoryStore
- Block rdd_2799_1 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_1
in memory on localhost:37925 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 473] INFO org.apache.spark.executor.Executor - Finished
task 2.0 in stage 567.0 (TID 473). 59939 bytes result sent to driver
    [Executor task launch worker for task 474] INFO org.apache.spark.executor.Executor - Finished
task 3.0 in stage 567.0 (TID 474). 59939 bytes result sent to driver
    [Executor task launch worker for task 471] INFO org.apache.spark.executor.Executor - Finished
task 0.0 in stage 567.0 (TID 471). 59939 bytes result sent to driver
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task
4.0 in stage 567.0 (TID 475, localhost, executor driver, partition 4, PROCESS_LOCAL, 7968
bytes)
    [Executor task launch worker for task 472] INFO org.apache.spark.executor.Executor - Finished
task 1.0 in stage 567.0 (TID 472). 59939 bytes result sent to driver
    [Executor task launch worker for task 475] INFO org.apache.spark.executor.Executor - Running
task 4.0 in stage 567.0 (TID 475)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task
1.0 in stage 567.0 (TID 472) in 18 ms on localhost (executor driver) (1/5)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task
3.0 in stage 567.0 (TID 474) in 17 ms on localhost (executor driver) (2/5)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task
0.0 in stage 567.0 (TID 471) in 18 ms on localhost (executor driver) (3/5)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task
2.0 in stage 567.0 (TID 473) in 19 ms on localhost (executor driver) (4/5)
    [Executor task launch worker for task 475] INFO org.apache.spark.executor.Executor - Finished
task 4.0 in stage 567.0 (TID 475). 59466 bytes result sent to driver
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task
4.0 in stage 567.0 (TID 475) in 15 ms on localhost (executor driver) (5/5)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet
567.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage
567 (mapToPair at GroupCombineFunctions.java:54) finished in 0.040 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for
newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ResultStage
575)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ResultStage
575 (MapPartitionsRDD[2844] at map at TranslationUtils.java:129), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_110
stored as values in memory (estimated size 236.2 KB, free 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_110_piece0
stored as bytes in memory (estimated size 56.3 KB, free 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_110_piece0
in memory on localhost:37925 (size: 56.3 KB, free: 13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 110
from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4
missing tasks from ResultStage 575 (MapPartitionsRDD[2844] at map at TranslationUtils.java:129)
(first 15 tasks are for partitions Vector(0, 1, 2, 3))
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding
task set 575.0 with 4 tasks
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task
0.0 in stage 575.0 (TID 476, localhost, executor driver, partition 0, PROCESS_LOCAL, 8308
bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task
1.0 in stage 575.0 (TID 477, localhost, executor driver, partition 1, PROCESS_LOCAL, 8308
bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task
2.0 in stage 575.0 (TID 478, localhost, executor driver, partition 2, PROCESS_LOCAL, 8308
bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task
3.0 in stage 575.0 (TID 479, localhost, executor driver, partition 3, PROCESS_LOCAL, 8308
bytes)
    [Executor task launch worker for task 476] INFO org.apache.spark.executor.Executor - Running
task 0.0 in stage 575.0 (TID 476)
    [Executor task launch worker for task 477] INFO org.apache.spark.executor.Executor - Running
task 1.0 in stage 575.0 (TID 477)
    [Executor task launch worker for task 479] INFO org.apache.spark.executor.Executor - Running
task 3.0 in stage 575.0 (TID 479)
    [Executor task launch worker for task 478] INFO org.apache.spark.executor.Executor - Running
task 2.0 in stage 575.0 (TID 478)
    [Executor task launch worker for task 476] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 477] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 479] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 476] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 477] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 479] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 476] INFO org.apache.spark.storage.BlockManager
- Found block rdd_2512_0 locally
    [Executor task launch worker for task 479] INFO org.apache.spark.storage.BlockManager
- Found block rdd_2512_3 locally
    [Executor task launch worker for task 477] INFO org.apache.spark.storage.BlockManager
- Found block rdd_2512_1 locally
    [Executor task launch worker for task 476] INFO org.apache.spark.storage.memory.MemoryStore
- Block rdd_2827_0 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 479] INFO org.apache.spark.storage.memory.MemoryStore
- Block rdd_2827_3 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 477] INFO org.apache.spark.storage.memory.MemoryStore
- Block rdd_2827_1 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_0
in memory on localhost:37925 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_3
in memory on localhost:37925 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_1
in memory on localhost:37925 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 478] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Getting 0 non-empty blocks out of 5 blocks
    [Executor task launch worker for task 478] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator
- Started 0 remote fetches in 0 ms
    [Executor task launch worker for task 478] INFO org.apache.spark.storage.BlockManager
- Found block rdd_2512_2 locally
    [Executor task launch worker for task 478] INFO org.apache.spark.storage.memory.MemoryStore
- Block rdd_2827_2 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_2
in memory on localhost:37925 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 476] INFO org.apache.spark.executor.Executor - Finished
task 0.0 in stage 575.0 (TID 476). 59881 bytes result sent to driver
    [Executor task launch worker for task 479] INFO org.apache.spark.executor.Executor - Finished
task 3.0 in stage 575.0 (TID 479). 59881 bytes result sent to driver
    [Executor task launch worker for task 477] INFO org.apache.spark.executor.Executor - Finished
task 1.0 in stage 575.0 (TID 477). 59881 bytes result sent to driver
    [Executor task launch worker for task 478] INFO org.apache.spark.executor.Executor - Finished
task 2.0 in stage 575.0 (TID 478). 59881 bytes result sent to driver
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task
0.0 in stage 575.0 (TID 476) in 17 ms on localhost (executor driver) (1/4)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task
3.0 in stage 575.0 (TID 479) in 18 ms on localhost (executor driver) (2/4)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task
1.0 in stage 575.0 (TID 477) in 18 ms on localhost (executor driver) (3/4)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task
2.0 in stage 575.0 (TID 478) in 19 ms on localhost (executor driver) (4/4)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet
575.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ResultStage
575 (foreach at UnboundedDataset.java:80) finished in 0.031 s
    [streaming-job-executor-0] INFO org.apache.spark.scheduler.DAGScheduler - Job 35 finished:
foreach at UnboundedDataset.java:80, took 0.110447 s
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Finished job streaming
job 1537565649500 ms.3 from job set of time 1537565649500 ms
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Total delay: 6.899
s for time 1537565649500 ms (execution: 0.509 s)
    [Test worker] INFO org.apache.spark.streaming.scheduler.JobScheduler - Stopped JobScheduler
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@1327a632{/streaming,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@386a0777{/streaming/batch,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@64763e47{/static/streaming,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.apache.spark.streaming.StreamingContext - StreamingContext stopped
successfully
    [Test worker] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@855a9d9{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
    [Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://localhost:4040
    [dispatcher-event-loop-0] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint
stopped!
    [Test worker] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
    [Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
    [Test worker] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint
- OutputCommitCoordinator stopped!
    [Test worker] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext

Gradle Test Executor 286 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-5f882160-9b6e-444c-98bf-9cf5a9baee0c

org.apache.beam.runners.spark.translation.streaming.StreamingSourceMetricsTest > testUnboundedSourceMetrics
STANDARD_ERROR
    [Test worker] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@4f637215{HTTP/1.1,[http/1.1]}{127.0.0.1:4041}
    [Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://localhost:4041
    [dispatcher-event-loop-1] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint
stopped!
    [Test worker] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
    [Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
    [Test worker] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint
- OutputCommitCoordinator stopped!
    [Test worker] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext

Gradle Test Executor 288 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming FAILED
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-0975deea-5487-47e0-be78-394c52e82323

14 tests completed, 2 failed
Finished generating test XML results (0.126 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/test-results/validatesRunnerStreaming>
Generating HTML test report...
Finished generating test html results (0.166 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerStreaming>
:beam-runners-spark:validatesRunnerStreaming (Thread[Task worker for ':' Thread 3,5,main])
completed. Took 10 mins 31.858 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-runners-spark:validatesRunnerStreaming'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerStreaming/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log
output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/4.10.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 16m 55s
40 actionable tasks: 36 executed, 4 from cache

Publishing build scan...
https://gradle.com/s/dvf4rbdwyl5sg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
