spark-user mailing list archives

From: Haiyang Fu <haiyangfu...@gmail.com>
Subject: Re: Re: java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
Date: Fri, 01 Aug 2014 04:05:55 GMT
Glad to help you


On Fri, Aug 1, 2014 at 11:28 AM, Bin <wubin_phight@126.com> wrote:

> Hi Haiyang,
>
> Thanks, that really was the reason.
>
> Best,
> Bin
>
>
> On 2014-07-31 08:05:34, "Haiyang Fu" <haiyangfu512@gmail.com> wrote:
>
> Have you tried to increase the driver memory?
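>
> For what it's worth, the relevant setting here is the driver's memory rather than
> spark.executor.memory. A hedged sketch of what that change could look like with the
> launch command quoted below (the 8g value is only an illustration, and which flag
> actually takes effect can depend on the Spark version and deploy mode):
>
>   # same launch as below, with the driver heap raised (8g is illustrative)
>   $SPARK_HOME/bin/spark-class org.apache.spark.deploy.Client launch \
>     spark://10.196.135.101:7077 $jar_path $programname \
>     -Dspark.master=spark://10.196.135.101:7077 \
>     -Dspark.driver.memory=8g \
>     -Dspark.cores.max=300 -Dspark.executor.memory=20g \
>     -Dspark.jars=$jar_path -Dspark.default.parallelism=100 \
>     -Dspark.hadoop.hadoop.job.ugi=$username,$groupname \
>     -Dspark.app.name=$appname \
>     $in_path $scala_out_path
>
> With spark-submit, the equivalent would be the --driver-memory flag.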
>
>
> On Thu, Jul 31, 2014 at 3:54 PM, Bin <wubin_phight@126.com> wrote:
>
>> Hi All,
>>
>> The data size of my task is about 30 MB. It runs smoothly in local mode.
>> However, when I submit it to the cluster, it throws the error in the subject
>> line (please see below for the complete output).
>>
>> Actually, my output is almost the same as
>> http://stackoverflow.com/questions/24080891/spark-program-hangs-at-job-finished-toarray-workers-throw-java-util-concurren.
>> I also call toArray on my data, which was the cause in that case.
>>
>> However, how come it runs OK in local mode but not on the cluster? The memory
>> of each worker is over 60 GB, and my run command is:
>>
>> "$SPARK_HOME/bin/spark-class org.apache.spark.deploy.Client launch
>> spark://10.196.135.101:7077 $jar_path $programname -Dspark.
>> master=spark://10.196.135.101:7077 -Dspark.cores.max=300
>> -Dspark.executor.memory=20g -spark.jars=$jar_path -Dspark.default.parallelism=100
>>  -Dspark.hadoop.hadoop.job.ugi=$username,$groupname  -Dspark.app.name=$appname
>> $in_path $scala_out_path"
>>
>> Looking for help and thanks a lot!
>>
>> Below please find the complete output:
>>
>> 14/07/31 15:06:53 WARN Configuration: DEPRECATED: hadoop-site.xml found in the classpath.
Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml
to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
>> 14/07/31 15:06:53 INFO SecurityManager: Changing view acls to: spark
>> 14/07/31 15:06:53 INFO SecurityManager: SecurityManager: authentication disabled;
ui acls disabled; users with view permissions: Set(spark)
>> 14/07/31 15:06:53 INFO Slf4jLogger: Slf4jLogger started
>> 14/07/31 15:06:53 INFO Remoting: Starting remoting
>> 14/07/31 15:06:54 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkExecutor@tdw-10-215-140-22:39446]
>> 14/07/31 15:06:54 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkExecutor@tdw-10-215-140-22:39446]
>> 14/07/31 15:06:54 INFO CoarseGrainedExecutorBackend: Connecting to driver: akka.tcp://spark@tdw-10-196-135-106:38502/user/CoarseGrainedScheduler
>> 14/07/31 15:06:54 INFO WorkerWatcher: Connecting to worker akka.tcp://sparkWorker@tdw-10-215-140-22:34755/user/Worker
>> 14/07/31 15:06:54 INFO WorkerWatcher: Successfully connected to akka.tcp://sparkWorker@tdw-10-215-140-22:34755/user/Worker
>> 14/07/31 15:06:56 INFO CoarseGrainedExecutorBackend: Successfully registered with
driver
>> 14/07/31 15:06:56 INFO SecurityManager: Changing view acls to: spark
>> 14/07/31 15:06:56 INFO SecurityManager: SecurityManager: authentication disabled;
ui acls disabled; users with view permissions: Set(spark)
>> 14/07/31 15:06:56 INFO Slf4jLogger: Slf4jLogger started
>> 14/07/31 15:06:56 INFO Remoting: Starting remoting
>> 14/07/31 15:06:56 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://spark@tdw-10-215-140-22:56708]
>> 14/07/31 15:06:56 INFO Remoting: Remoting now listens on addresses: [akka.tcp://spark@tdw-10-215-140-22:56708]
>> 14/07/31 15:06:56 INFO SparkEnv: Connecting to MapOutputTracker: akka.tcp://spark@tdw-10-196-135-106:38502/user/MapOutputTracker
>> 14/07/31 15:06:58 INFO SparkEnv: Connecting to BlockManagerMaster: akka.tcp://spark@tdw-10-196-135-106:38502/user/BlockManagerMaster
>> 14/07/31 15:06:59 INFO DiskBlockManager: Created local directory at /data1/sparkenv/local/spark-local-20140731150659-3f12
>> 14/07/31 15:06:59 INFO DiskBlockManager: Created local directory at /data2/sparkenv/local/spark-local-20140731150659-1602
>> 14/07/31 15:06:59 INFO DiskBlockManager: Created local directory at /data3/sparkenv/local/spark-local-20140731150659-d213
>> 14/07/31 15:06:59 INFO DiskBlockManager: Created local directory at /data4/sparkenv/local/spark-local-20140731150659-f42e
>> 14/07/31 15:06:59 INFO DiskBlockManager: Created local directory at /data5/sparkenv/local/spark-local-20140731150659-63d0
>> 14/07/31 15:06:59 INFO DiskBlockManager: Created local directory at /data6/sparkenv/local/spark-local-20140731150659-9003
>> 14/07/31 15:06:59 INFO DiskBlockManager: Created local directory at /data7/sparkenv/local/spark-local-20140731150659-f260
>> 14/07/31 15:06:59 INFO DiskBlockManager: Created local directory at /data8/sparkenv/local/spark-local-20140731150659-6334
>> 14/07/31 15:06:59 INFO DiskBlockManager: Created local directory at /data9/sparkenv/local/spark-local-20140731150659-3af4
>> 14/07/31 15:06:59 INFO DiskBlockManager: Created local directory at /data10/sparkenv/local/spark-local-20140731150659-133d
>> 14/07/31 15:06:59 INFO DiskBlockManager: Created local directory at /data11/sparkenv/local/spark-local-20140731150659-ed08
>> 14/07/31 15:06:59 INFO MemoryStore: MemoryStore started with capacity 11.5 GB.
>> 14/07/31 15:06:59 INFO ConnectionManager: Bound socket to port 35127 with id = ConnectionManagerId(tdw-10-215-140-22,35127)
>> 14/07/31 15:06:59 INFO BlockManagerMaster: Trying to register BlockManager
>> 14/07/31 15:07:00 INFO BlockManagerMaster: Registered BlockManager
>> 14/07/31 15:07:00 INFO HttpFileServer: HTTP File server directory is /tmp/spark-0914d215-dd22-4d5e-9ec0-724937dbfd8b
>> 14/07/31 15:07:00 INFO HttpServer: Starting HTTP Server
>> 14/07/31 15:07:26 INFO CoarseGrainedExecutorBackend: Got assigned task 12
>> 14/07/31 15:07:26 INFO CoarseGrainedExecutorBackend: Got assigned task 25
>> 14/07/31 15:07:26 INFO Executor: Running task ID 25
>> 14/07/31 15:07:26 INFO Executor: Running task ID 12
>> 14/07/31 15:07:26 INFO Executor: Fetching hdfs://tdw-10-196-135-101:54310/user/teg/gdt/tj/test/adsorption_2.10-1.0.jar
with timestamp 1406790410442
>> 14/07/31 15:07:26 INFO Executor: Adding file:/data/home/spark/spark-1.0.0-bin-0.20.2-cdh3u3/work/app-20140731150652-4911/8/./adsorption_2.10-1.0.jar
to class loader
>> 14/07/31 15:07:26 INFO HttpBroadcast: Started reading broadcast variable 0
>> 14/07/31 15:07:28 INFO MemoryStore: ensureFreeSpace(102650) called with curMem=0,
maxMem=12348240691
>> 14/07/31 15:07:28 INFO MemoryStore: Block broadcast_0 stored as values to memory
(estimated size 100.2 KB, free 11.5 GB)
>> 14/07/31 15:07:28 INFO HttpBroadcast: Reading broadcast variable 0 took 1.702291406
s
>> 14/07/31 15:07:28 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:07:28 INFO HadoopRDD: Input split: hdfs://tdw-10-196-135-101:54310/user/teg/gdt/tj/test/pre/scala_out/part-00072:0+197955
>> 14/07/31 15:07:28 INFO HadoopRDD: Input split: hdfs://tdw-10-196-135-101:54310/user/teg/gdt/tj/test/pre/scala_out/part-00062:0+218630
>> 14/07/31 15:07:28 WARN NativeCodeLoader: Unable to load native-hadoop library for
your platform... using builtin-java classes where applicable
>> 14/07/31 15:07:28 WARN LoadSnappy: Snappy native library not loaded
>> 14/07/31 15:07:29 INFO Executor: Serialized size of result for 12 is 887
>> 14/07/31 15:07:29 INFO Executor: Serialized size of result for 25 is 887
>> 14/07/31 15:07:29 INFO Executor: Sending result for 25 directly to driver
>> 14/07/31 15:07:29 INFO Executor: Sending result for 12 directly to driver
>> 14/07/31 15:07:29 INFO Executor: Finished task ID 12
>> 14/07/31 15:07:29 INFO Executor: Finished task ID 25
>> 14/07/31 15:07:30 INFO CoarseGrainedExecutorBackend: Got assigned task 30
>> 14/07/31 15:07:30 INFO Executor: Running task ID 30
>> 14/07/31 15:07:30 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:07:30 INFO HadoopRDD: Input split: hdfs://tdw-10-196-135-101:54310/user/teg/gdt/tj/test/pre/scala_out/part-00084:0+196433
>> 14/07/31 15:07:30 INFO Executor: Serialized size of result for 30 is 887
>> 14/07/31 15:07:30 INFO Executor: Sending result for 30 directly to driver
>> 14/07/31 15:07:30 INFO Executor: Finished task ID 30
>> 14/07/31 15:07:30 INFO CoarseGrainedExecutorBackend: Got assigned task 31
>> 14/07/31 15:07:30 INFO Executor: Running task ID 31
>> 14/07/31 15:07:30 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:07:30 INFO HadoopRDD: Input split: hdfs://tdw-10-196-135-101:54310/user/teg/gdt/tj/test/pre/scala_out/part-00089:0+190194
>> 14/07/31 15:07:30 INFO Executor: Serialized size of result for 31 is 887
>> 14/07/31 15:07:30 INFO Executor: Sending result for 31 directly to driver
>> 14/07/31 15:07:30 INFO Executor: Finished task ID 31
>> 14/07/31 15:07:31 INFO CoarseGrainedExecutorBackend: Got assigned task 54
>> 14/07/31 15:07:31 INFO Executor: Running task ID 54
>> 14/07/31 15:07:31 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:07:31 INFO HadoopRDD: Input split: hdfs://tdw-10-196-135-101:54310/user/teg/gdt/tj/test/pre/scala_out/part-00096:0+153443
>> 14/07/31 15:07:31 INFO CoarseGrainedExecutorBackend: Got assigned task 55
>> 14/07/31 15:07:31 INFO Executor: Running task ID 55
>> 14/07/31 15:07:31 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:07:31 INFO HadoopRDD: Input split: hdfs://tdw-10-196-135-101:54310/user/teg/gdt/tj/test/pre/scala_out/part-00139:0+174726
>> 14/07/31 15:07:31 INFO Executor: Serialized size of result for 54 is 887
>> 14/07/31 15:07:31 INFO Executor: Sending result for 54 directly to driver
>> 14/07/31 15:07:31 INFO Executor: Finished task ID 54
>> 14/07/31 15:07:31 INFO Executor: Serialized size of result for 55 is 887
>> 14/07/31 15:07:31 INFO Executor: Sending result for 55 directly to driver
>> 14/07/31 15:07:31 INFO Executor: Finished task ID 55
>> 14/07/31 15:07:32 INFO CoarseGrainedExecutorBackend: Got assigned task 76
>> 14/07/31 15:07:32 INFO Executor: Running task ID 76
>> 14/07/31 15:07:32 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:07:32 INFO CoarseGrainedExecutorBackend: Got assigned task 79
>> 14/07/31 15:07:32 INFO Executor: Running task ID 79
>> 14/07/31 15:07:32 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:07:32 INFO HadoopRDD: Input split: hdfs://tdw-10-196-135-101:54310/user/teg/gdt/tj/test/pre/scala_out/part-00149:0+134758
>> 14/07/31 15:07:32 INFO HadoopRDD: Input split: hdfs://tdw-10-196-135-101:54310/user/teg/gdt/tj/test/pre/scala_out/part-00157:0+103176
>> 14/07/31 15:07:32 INFO Executor: Serialized size of result for 79 is 887
>> 14/07/31 15:07:32 INFO Executor: Sending result for 79 directly to driver
>> 14/07/31 15:07:32 INFO Executor: Finished task ID 79
>> 14/07/31 15:07:32 INFO Executor: Serialized size of result for 76 is 887
>> 14/07/31 15:07:32 INFO Executor: Sending result for 76 directly to driver
>> 14/07/31 15:07:32 INFO Executor: Finished task ID 76
>> 14/07/31 15:07:34 INFO CoarseGrainedExecutorBackend: Got assigned task 99
>> 14/07/31 15:07:34 INFO Executor: Running task ID 99
>> 14/07/31 15:07:34 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:07:34 INFO HadoopRDD: Input split: hdfs://tdw-10-196-135-101:54310/user/teg/gdt/tj/test/pre/scala_out/part-00167:0+51005
>> 14/07/31 15:07:34 INFO Executor: Serialized size of result for 99 is 887
>> 14/07/31 15:07:34 INFO Executor: Sending result for 99 directly to driver
>> 14/07/31 15:07:34 INFO Executor: Finished task ID 99
>> 14/07/31 15:07:39 INFO CoarseGrainedExecutorBackend: Got assigned task 181
>> 14/07/31 15:07:39 INFO Executor: Running task ID 181
>> 14/07/31 15:07:39 INFO CoarseGrainedExecutorBackend: Got assigned task 196
>> 14/07/31 15:07:39 INFO Executor: Running task ID 196
>> 14/07/31 15:07:39 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:07:39 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:07:39 INFO MapOutputTrackerWorker: Updating epoch to 1 and clearing cache
>> 14/07/31 15:07:39 INFO ConnectionManager: Accepted connection from [tdw-10-215-140-24/10.215.140.24]
>> 14/07/31 15:07:39 INFO ConnectionManager: Accepted connection from [tdw-10-196-135-105/10.196.135.105]
>> 14/07/31 15:07:39 INFO ConnectionManager: Accepted connection from [tdw-10-215-140-21/10.215.140.21]
>> 14/07/31 15:07:39 INFO ConnectionManager: Accepted connection from [tdw-10-215-140-12/10.215.140.12]
>> 46.947: [GC 10486272K->32824K(40196096K), 0.0347340 secs]
>> 14/07/31 15:07:39 INFO ConnectionManager: Accepted connection from [tdw-10-215-140-13/10.215.140.13]
>> 14/07/31 15:07:39 INFO ConnectionManager: Accepted connection from [tdw-10-215-140-23/10.215.140.23]
>> 14/07/31 15:07:39 INFO SendingConnection: Initiating connection to [tdw-10-215-140-13/10.215.140.13:58657]
>> 14/07/31 15:07:39 INFO SendingConnection: Initiating connection to [tdw-10-215-140-23/10.215.140.23:39188]
>> 14/07/31 15:07:39 INFO SendingConnection: Initiating connection to [tdw-10-215-140-21/10.215.140.21:36128]
>> 14/07/31 15:07:39 INFO SendingConnection: Initiating connection to [tdw-10-215-140-12/10.215.140.12:33380]
>> 14/07/31 15:07:39 INFO SendingConnection: Initiating connection to [tdw-10-196-135-105/10.196.135.105:36859]
>> 14/07/31 15:07:39 INFO SendingConnection: Initiating connection to [tdw-10-215-140-24/10.215.140.24:49100]
>> 14/07/31 15:07:39 INFO SendingConnection: Connected to [tdw-10-215-140-23/10.215.140.23:39188],
2 messages pending
>> 14/07/31 15:07:39 INFO SendingConnection: Connected to [tdw-10-215-140-13/10.215.140.13:58657],
2 messages pending
>> 14/07/31 15:07:39 INFO SendingConnection: Connected to [tdw-10-196-135-105/10.196.135.105:36859],
2 messages pending
>> 14/07/31 15:07:39 INFO SendingConnection: Connected to [tdw-10-215-140-12/10.215.140.12:33380],
1 messages pending
>> 14/07/31 15:07:39 INFO SendingConnection: Connected to [tdw-10-215-140-21/10.215.140.21:36128],
1 messages pending
>> 14/07/31 15:07:39 INFO SendingConnection: Connected to [tdw-10-215-140-24/10.215.140.24:49100],
2 messages pending
>> 14/07/31 15:07:39 INFO CacheManager: Partition rdd_6_10 not found, computing it
>> 14/07/31 15:07:39 INFO MapOutputTrackerWorker: Don't have map outputs for shuffle
0, fetching them
>> 14/07/31 15:07:39 INFO MapOutputTrackerWorker: Doing the fetch; tracker actor = Actor[akka.tcp://spark@tdw-10-196-135-106:38502/user/MapOutputTracker#-128956169]
>> 14/07/31 15:07:39 INFO CacheManager: Partition rdd_6_25 not found, computing it
>> 14/07/31 15:07:39 INFO MapOutputTrackerWorker: Don't have map outputs for shuffle
0, fetching them
>> 14/07/31 15:07:39 INFO ConnectionManager: Accepted connection from [tdw-10-215-140-17/10.215.140.17]
>> 14/07/31 15:07:39 INFO ConnectionManager: Accepted connection from [tdw-10-215-140-25/10.215.140.25]
>> 14/07/31 15:07:39 INFO SendingConnection: Initiating connection to [tdw-10-215-140-17/10.215.140.17:35040]
>> 14/07/31 15:07:39 INFO SendingConnection: Connected to [tdw-10-215-140-17/10.215.140.17:35040],
1 messages pending
>> 14/07/31 15:07:39 INFO ConnectionManager: Accepted connection from [tdw-10-215-140-18/10.215.140.18]
>> 14/07/31 15:07:39 INFO SendingConnection: Initiating connection to [tdw-10-215-140-25/10.215.140.25:50298]
>> 14/07/31 15:07:39 INFO SendingConnection: Connected to [tdw-10-215-140-25/10.215.140.25:50298],
2 messages pending
>> 14/07/31 15:07:39 INFO SendingConnection: Initiating connection to [tdw-10-215-140-18/10.215.140.18:33575]
>> 14/07/31 15:07:39 INFO SendingConnection: Connected to [tdw-10-215-140-18/10.215.140.18:33575],
1 messages pending
>> 14/07/31 15:07:39 INFO ConnectionManager: Accepted connection from [tdw-10-215-140-14/10.215.140.14]
>> 14/07/31 15:07:39 INFO SendingConnection: Initiating connection to [tdw-10-215-140-14/10.215.140.14:37220]
>> 14/07/31 15:07:39 INFO SendingConnection: Connected to [tdw-10-215-140-14/10.215.140.14:37220],
1 messages pending
>> 14/07/31 15:07:39 INFO MapOutputTrackerWorker: Got the output locations
>> 14/07/31 15:07:39 INFO BlockFetcherIterator$BasicBlockFetcherIterator: maxBytesInFlight:
50331648, targetRequestSize: 10066329
>> 14/07/31 15:07:39 INFO BlockFetcherIterator$BasicBlockFetcherIterator: maxBytesInFlight:
50331648, targetRequestSize: 10066329
>> 14/07/31 15:07:39 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Getting 171
non-empty blocks out of 171 blocks
>> 14/07/31 15:07:39 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Getting 171
non-empty blocks out of 171 blocks
>> 14/07/31 15:07:39 INFO SendingConnection: Initiating connection to [tdw-10-196-135-107/10.196.135.107:59290]
>> 14/07/31 15:07:39 INFO SendingConnection: Initiating connection to [tdw-10-215-140-11/10.215.140.11:50302]
>> 14/07/31 15:07:39 INFO SendingConnection: Connected to [tdw-10-215-140-11/10.215.140.11:50302],
1 messages pending
>> 14/07/31 15:07:39 INFO SendingConnection: Initiating connection to [tdw-10-196-135-106/10.196.135.106:34128]
>> 14/07/31 15:07:39 INFO ConnectionManager: Accepted connection from [tdw-10-215-140-11/10.215.140.11]
>> 14/07/31 15:07:39 INFO SendingConnection: Connected to [tdw-10-196-135-107/10.196.135.107:59290],
2 messages pending
>> 14/07/31 15:07:39 INFO SendingConnection: Connected to [tdw-10-196-135-106/10.196.135.106:34128],
1 messages pending
>> 14/07/31 15:07:39 INFO SendingConnection: Initiating connection to [tdw-10-215-140-15/10.215.140.15:50069]
>> 14/07/31 15:07:39 INFO SendingConnection: Connected to [tdw-10-215-140-15/10.215.140.15:50069],
1 messages pending
>> 14/07/31 15:07:40 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Started 14
remote fetches in 48 ms
>> 14/07/31 15:07:40 INFO ConnectionManager: Accepted connection from [tdw-10-215-140-15/10.215.140.15]
>> 14/07/31 15:07:40 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Started 14
remote fetches in 49 ms
>> 14/07/31 15:07:40 INFO ConnectionManager: Accepted connection from [tdw-10-215-140-16/10.215.140.16]
>> 14/07/31 15:07:40 INFO ConnectionManager: Accepted connection from [tdw-10-196-135-102/10.196.135.102]
>> 14/07/31 15:07:40 INFO SendingConnection: Initiating connection to [tdw-10-215-140-16/10.215.140.16:58648]
>> 14/07/31 15:07:40 INFO SendingConnection: Connected to [tdw-10-215-140-16/10.215.140.16:58648],
1 messages pending
>> 14/07/31 15:07:40 INFO SendingConnection: Initiating connection to [tdw-10-196-135-102/10.196.135.102:45729]
>> 14/07/31 15:07:40 INFO SendingConnection: Connected to [tdw-10-196-135-102/10.196.135.102:45729],
1 messages pending
>> 14/07/31 15:07:42 INFO ConnectionManager: Accepted connection from [tdw-10-196-135-106/10.196.135.106]
>> 14/07/31 15:07:44 INFO ConnectionManager: Accepted connection from [tdw-10-196-135-107/10.196.135.107]
>> 14/07/31 15:07:45 INFO MemoryStore: ensureFreeSpace(1922882) called with curMem=102650,
maxMem=12348240691
>> 14/07/31 15:07:45 INFO MemoryStore: Block rdd_6_25 stored as values to memory (estimated
size 1877.8 KB, free 11.5 GB)
>> 14/07/31 15:07:46 INFO MemoryStore: ensureFreeSpace(1912396) called with curMem=2025532,
maxMem=12348240691
>> 14/07/31 15:07:46 INFO MemoryStore: Block rdd_6_10 stored as values to memory (estimated
size 1867.6 KB, free 11.5 GB)
>> 14/07/31 15:07:46 INFO BlockManagerMaster: Updated info of block rdd_6_10
>> 14/07/31 15:07:46 INFO BlockManagerMaster: Updated info of block rdd_6_25
>> 14/07/31 15:07:46 INFO Executor: Serialized size of result for 181 is 421363
>> 14/07/31 15:07:46 INFO Executor: Serialized size of result for 196 is 421522
>> 14/07/31 15:07:46 INFO Executor: Sending result for 181 directly to driver
>> 14/07/31 15:07:46 INFO Executor: Sending result for 196 directly to driver
>> 14/07/31 15:07:46 INFO Executor: Finished task ID 181
>> 14/07/31 15:07:46 INFO Executor: Finished task ID 196
>> 14/07/31 15:07:50 INFO CoarseGrainedExecutorBackend: Got assigned task 219
>> 14/07/31 15:07:50 INFO Executor: Running task ID 219
>> 14/07/31 15:07:50 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:07:50 INFO CacheManager: Partition rdd_6_48 not found, computing it
>> 14/07/31 15:07:50 INFO BlockFetcherIterator$BasicBlockFetcherIterator: maxBytesInFlight:
50331648, targetRequestSize: 10066329
>> 14/07/31 15:07:50 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Getting 171
non-empty blocks out of 171 blocks
>> 14/07/31 15:07:50 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Started 14
remote fetches in 19 ms
>> 14/07/31 15:07:50 INFO CoarseGrainedExecutorBackend: Got assigned task 225
>> 14/07/31 15:07:50 INFO Executor: Running task ID 225
>> 14/07/31 15:07:50 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:07:50 INFO CacheManager: Partition rdd_6_54 not found, computing it
>> 14/07/31 15:07:50 INFO BlockFetcherIterator$BasicBlockFetcherIterator: maxBytesInFlight:
50331648, targetRequestSize: 10066329
>> 14/07/31 15:07:50 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Getting 171
non-empty blocks out of 171 blocks
>> 14/07/31 15:07:50 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Started 14
remote fetches in 15 ms
>> 14/07/31 15:07:51 INFO MemoryStore: ensureFreeSpace(1927469) called with curMem=3937928,
maxMem=12348240691
>> 14/07/31 15:07:51 INFO MemoryStore: Block rdd_6_48 stored as values to memory (estimated
size 1882.3 KB, free 11.5 GB)
>> 14/07/31 15:07:51 INFO BlockManagerMaster: Updated info of block rdd_6_48
>> 14/07/31 15:07:51 INFO Executor: Serialized size of result for 219 is 424342
>> 14/07/31 15:07:51 INFO Executor: Sending result for 219 directly to driver
>> 14/07/31 15:07:51 INFO Executor: Finished task ID 219
>> 14/07/31 15:07:51 INFO MemoryStore: ensureFreeSpace(1909775) called with curMem=5865397,
maxMem=12348240691
>> 14/07/31 15:07:51 INFO MemoryStore: Block rdd_6_54 stored as values to memory (estimated
size 1865.0 KB, free 11.5 GB)
>> 14/07/31 15:07:51 INFO BlockManagerMaster: Updated info of block rdd_6_54
>> 14/07/31 15:07:51 INFO Executor: Serialized size of result for 225 is 421546
>> 14/07/31 15:07:51 INFO Executor: Sending result for 225 directly to driver
>> 14/07/31 15:07:51 INFO Executor: Finished task ID 225
>> 14/07/31 15:07:53 INFO CoarseGrainedExecutorBackend: Got assigned task 251
>> 14/07/31 15:07:53 INFO Executor: Running task ID 251
>> 14/07/31 15:07:53 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:07:53 INFO CacheManager: Partition rdd_6_80 not found, computing it
>> 14/07/31 15:07:53 INFO BlockFetcherIterator$BasicBlockFetcherIterator: maxBytesInFlight:
50331648, targetRequestSize: 10066329
>> 14/07/31 15:07:53 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Getting 171
non-empty blocks out of 171 blocks
>> 14/07/31 15:07:53 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Started 14
remote fetches in 15 ms
>> 14/07/31 15:07:53 INFO MemoryStore: ensureFreeSpace(1927469) called with curMem=7775172,
maxMem=12348240691
>> 14/07/31 15:07:53 INFO MemoryStore: Block rdd_6_80 stored as values to memory (estimated
size 1882.3 KB, free 11.5 GB)
>> 14/07/31 15:07:53 INFO BlockManagerMaster: Updated info of block rdd_6_80
>> 14/07/31 15:07:53 INFO Executor: Serialized size of result for 251 is 424634
>> 14/07/31 15:07:53 INFO Executor: Sending result for 251 directly to driver
>> 14/07/31 15:07:53 INFO Executor: Finished task ID 251
>> 14/07/31 15:07:54 INFO CoarseGrainedExecutorBackend: Got assigned task 259
>> 14/07/31 15:07:54 INFO Executor: Running task ID 259
>> 14/07/31 15:07:54 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:07:54 INFO CacheManager: Partition rdd_6_88 not found, computing it
>> 14/07/31 15:07:54 INFO BlockFetcherIterator$BasicBlockFetcherIterator: maxBytesInFlight:
50331648, targetRequestSize: 10066329
>> 14/07/31 15:07:54 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Getting 171
non-empty blocks out of 171 blocks
>> 14/07/31 15:07:54 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Started 14
remote fetches in 13 ms
>> 14/07/31 15:07:54 INFO MemoryStore: ensureFreeSpace(1921571) called with curMem=9702641,
maxMem=12348240691
>> 14/07/31 15:07:54 INFO MemoryStore: Block rdd_6_88 stored as values to memory (estimated
size 1876.5 KB, free 11.5 GB)
>> 14/07/31 15:07:54 INFO BlockManagerMaster: Updated info of block rdd_6_88
>> 14/07/31 15:07:54 INFO Executor: Serialized size of result for 259 is 418167
>> 14/07/31 15:07:54 INFO Executor: Sending result for 259 directly to driver
>> 14/07/31 15:07:54 INFO Executor: Finished task ID 259
>> 14/07/31 15:07:56 INFO CoarseGrainedExecutorBackend: Got assigned task 273
>> 14/07/31 15:07:56 INFO Executor: Running task ID 273
>> 14/07/31 15:07:56 INFO CoarseGrainedExecutorBackend: Got assigned task 290
>> 14/07/31 15:07:56 INFO Executor: Running task ID 290
>> 14/07/31 15:07:56 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:07:56 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:07:56 INFO BlockManager: Found block rdd_6_10 locally
>> 14/07/31 15:07:56 INFO BlockManager: Found block rdd_6_25 locally
>> 14/07/31 15:07:56 INFO Executor: Serialized size of result for 273 is 887
>> 14/07/31 15:07:56 INFO Executor: Sending result for 273 directly to driver
>> 14/07/31 15:07:56 INFO Executor: Finished task ID 273
>> 14/07/31 15:07:56 INFO Executor: Serialized size of result for 290 is 887
>> 14/07/31 15:07:56 INFO Executor: Sending result for 290 directly to driver
>> 14/07/31 15:07:56 INFO Executor: Finished task ID 290
>> 14/07/31 15:07:57 INFO CoarseGrainedExecutorBackend: Got assigned task 308
>> 14/07/31 15:07:57 INFO Executor: Running task ID 308
>> 14/07/31 15:07:57 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:07:57 INFO BlockManager: Found block rdd_6_48 locally
>> 14/07/31 15:07:57 INFO CoarseGrainedExecutorBackend: Got assigned task 311
>> 14/07/31 15:07:57 INFO Executor: Running task ID 311
>> 14/07/31 15:07:57 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:07:57 INFO BlockManager: Found block rdd_6_54 locally
>> 14/07/31 15:07:57 INFO Executor: Serialized size of result for 308 is 887
>> 14/07/31 15:07:57 INFO Executor: Sending result for 308 directly to driver
>> 14/07/31 15:07:57 INFO Executor: Finished task ID 308
>> 14/07/31 15:07:57 INFO Executor: Serialized size of result for 311 is 887
>> 14/07/31 15:07:57 INFO Executor: Sending result for 311 directly to driver
>> 14/07/31 15:07:57 INFO Executor: Finished task ID 311
>> 14/07/31 15:07:58 INFO CoarseGrainedExecutorBackend: Got assigned task 339
>> 14/07/31 15:07:58 INFO Executor: Running task ID 339
>> 14/07/31 15:07:58 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:07:58 INFO BlockManager: Found block rdd_6_80 locally
>> 14/07/31 15:07:58 INFO CoarseGrainedExecutorBackend: Got assigned task 341
>> 14/07/31 15:07:58 INFO Executor: Running task ID 341
>> 14/07/31 15:07:58 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:07:58 INFO BlockManager: Found block rdd_6_88 locally
>> 14/07/31 15:07:58 INFO Executor: Serialized size of result for 339 is 887
>> 14/07/31 15:07:58 INFO Executor: Sending result for 339 directly to driver
>> 14/07/31 15:07:58 INFO Executor: Finished task ID 339
>> 14/07/31 15:07:58 INFO Executor: Serialized size of result for 341 is 887
>> 14/07/31 15:07:58 INFO Executor: Sending result for 341 directly to driver
>> 14/07/31 15:07:58 INFO Executor: Finished task ID 341
>> 14/07/31 15:07:59 INFO CoarseGrainedExecutorBackend: Got assigned task 377
>> 14/07/31 15:07:59 INFO Executor: Running task ID 377
>> 14/07/31 15:07:59 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:07:59 INFO MapOutputTrackerWorker: Updating epoch to 2 and clearing cache
>> 14/07/31 15:07:59 INFO MapOutputTrackerWorker: Don't have map outputs for shuffle
1, fetching them
>> 14/07/31 15:07:59 INFO MapOutputTrackerWorker: Doing the fetch; tracker actor = Actor[akka.tcp://spark@tdw-10-196-135-106:38502/user/MapOutputTracker#-128956169]
>> 14/07/31 15:07:59 INFO MapOutputTrackerWorker: Got the output locations
>> 14/07/31 15:07:59 INFO BlockFetcherIterator$BasicBlockFetcherIterator: maxBytesInFlight:
50331648, targetRequestSize: 10066329
>> 14/07/31 15:07:59 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Getting 100
non-empty blocks out of 100 blocks
>> 14/07/31 15:07:59 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Started 16
remote fetches in 9 ms
>> 14/07/31 15:08:00 INFO CoarseGrainedExecutorBackend: Got assigned task 393
>> 14/07/31 15:08:00 INFO Executor: Running task ID 393
>> 14/07/31 15:08:00 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:08:00 INFO BlockFetcherIterator$BasicBlockFetcherIterator: maxBytesInFlight:
50331648, targetRequestSize: 10066329
>> 14/07/31 15:08:00 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Getting 100
non-empty blocks out of 100 blocks
>> 14/07/31 15:08:00 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Started 16
remote fetches in 8 ms
>> 14/07/31 15:08:00 INFO Executor: Serialized size of result for 377 is 303256
>> 14/07/31 15:08:00 INFO Executor: Sending result for 377 directly to driver
>> 14/07/31 15:08:00 INFO Executor: Finished task ID 377
>> 14/07/31 15:08:00 INFO Executor: Serialized size of result for 393 is 310660
>> 14/07/31 15:08:00 INFO Executor: Sending result for 393 directly to driver
>> 14/07/31 15:08:00 INFO Executor: Finished task ID 393
>> 14/07/31 15:08:01 INFO CoarseGrainedExecutorBackend: Got assigned task 403
>> 14/07/31 15:08:01 INFO Executor: Running task ID 403
>> 14/07/31 15:08:01 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:08:01 INFO BlockFetcherIterator$BasicBlockFetcherIterator: maxBytesInFlight:
50331648, targetRequestSize: 10066329
>> 14/07/31 15:08:01 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Getting 100
non-empty blocks out of 100 blocks
>> 14/07/31 15:08:01 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Started 16
remote fetches in 7 ms
>> 14/07/31 15:08:01 INFO Executor: Serialized size of result for 403 is 299667
>> 14/07/31 15:08:01 INFO Executor: Sending result for 403 directly to driver
>> 14/07/31 15:08:01 INFO Executor: Finished task ID 403
>> 14/07/31 15:08:02 INFO CoarseGrainedExecutorBackend: Got assigned task 412
>> 14/07/31 15:08:02 INFO Executor: Running task ID 412
>> 14/07/31 15:08:02 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:08:02 INFO BlockFetcherIterator$BasicBlockFetcherIterator: maxBytesInFlight:
50331648, targetRequestSize: 10066329
>> 14/07/31 15:08:02 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Getting 100
non-empty blocks out of 100 blocks
>> 14/07/31 15:08:02 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Started 16
remote fetches in 6 ms
>> 14/07/31 15:08:02 INFO Executor: Serialized size of result for 412 is 301593
>> 14/07/31 15:08:02 INFO Executor: Sending result for 412 directly to driver
>> 14/07/31 15:08:02 INFO Executor: Finished task ID 412
>> 14/07/31 15:08:04 INFO CoarseGrainedExecutorBackend: Got assigned task 437
>> 14/07/31 15:08:04 INFO Executor: Running task ID 437
>> 14/07/31 15:08:04 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:08:04 INFO BlockFetcherIterator$BasicBlockFetcherIterator: maxBytesInFlight:
50331648, targetRequestSize: 10066329
>> 14/07/31 15:08:04 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Getting 100
non-empty blocks out of 100 blocks
>> 14/07/31 15:08:04 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Started 16
remote fetches in 6 ms
>> 14/07/31 15:08:04 INFO Executor: Serialized size of result for 437 is 312543
>> 14/07/31 15:08:04 INFO Executor: Sending result for 437 directly to driver
>> 14/07/31 15:08:04 INFO Executor: Finished task ID 437
>> 14/07/31 15:08:04 INFO CoarseGrainedExecutorBackend: Got assigned task 445
>> 14/07/31 15:08:04 INFO Executor: Running task ID 445
>> 14/07/31 15:08:04 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:08:04 INFO BlockFetcherIterator$BasicBlockFetcherIterator: maxBytesInFlight:
50331648, targetRequestSize: 10066329
>> 14/07/31 15:08:04 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Getting 100
non-empty blocks out of 100 blocks
>> 14/07/31 15:08:04 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Started 16
remote fetches in 6 ms
>> 14/07/31 15:08:04 INFO Executor: Serialized size of result for 445 is 307049
>> 14/07/31 15:08:04 INFO Executor: Sending result for 445 directly to driver
>> 14/07/31 15:08:04 INFO Executor: Finished task ID 445
>> 14/07/31 15:08:06 INFO CoarseGrainedExecutorBackend: Got assigned task 467
>> 14/07/31 15:08:06 INFO Executor: Running task ID 467
>> 14/07/31 15:08:06 INFO BlockManager: Found block broadcast_0 locally
>> 14/07/31 15:08:06 INFO BlockFetcherIterator$BasicBlockFetcherIterator: maxBytesInFlight:
50331648, targetRequestSize: 10066329
>> 14/07/31 15:08:06 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Getting 100
non-empty blocks out of 100 blocks
>> 14/07/31 15:08:06 INFO BlockFetcherIterator$BasicBlockFetcherIterator: Started 16
remote fetches in 6 ms
>> 14/07/31 15:08:07 INFO Executor: Serialized size of result for 467 is 301177
>> 14/07/31 15:08:07 INFO Executor: Sending result for 467 directly to driver
>> 14/07/31 15:08:07 INFO Executor: Finished task ID 467
>> 14/07/31 15:08:18 INFO ShuffleBlockManager: Deleted all files for shuffle 1
>> 14/07/31 15:09:00 WARN BlockManagerMaster: Error sending message to BlockManagerMaster in 1 attempts
>> java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
>> 	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>> 	at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>> 	at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>> 	at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>> 	at scala.concurrent.Await$.result(package.scala:107)
>> 	at org.apache.spark.storage.BlockManagerMaster.askDriverWithReply(BlockManagerMaster.scala:237)
>> 	at org.apache.spark.storage.BlockManagerMaster.sendHeartBeat(BlockManagerMaster.scala:51)
>> 	at org.apache.spark.storage.BlockManager.org$apache$spark$storage$BlockManager$$heartBeat(BlockManager.scala:113)
>> 	at org.apache.spark.storage.BlockManager$$anonfun$initialize$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(BlockManager.scala:158)
>> 	at org.apache.spark.util.Utils$.tryOrExit(Utils.scala:790)
>> 	at org.apache.spark.storage.BlockManager$$anonfun$initialize$1.apply$mcV$sp(BlockManager.scala:158)
>> 	at akka.actor.Scheduler$$anon$9.run(Scheduler.scala:80)
>> 	at akka.actor.LightArrayRevolverScheduler$$anon$3$$anon$2.run(Scheduler.scala:241)
>> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> 	at java.lang.Thread.run(Thread.java:744)
>> 14/07/31 15:09:12 INFO ShuffleBlockManager: Deleted all files for shuffle 0
>> 14/07/31 15:09:12 INFO BlockManager: Removing RDD 6
>> 14/07/31 15:09:12 INFO BlockManager: Removing block rdd_6_88
>> 14/07/31 15:09:12 INFO MemoryStore: Block rdd_6_88 of size 1921571 dropped from memory
(free 12338538050)
>> 14/07/31 15:09:12 INFO BlockManager: Removing block rdd_6_25
>> 14/07/31 15:09:12 INFO MemoryStore: Block rdd_6_25 of size 1922882 dropped from memory
(free 12340460932)
>> 14/07/31 15:09:12 INFO BlockManager: Removing block rdd_6_10
>> 14/07/31 15:09:12 INFO MemoryStore: Block rdd_6_10 of size 1912396 dropped from memory
(free 12342373328)
>> 14/07/31 15:09:12 INFO BlockManager: Removing block rdd_6_54
>> 14/07/31 15:09:12 INFO MemoryStore: Block rdd_6_54 of size 1909775 dropped from memory
(free 12344283103)
>> 14/07/31 15:09:12 INFO BlockManager: Removing block rdd_6_80
>> 14/07/31 15:09:12 INFO MemoryStore: Block rdd_6_80 of size 1927469 dropped from memory
(free 12346210572)
>> 14/07/31 15:09:12 INFO BlockManager: Removing block rdd_6_48
>> 14/07/31 15:09:12 INFO MemoryStore: Block rdd_6_48 of size 1927469 dropped from memory
(free 12348138041)
>>
>>
>>
>>
>
>
>
