spark-user mailing list archives

From "Kevin (Sangwoo) Kim" <kevin...@apache.org>
Subject Re: Futures timed out during unpersist
Date Sat, 17 Jan 2015 13:04:28 GMT
The data size is about 300~400 GB; I'm using an 800 GB cluster and have set
the driver memory to 50 GB.

On Sat Jan 17 2015 at 6:01:46 PM Akhil Das <akhil@sigmoidanalytics.com>
wrote:

> What is the data size? Have you tried increasing the driver memory?
>
> Thanks
> Best Regards
>
> On Sat, Jan 17, 2015 at 1:01 PM, Kevin (Sangwoo) Kim <kevinkim@apache.org>
> wrote:
>
>> Hi experts,
>> I got an error while unpersisting an RDD.
>> Any ideas?
>>
>> java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
>>   at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>>   at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>>   at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>>   at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>>   at scala.concurrent.Await$.result(package.scala:107)
>>   at org.apache.spark.storage.BlockManagerMaster.removeRdd(BlockManagerMaster.scala:103)
>>   at org.apache.spark.SparkContext.unpersistRDD(SparkContext.scala:951)
>>   at org.apache.spark.rdd.RDD.unpersist(RDD.scala:168)
>>
>>
>
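[Editor's note: the 30-second limit in the trace matches the default ask timeout that BlockManagerMaster used in Spark 1.x. A sketch of two common mitigations, assuming a Spark 1.x API; the input path and RDD name are illustrative, not from the thread:]

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Raise the ask timeout so blocking calls like removeRdd get more
// than the 30-second default (spark.akka.askTimeout, in seconds,
// is the relevant setting on Spark 1.x).
val conf = new SparkConf()
  .setAppName("unpersist-timeout-workaround")
  .set("spark.akka.askTimeout", "120")

val sc = new SparkContext(conf)

// Hypothetical cached dataset, standing in for the 300~400 GB input.
val cached = sc.textFile("hdfs:///path/to/large/input").cache()
cached.count() // materialize the cache

// Alternatively, skip the wait entirely: a non-blocking unpersist
// returns immediately instead of awaiting the BlockManagerMaster reply,
// so it cannot hit this timeout.
cached.unpersist(blocking = false)
```

The non-blocking form trades certainty for latency: blocks are still removed, but the call no longer confirms completion before returning.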
