spark-user mailing list archives

From "Sea" <261810...@qq.com>
Subject Re: mapwithstate Hangs with Error cleaning broadcast
Date Tue, 15 Mar 2016 17:36:59 GMT
Hi, Manas:
     Maybe you can look at this bug: https://issues.apache.org/jira/browse/SPARK-13566
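
Until that fix lands, one possible workaround is to raise the timeout named in the error, or to make broadcast cleanup non-blocking so the cleaner thread does not wait on each removal. A minimal sketch; both keys are real Spark settings, but the values are only examples, not tested recommendations:

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      // Default is 120s, the timeout appearing in the stack trace below.
      .set("spark.rpc.askTimeout", "600s")
      // Default is true in 1.6, so the ContextCleaner blocks on every
      // broadcast removal; false makes removals fire-and-forget.
      // Worth verifying the trade-offs for your deployment.
      .set("spark.cleaner.referenceTracking.blocking", "false")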

------------------ Original Message ------------------
From: "manas kar" <poorinspark@gmail.com>
Sent: Tuesday, March 15, 2016, 10:48 PM
To: "Ted Yu" <yuzhihong@gmail.com>
Cc: "user" <user@spark.apache.org>
Subject: Re: mapwithstate Hangs with Error cleaning broadcast

I am using Spark 1.6. I am not using any broadcast variable.
This broadcast variable is probably used by the state management of mapWithState; a sketch of the job's shape follows.
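
As far as I understand, Spark also broadcasts each stage's serialized task closure internally, so a long-running streaming job creates and cleans many broadcasts even when user code declares none. Roughly the shape of the job, with no broadcast variable anywhere (a sketch only, with a socket source standing in for our Kafka consumer and a hypothetical checkpoint path):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, State, StateSpec, StreamingContext}

    object StatefulCounts {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("mapwithstate-sketch")
        val ssc = new StreamingContext(conf, Seconds(10))
        // mapWithState requires checkpointing; the path is illustrative.
        ssc.checkpoint("/tmp/mapwithstate-checkpoint")

        // Running count per key; no user-level broadcast is involved.
        val updateCount = (key: String, value: Option[Int], state: State[Int]) => {
          val sum = value.getOrElse(0) + state.getOption.getOrElse(0)
          state.update(sum)
          (key, sum)
        }

        // Socket source stands in for the Kafka topic here.
        val pairs = ssc.socketTextStream("localhost", 9999)
          .flatMap(_.split(" "))
          .map((_, 1))

        pairs.mapWithState(StateSpec.function(updateCount)).print()

        ssc.start()
        ssc.awaitTermination()
      }
    }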

...Manas

On Tue, Mar 15, 2016 at 10:40 AM, Ted Yu <yuzhihong@gmail.com> wrote:
Which version of Spark are you using?

Can you show the code snippet w.r.t. the broadcast variable?

Thanks

On Tue, Mar 15, 2016 at 6:04 AM, manasdebashiskar <poorinspark@gmail.com> wrote:
Hi,
  I have a streaming application that takes data from a Kafka topic and uses
 mapWithState.
  After a couple of hours of smooth running, I see a problem that seems to
 have stalled my application. The batch seems to be stuck after the following
 error popped up. Has anyone seen this error, or does anyone know what causes
 it?
 14/03/2016 21:41:13,295 ERROR org.apache.spark.ContextCleaner: 95 - Error cleaning broadcast 7456
 org.apache.spark.rpc.RpcTimeoutException: Futures timed out after [120 seconds]. This timeout is controlled by spark.rpc.askTimeout
         at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcTimeout.scala:48)
         at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:63)
         at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
         at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33)
         at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:76)
         at org.apache.spark.storage.BlockManagerMaster.removeBroadcast(BlockManagerMaster.scala:136)
         at org.apache.spark.broadcast.TorrentBroadcast$.unpersist(TorrentBroadcast.scala:228)
         at org.apache.spark.broadcast.TorrentBroadcastFactory.unbroadcast(TorrentBroadcastFactory.scala:45)
         at org.apache.spark.broadcast.BroadcastManager.unbroadcast(BroadcastManager.scala:67)
         at org.apache.spark.ContextCleaner.doCleanupBroadcast(ContextCleaner.scala:233)
         at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1$$anonfun$apply$mcV$sp$2.apply(ContextCleaner.scala:189)
         at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1$$anonfun$apply$mcV$sp$2.apply(ContextCleaner.scala:180)
         at scala.Option.foreach(Option.scala:236)
         at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply$mcV$sp(ContextCleaner.scala:180)
         at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1180)
         at org.apache.spark.ContextCleaner.org$apache$spark$ContextCleaner$$keepCleaning(ContextCleaner.scala:173)
         at org.apache.spark.ContextCleaner$$anon$3.run(ContextCleaner.scala:68)
 Caused by: java.util.concurrent.TimeoutException: Futures timed out after [120 seconds]
         at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
         at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
         at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
         at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
         at scala.concurrent.Await$.result(package.scala:107)
         at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
         ... 12 more
 
 --
 View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/mapwithstate-Hangs-with-Error-cleaning-broadcast-tp26500.html
 Sent from the Apache Spark User List mailing list archive at Nabble.com.
 
 ---------------------------------------------------------------------
 To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
 For additional commands, e-mail: user-help@spark.apache.org