spark-user mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: Master dies after program finishes normally
Date Mon, 29 Jun 2015 06:40:45 GMT
Which version of Spark are you using? You can try increasing the heap size
manually with: export _JAVA_OPTIONS="-Xmx5g"
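
For a standalone cluster specifically, a hedged sketch of raising the master daemon's heap (assumption: standalone deploy mode, where SPARK_DAEMON_MEMORY in conf/spark-env.sh is the documented knob for the master/worker daemon heaps):

```shell
# Sketch, assuming Spark standalone mode: SPARK_DAEMON_MEMORY sets the
# heap size for the master and worker daemons (put this in
# conf/spark-env.sh before restarting the cluster).
export SPARK_DAEMON_MEMORY=5g

# Alternatively, _JAVA_OPTIONS applies to every JVM launched from this
# shell, daemons included.
export _JAVA_OPTIONS="-Xmx5g"
```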

Thanks
Best Regards

On Fri, Jun 26, 2015 at 7:52 PM, Yifan LI <iamyifanli@gmail.com> wrote:

> Hi,
>
> I just encountered the same problem, when I run a PageRank program which
> has lots of stages(iterations)…
>
> The master was lost after my program done.
>
> And, the issue still remains even I increased driver memory.
>
>
> Have any idea? e.g. how to increase the master memory?
> Thanks.
>
>
>
> Best,
> Yifan LI
>
>
>
>
>
> On 12 Feb 2015, at 20:05, Arush Kharbanda <arush@sigmoidanalytics.com>
> wrote:
>
> What is your cluster configuration? Did you try looking at the Web UI?
> There are many tips here:
>
> http://spark.apache.org/docs/1.2.0/tuning.html
>
> Did you try these?
>
> On Fri, Feb 13, 2015 at 12:09 AM, Manas Kar <manasdebashiskar@gmail.com>
> wrote:
>
>> Hi,
>>  I have a Hidden Markov Model running with 200 MB of data.
>>  Once the program finishes (i.e. all stages/jobs are done), it hangs
>> for 20 minutes or so before killing the master.
>>
>> In the Spark master, the following log appears.
>>
>> 2015-02-12 13:00:05,035 ERROR akka.actor.ActorSystemImpl: Uncaught fatal
>> error from thread [sparkMaster-akka.actor.default-dispatcher-31] shutting
>> down ActorSystem [sparkMaster]
>> java.lang.OutOfMemoryError: GC overhead limit exceeded
>>         at scala.collection.immutable.List$.newBuilder(List.scala:396)
>>         at
>> scala.collection.generic.GenericTraversableTemplate$class.genericBuilder(GenericTraversableTemplate.scala:69)
>>         at
>> scala.collection.AbstractTraversable.genericBuilder(Traversable.scala:105)
>>         at
>> scala.collection.generic.GenTraversableFactory$GenericCanBuildFrom.apply(GenTraversableFactory.scala:58)
>>         at
>> scala.collection.generic.GenTraversableFactory$GenericCanBuildFrom.apply(GenTraversableFactory.scala:53)
>>         at
>> scala.collection.TraversableLike$class.builder$1(TraversableLike.scala:239)
>>         at
>> scala.collection.TraversableLike$class.map(TraversableLike.scala:243)
>>         at scala.collection.AbstractTraversable.map(Traversable.scala:105)
>>         at
>> org.json4s.MonadicJValue$$anonfun$org$json4s$MonadicJValue$$findDirectByName$1.apply(MonadicJValue.scala:26)
>>         at
>> org.json4s.MonadicJValue$$anonfun$org$json4s$MonadicJValue$$findDirectByName$1.apply(MonadicJValue.scala:22)
>>         at
>> scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
>>         at
>> scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
>>         at scala.collection.immutable.List.foreach(List.scala:318)
>>         at
>> scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:251)
>>         at
>> scala.collection.AbstractTraversable.flatMap(Traversable.scala:105)
>>         at
>> org.json4s.MonadicJValue.org$json4s$MonadicJValue$$findDirectByName(MonadicJValue.scala:22)
>>         at org.json4s.MonadicJValue.$bslash(MonadicJValue.scala:16)
>>         at
>> org.apache.spark.util.JsonProtocol$.taskStartFromJson(JsonProtocol.scala:450)
>>         at
>> org.apache.spark.util.JsonProtocol$.sparkEventFromJson(JsonProtocol.scala:423)
>>         at
>> org.apache.spark.scheduler.ReplayListenerBus$$anonfun$replay$2$$anonfun$apply$1.apply(ReplayListenerBus.scala:71)
>>         at
>> org.apache.spark.scheduler.ReplayListenerBus$$anonfun$replay$2$$anonfun$apply$1.apply(ReplayListenerBus.scala:69)
>>         at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>>         at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
>>         at
>> org.apache.spark.scheduler.ReplayListenerBus$$anonfun$replay$2.apply(ReplayListenerBus.scala:69)
>>         at
>> org.apache.spark.scheduler.ReplayListenerBus$$anonfun$replay$2.apply(ReplayListenerBus.scala:55)
>>         at
>> scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
>>         at
>> scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:34)
>>         at
>> org.apache.spark.scheduler.ReplayListenerBus.replay(ReplayListenerBus.scala:55)
>>         at
>> org.apache.spark.deploy.master.Master.rebuildSparkUI(Master.scala:726)
>>         at
>> org.apache.spark.deploy.master.Master.removeApplication(Master.scala:675)
>>         at
>> org.apache.spark.deploy.master.Master.finishApplication(Master.scala:653)
>>         at
>> org.apache.spark.deploy.master.Master$$anonfun$receiveWithLogging$1$$anonfun$applyOrElse$29.apply(Master.scala:399)
>>
>> Can anyone help?
>>
>> ..Manas
>>
>
>
>
> --
>
>
> Arush Kharbanda || Technical Teamlead
>
> arush@sigmoidanalytics.com || www.sigmoidanalytics.com
>
>
>
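
The trace above dies in Master.rebuildSparkUI, which replays the finished application's event log to rebuild its web UI; on large jobs with many stages that replay can exhaust the master's heap. A hedged workaround sketch, assuming a standalone master and that you can forgo the post-mortem application UI:

```shell
# Sketch: disable event logging so the standalone master never has to
# replay a large event log when an application finishes.
# Assumption: SPARK_HOME points at the Spark install; a temp dir is
# used as a stand-in here so the snippet runs anywhere.
CONF_DIR="${SPARK_HOME:-/tmp/spark-demo}/conf"
mkdir -p "$CONF_DIR"
echo "spark.eventLog.enabled  false" >> "$CONF_DIR/spark-defaults.conf"
```

The trade-off is losing the rebuilt UI for completed applications; raising the master daemon's heap is the alternative if the replay itself must stay.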
