spark-issues mailing list archives

From "Xiangrui Meng (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-5516) ActorSystemImpl: Uncaught fatal error from thread [sparkDriver-akka.actor.default-dispatcher-22] shutting down ActorSystem [sparkDriver] java.lang.OutOfMemoryError: Java heap space
Date Fri, 20 Feb 2015 23:03:12 GMT

    [ https://issues.apache.org/jira/browse/SPARK-5516?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14329718#comment-14329718 ]

Xiangrui Meng commented on SPARK-5516:
--------------------------------------

[~wuyukai] Could you provide all the parameters you used? The most important ones are the
number of features, maxDepth, and maxBins. Please also remember to set `--driver-memory`
to a large value when running spark-submit.
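
For example, a typical invocation might look like this (the class name, master URL, jar, and memory sizes are placeholders, not values taken from this report):

    ./bin/spark-submit \
      --class com.example.GBTTraining \
      --master spark://master:7077 \
      --driver-memory 8g \
      --executor-memory 8g \
      gbt-training-assembly.jar

Note that in client mode the driver JVM is already running by the time application code builds its SparkConf, so the driver heap has to be raised via `--driver-memory` (or spark-defaults.conf) rather than from within the program itself.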

> ActorSystemImpl: Uncaught fatal error from thread [sparkDriver-akka.actor.default-dispatcher-22] shutting down ActorSystem [sparkDriver] java.lang.OutOfMemoryError: Java heap space
> ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-5516
>                 URL: https://issues.apache.org/jira/browse/SPARK-5516
>             Project: Spark
>          Issue Type: Bug
>          Components: MLlib
>    Affects Versions: 1.2.0
>         Environment: centos 6.5   
>            Reporter: wuyukai
>
> When we ran the Gradient Boosted Trees model, it threw the exception below. The data we used is only about 45 MB. We ran it on 4 machines, each with 4 cores and 16 GB RAM, and set the parameter "gradientboostedtrees.maxiteration" to 50.
> 15/02/01 01:39:48 INFO DAGScheduler: Job 965 failed: collectAsMap at DecisionTree.scala:653, took 1.616976 s
> Exception in thread "main" org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
> 	at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:702)
> 	at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:701)
> 	at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
> 	at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:701)
> 	at org.apache.spark.scheduler.DAGSchedulerEventProcessActor.postStop(DAGScheduler.scala:1428)
> 	at akka.actor.Actor$class.aroundPostStop(Actor.scala:475)
> 	at org.apache.spark.scheduler.DAGSchedulerEventProcessActor.aroundPostStop(DAGScheduler.scala:1375)
> 	at akka.actor.dungeon.FaultHandling$class.akka$actor$dungeon$FaultHandling$$finishTerminate(FaultHandling.scala:210)
> 	at akka.actor.dungeon.FaultHandling$class.terminate(FaultHandling.scala:172)
> 	at akka.actor.ActorCell.terminate(ActorCell.scala:369)
> 	at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:462)
> 	at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478)
> 	at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263)
> 	at akka.dispatch.Mailbox.run(Mailbox.scala:219)
> 	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
> 	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
> 	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
> 	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
> 	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> 15/02/01 01:39:48 ERROR ActorSystemImpl: Uncaught fatal error from thread [sparkDriver-akka.actor.default-dispatcher-22] shutting down ActorSystem [sparkDriver]
> java.lang.OutOfMemoryError: Java heap space
> 	at java.util.Arrays.copyOf(Arrays.java:2271)
> 	at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:113)
> 	at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93)
> 	at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:140)
> 	at java.io.ObjectOutputStream$BlockDataOutputStream.drain(ObjectOutputStream.java:1876)
> 	at java.io.ObjectOutputStream$BlockDataOutputStream.setBlockDataMode(ObjectOutputStream.java:1785)
> 	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1188)
> 	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
> 	at scala.collection.immutable.$colon$colon.writeObject(List.scala:379)
> 	at sun.reflect.GeneratedMethodAccessor11.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:988)
> 	at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1495)
> 	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1431)
> 	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1177)
> 	at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1547)
> 	at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1508)
> 	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1431)
> 	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1177)
> 	at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1547)
> 	at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1508)
> 	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1431)
> 	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1177)
> 	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
> 	at scala.collection.immutable.$colon$colon.writeObject(List.scala:379)
> 	at sun.reflect.GeneratedMethodAccessor11.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
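
For reference, here is a minimal sketch (not the reporter's actual code) of where the parameters discussed above live in the MLlib 1.2 API: numIterations on BoostingStrategy, and maxDepth / maxBins on its treeStrategy. Deeper trees and more bins increase the per-node split statistics that DecisionTree aggregates back to the driver (the collectAsMap seen in the log above), which is presumably why they matter for driver heap usage here. The input path and parameter values are illustrative only.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.mllib.tree.GradientBoostedTrees
    import org.apache.spark.mllib.tree.configuration.BoostingStrategy
    import org.apache.spark.mllib.util.MLUtils

    object GBTExample {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("GBTExample"))

        // Placeholder input path; the reporter's ~45 MB data set is not available here.
        val data = MLUtils.loadLibSVMFile(sc, "data/sample_libsvm_data.txt")

        // Start from the default regression parameters, then override the ones under discussion.
        val boostingStrategy = BoostingStrategy.defaultParams("Regression")
        boostingStrategy.numIterations = 50        // corresponds to the reporter's "maxiteration" of 50
        boostingStrategy.treeStrategy.maxDepth = 5 // deeper trees mean more nodes and larger driver-side stats
        boostingStrategy.treeStrategy.maxBins = 32 // more bins mean larger per-node statistics arrays

        val model = GradientBoostedTrees.train(data, boostingStrategy)
        println(s"Learned a GBT model with ${model.trees.size} trees")
        sc.stop()
      }
    }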



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

