spark-user mailing list archives

From Kürşat Kurt <kur...@kursatkurt.com>
Subject Out Of Memory issue
Date Sat, 29 Oct 2016 20:51:08 GMT
Hi,

While training a NaiveBayes classifier, I am getting an OutOfMemoryError.

What is wrong with these parameters?

Here is the spark-submit command:

./spark-submit --class main.scala.Test1 --master local[*] --driver-memory 60g /home/user1/project_2.11-1.0.jar

 

PS: The OS is Ubuntu 14.04, and the machine has 64 GB RAM and a 256 GB SSD, running Spark 2.0.1.
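
Since the job runs with --master local[*], the driver JVM is the only process, so --driver-memory 60g is the entire heap shared by execution, storage, and user code. For reference, here is a minimal sketch of how the memory-related knobs could be pinned down explicitly in code; spark.driver.maxResultSize and spark.memory.fraction are standard Spark 2.0 configuration keys, but the values below are illustrative assumptions, not settings taken from the job above:

    import org.apache.spark.sql.SparkSession

    // Sketch only: the config keys are standard Spark 2.0 settings;
    // the chosen values are assumptions for illustration.
    val spark = SparkSession.builder()
      .appName("Test1")
      .master("local[*]")
      // cap on serialized results brought back to the driver by collect()
      .config("spark.driver.maxResultSize", "8g")
      // fraction of the heap shared by execution and storage memory
      .config("spark.memory.fraction", "0.6")
      .getOrCreate()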

 

16/10/29 23:32:21 INFO BlockManagerInfo: Removed broadcast_10_piece0 on 89.*************:35416 in memory (size: 4.0 MB, free: 31.7 GB)
16/10/29 23:32:21 INFO BlockManagerInfo: Removed broadcast_10_piece1 on 89.*************:35416 in memory (size: 2.4 MB, free: 31.7 GB)
16/10/29 23:33:00 INFO ExternalAppendOnlyMap: Thread 123 spilling in-memory map of 31.8 GB to disk (1 time so far)
16/10/29 23:34:42 INFO ExternalAppendOnlyMap: Thread 123 spilling in-memory map of 31.8 GB to disk (2 times so far)
16/10/29 23:36:58 INFO ExternalAppendOnlyMap: Thread 123 spilling in-memory map of 31.8 GB to disk (3 times so far)
16/10/29 23:41:27 WARN TaskMemoryManager: leak 21.2 GB memory from org.apache.spark.util.collection.ExternalAppendOnlyMap@43ab2e76
16/10/29 23:41:28 ERROR Executor: Exception in task 0.0 in stage 10.0 (TID 31)
java.lang.OutOfMemoryError: Java heap space

        at com.esotericsoftware.kryo.io.Input.readDoubles(Input.java:885)
        at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$DoubleArraySerializer.read(DefaultArraySerializers.java:222)
        at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$DoubleArraySerializer.read(DefaultArraySerializers.java:205)
        at com.esotericsoftware.kryo.Kryo.readObjectOrNull(Kryo.java:759)
        at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:132)
        at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
        at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:790)
        at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:42)
        at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:33)
        at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:790)
        at org.apache.spark.serializer.KryoDeserializationStream.readObject(KryoSerializer.scala:229)
        at org.apache.spark.serializer.DeserializationStream.readValue(Serializer.scala:159)
        at org.apache.spark.util.collection.ExternalAppendOnlyMap$DiskMapIterator.readNextItem(ExternalAppendOnlyMap.scala:515)
        at org.apache.spark.util.collection.ExternalAppendOnlyMap$DiskMapIterator.hasNext(ExternalAppendOnlyMap.scala:535)
        at scala.collection.Iterator$$anon$1.hasNext(Iterator.scala:1004)
        at org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator.org$apache$spark$util$collection$ExternalAppendOnlyMap$ExternalIterator$$readNextHashCode(ExternalAppendOnlyMap.scala:336)
        at org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator$$anonfun$next$1.apply(ExternalAppendOnlyMap.scala:409)
        at org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator$$anonfun$next$1.apply(ExternalAppendOnlyMap.scala:407)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
        at org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator.next(ExternalAppendOnlyMap.scala:407)
        at org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator.next(ExternalAppendOnlyMap.scala:302)
        at scala.collection.Iterator$class.foreach(Iterator.scala:893)
        at org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator.foreach(ExternalAppendOnlyMap.scala:302)
        at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
        at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
        at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
        at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
        at org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator.to(ExternalAppendOnlyMap.scala:302)
        at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
        at org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator.toBuffer(ExternalAppendOnlyMap.scala:302)
        at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)

16/10/29 23:41:28 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[Executor task launch worker-7,5,main]
java.lang.OutOfMemoryError: Java heap space
        [... same stack trace as in the Executor exception above ...]

Oct 29, 2016 11:25:48 PM INFO: org.apache.parquet.hadoop.codec.CodecConfig: Compression: SNAPPY
Oct 29, 2016 11:25:48 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet block size to 134217728
Oct 29, 2016 11:25:48 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet page size to 1048576
Oct 29, 2016 11:25:48 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet dictionary page size to 1048576
Oct 29, 2016 11:25:48 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Dictionary is on
Oct 29, 2016 11:25:48 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Validation is off
Oct 29, 2016 11:25:48 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Writer version is: PARQUET_1_0
Oct 29, 2016 11:25:49 PM INFO: org.apache.parquet.hadoop.InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 4,396,549
Oct 29, 2016 11:25:49 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 4,157,541B for [labels, list, element] BINARY: 142,207 values, 5,600,131B raw, 4,156,878B comp, 6 pages, encodings: [PLAIN, RLE]

16/10/29 23:41:28 WARN TaskSetManager: Lost task 0.0 in stage 10.0 (TID 31, localhost): java.lang.OutOfMemoryError: Java heap space
        [... same stack trace as in the Executor exception above ...]

 

16/10/29 23:41:28 INFO SparkContext: Invoking stop() from shutdown hook
16/10/29 23:41:28 ERROR TaskSetManager: Task 0 in stage 10.0 failed 1 times; aborting job
16/10/29 23:41:28 INFO TaskSchedulerImpl: Removed TaskSet 10.0, whose tasks have all completed, from pool
16/10/29 23:41:28 INFO TaskSchedulerImpl: Cancelling stage 10
16/10/29 23:41:28 INFO DAGScheduler: ResultStage 10 (collect at NaiveBayes.scala:400) failed in 570.233 s
16/10/29 23:41:28 INFO DAGScheduler: Job 5 failed: collect at NaiveBayes.scala:400, took 934.966523 s

Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 10.0 failed 1 times, most recent failure: Lost task 0.0 in stage 10.0 (TID 31, localhost): java.lang.OutOfMemoryError: Java heap space
        [... same stack trace as in the Executor exception above ...]

 

Driver stacktrace:

        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1454)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1442)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1441)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1441)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:811)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:811)
        at scala.Option.foreach(Option.scala:257)
        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:811)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1667)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1622)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1611)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:632)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1890)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1903)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1916)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1930)
        at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:912)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
        at org.apache.spark.rdd.RDD.collect(RDD.scala:911)
        at org.apache.spark.mllib.classification.NaiveBayes.run(NaiveBayes.scala:400)
        at org.apache.spark.mllib.classification.NaiveBayes$.train(NaiveBayes.scala:507)
        at org.apache.spark.ml.classification.NaiveBayes.train(NaiveBayes.scala:114)
        at org.apache.spark.ml.classification.NaiveBayes.train(NaiveBayes.scala:76)
        at org.apache.spark.ml.Predictor.fit(Predictor.scala:90)
        at org.apache.spark.ml.Predictor.fit(Predictor.scala:71)
        at org.apache.spark.ml.Pipeline$$anonfun$fit$2.apply(Pipeline.scala:149)
        at org.apache.spark.ml.Pipeline$$anonfun$fit$2.apply(Pipeline.scala:145)
        at scala.collection.Iterator$class.foreach(Iterator.scala:893)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
        at scala.collection.IterableViewLike$Transformed$class.foreach(IterableViewLike.scala:44)
        at scala.collection.SeqViewLike$AbstractTransformed.foreach(SeqViewLike.scala:37)
        at org.apache.spark.ml.Pipeline.fit(Pipeline.scala:145)
        at main.scala.Test1$.main(Test1.scala:172)
        at main.scala.Test1.main(Test1.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Caused by: java.lang.OutOfMemoryError: Java heap space
        [... same stack trace as in the Executor exception above ...]

16/10/29 23:41:28 INFO SparkUI: Stopped Spark web UI at http://89.*************:4040
16/10/29 23:41:28 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/10/29 23:41:28 INFO MemoryStore: MemoryStore cleared
16/10/29 23:41:28 INFO BlockManager: BlockManager stopped
16/10/29 23:41:28 INFO BlockManagerMaster: BlockManagerMaster stopped
16/10/29 23:41:28 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/10/29 23:41:28 INFO SparkContext: Successfully stopped SparkContext
16/10/29 23:41:28 INFO ShutdownHookManager: Shutdown hook called
16/10/29 23:41:28 INFO ShutdownHookManager: Deleting directory /tmp/spark-15cf14e4-f103-4cbf-aa0f-85828eadbcce
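
The driver stack trace pins the failure to the collect() inside NaiveBayes.run (NaiveBayes.scala:400), reached through Pipeline.fit from Test1.scala:172; the heap is exhausted while a task reads Kryo-serialized spill data back from the ExternalAppendOnlyMap for that collect. For orientation only, here is a hypothetical minimal reconstruction of the call path; the feature stages and column names are assumptions, and only the Pipeline.fit -> NaiveBayes.train chain is confirmed by the trace:

    import org.apache.spark.ml.{Pipeline, PipelineModel}
    import org.apache.spark.ml.classification.NaiveBayes
    import org.apache.spark.ml.feature.{HashingTF, Tokenizer}
    import org.apache.spark.sql.DataFrame

    // Hypothetical reconstruction of Test1.scala around line 172.
    // Only Pipeline.fit -> NaiveBayes.train is confirmed by the stack trace;
    // the tokenizer/hashing stages and column names are assumptions.
    def train(training: DataFrame): PipelineModel = {
      val tokenizer = new Tokenizer().setInputCol("text").setOutputCol("words")
      val hashingTF = new HashingTF().setInputCol("words").setOutputCol("features")
      val nb = new NaiveBayes().setLabelCol("label").setFeaturesCol("features")
      val pipeline = new Pipeline().setStages(Array(tokenizer, hashingTF, nb))
      pipeline.fit(training) // the call that triggers the OOM in the log above
    }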

