spark-user mailing list archives

From 吴晓菊 <chrysan...@gmail.com>
Subject Unable to acquire N bytes of memory, got 0
Date Sun, 01 Jul 2018 12:26:56 GMT
Is it normal to get an exception like: "Previous exception in task: Unable to
acquire 65536 bytes of memory, got 0"?

In my understanding, under the current memory management, running out of
memory should simply trigger a spill, so this kind of exception should not be
thrown, unless some operators do not implement spilling and too many objects
remain in memory.

Please correct me if I'm wrong.
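
To make my mental model concrete, here is a minimal sketch in Scala of the
spill-before-OOM behaviour I am assuming. The names (SimpleExecutionMemoryPool,
Consumer, acquire) are made up for illustration and are not Spark's real
TaskMemoryManager/MemoryConsumer API: when a consumer requests execution memory
and the pool is short, the other consumers are asked to spill first, and an OOM
is raised only if spilling still does not free enough.

// Hypothetical sketch, not Spark's actual code: illustrates the
// "ask others to spill before throwing OOM" behaviour described above.
trait Consumer {
  // Release up to `required` bytes by spilling to disk; returns bytes freed.
  def spill(required: Long): Long
}

class SimpleExecutionMemoryPool(private var free: Long) {
  private val consumers = scala.collection.mutable.Set.empty[Consumer]

  def register(c: Consumer): Unit = consumers += c

  // Try to reserve `required` bytes for `requester`.
  def acquire(required: Long, requester: Consumer): Long = {
    if (free < required) {
      // Not enough free memory: ask the other consumers to spill first.
      for (c <- consumers if c != requester && free < required) {
        free += c.spill(required - free)
      }
    }
    if (free >= required) {
      free -= required
      required
    } else {
      // Only after spilling failed to free enough memory do we give up.
      throw new OutOfMemoryError(
        s"Unable to acquire $required bytes of memory, got $free")
    }
  }
}

Again, this is only the behaviour I assume; the real classes involved appear
in the trace below (TaskMemoryManager, MemoryConsumer, UnsafeExternalSorter).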

Here is the stack trace:
Previous exception in task: Unable to acquire 65536 bytes of memory, got 0
  org.apache.spark.memory.MemoryConsumer.throwOom(MemoryConsumer.java:157)
  org.apache.spark.memory.MemoryConsumer.allocateArray(MemoryConsumer.java:98)
  org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorter.reset(UnsafeInMemorySorter.java:186)
  org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.spill(UnsafeExternalSorter.java:229)
  org.apache.spark.memory.TaskMemoryManager.acquireExecutionMemory(TaskMemoryManager.java:204)
  org.apache.spark.memory.TaskMemoryManager.allocatePage(TaskMemoryManager.java:283)
  org.apache.spark.memory.MemoryConsumer.allocateArray(MemoryConsumer.java:96)
  org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.growPointerArrayIfNecessary(UnsafeExternalSorter.java:348)
  org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.insertRecord(UnsafeExternalSorter.java:403)
  org.apache.spark.sql.execution.UnsafeExternalRowSorter.insertRow(UnsafeExternalRowSorter.java:135)
  org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage6.sort_addToSorter$(generated.java:32)
  org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage6.processNext(generated.java:41)
  org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
  org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$10$$anon$1.hasNext(WholeStageCodegenExec.scala:614)
  org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage10.findNextInnerJoinRows$(generated.java:407)
  org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage10.agg_doAggregateWithKeys$(generated.java:199)
  org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage10.processNext(generated.java:695)
  org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
  org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$12$$anon$2.hasNext(WholeStageCodegenExec.scala:633)
  scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
  org.apache.spark.shuffle.sort.UnsafeShuffleWriter.write(UnsafeShuffleWriter.java:187)
  org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
  org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
  org.apache.spark.scheduler.Task.run(Task.scala:109)
  org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
  java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
  java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
  java.lang.Thread.run(Thread.java:745)
  at org.apache.spark.TaskContextImpl.invokeListeners(TaskContextImpl.scala:139)
  at org.apache.spark.TaskContextImpl.markTaskCompleted(TaskContextImpl.scala:117)
  at org.apache.spark.scheduler.Task.run(Task.scala:119)
  at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
  at java.lang.Thread.run(Thread.java:745)


Chrysan Wu
Phone:+86 17717640807
