spark-user mailing list archives

From kant kodali <kanth...@gmail.com>
Subject java.lang.OutOfMemoryError: unable to create new native thread
Date Fri, 28 Oct 2016 19:47:20 GMT
 "dag-scheduler-event-loop" java.lang.OutOfMemoryError: unable to create
new native thread
        at java.lang.Thread.start0(Native Method)
        at java.lang.Thread.start(Thread.java:714)
        at scala.concurrent.forkjoin.ForkJoinPool.tryAddWorker(
ForkJoinPool.java:1672)
        at scala.concurrent.forkjoin.ForkJoinPool.signalWork(
ForkJoinPool.java:1966)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.push(
ForkJoinPool.java:1072)
        at scala.concurrent.forkjoin.ForkJoinTask.fork(
ForkJoinTask.java:654)
        at scala.collection.parallel.ForkJoinTasks$WrappedTask$

This error is produced by the Spark driver program, which runs in client mode by default, so some people suggest simply increasing the heap size by passing the --driver-memory 3g flag. However, the message "unable to create new native thread" really means that the JVM asked the OS to create a new thread and the OS could not allocate one. The number of threads a JVM can create by requesting them from the OS is platform dependent, but it is typically around 32K threads on a 64-bit JVM. So I am wondering why Spark is even creating so many threads, and how do I control this number?
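One way to see how many threads the driver JVM is actually holding, and whether the count keeps climbing toward the OS limit, is the standard java.lang.management ThreadMXBean API. This is only a generic diagnostic sketch (the class name ThreadCount is made up, nothing here is Spark-specific); it could be dropped into the driver or run via a debugger/attach tool:

```java
import java.lang.management.ManagementFactory;

public class ThreadCount {
    // Number of live threads (daemon and non-daemon) in the current JVM.
    static int liveThreads() {
        return ManagementFactory.getThreadMXBean().getThreadCount();
    }

    // Cumulative number of threads started since the JVM came up;
    // a value growing much faster than liveThreads() suggests churn
    // (threads created and discarded) rather than a steady leak.
    static long totalStarted() {
        return ManagementFactory.getThreadMXBean().getTotalStartedThreadCount();
    }

    public static void main(String[] args) {
        System.out.println("live threads:   " + liveThreads());
        System.out.println("started so far: " + totalStarted());
    }
}
```

On Linux the live count can be compared against `ulimit -u` (the per-user process/thread cap), since "unable to create new native thread" is usually an OS-level limit rather than a heap problem, which is why --driver-memory alone often does not help.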
