spark-user mailing list archives

From Chan Chor Pang <chin...@indetail.co.jp>
Subject Re: java.lang.OutOfMemoryError: unable to create new native thread
Date Mon, 31 Oct 2016 00:22:11 GMT
You may want to check the process limit of the user who is responsible
for starting the JVM:
/etc/security/limits.d/90-nproc.conf
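On RHEL/CentOS 6 that file typically caps non-root users at 1024 processes,
and every native thread counts against that limit, so a busy driver can hit
it long before the JVM's own ~32K ceiling. A rough sketch of checking and
raising it (the "spark" user name and the 32768 value are only placeholders,
pick whatever fits your cluster):

    # check the effective limit as the user that starts the driver JVM
    ulimit -u

    # /etc/security/limits.d/90-nproc.conf  -- example entries only
    spark    soft    nproc    32768
    spark    hard    nproc    32768

    # log out and back in (or restart the service) so the new limit applies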


On 10/29/16 4:47 AM, kant kodali wrote:
>  "dag-scheduler-event-loop" java.lang.OutOfMemoryError: unable to 
> create new native thread
>         at java.lang.Thread.start0(Native Method)
>         at java.lang.Thread.start(Thread.java:714)
>         at 
> scala.concurrent.forkjoin.ForkJoinPool.tryAddWorker(ForkJoinPool.java:1672)
>         at 
> scala.concurrent.forkjoin.ForkJoinPool.signalWork(ForkJoinPool.java:1966)
>         at 
> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.push(ForkJoinPool.java:1072)
>         at 
> scala.concurrent.forkjoin.ForkJoinTask.fork(ForkJoinTask.java:654)
>         at scala.collection.parallel.ForkJoinTasks$WrappedTask$
>
> This is the error produced by the Spark driver program, which runs in
> client mode by default. Some people say to just increase the heap size
> by passing the --driver-memory 3g flag, but the message "unable to
> create new native thread" really means that the JVM asked the OS to
> create a new thread and the OS could not allocate one anymore. The
> number of threads a JVM can create by requesting the OS is platform
> dependent, but it is typically around 32K threads on a 64-bit JVM. So I
> am wondering why Spark is even creating so many threads and how I can
> control this number?
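
If you want to confirm that it is the per-user OS limit rather than the
driver heap that you are hitting, you can run a tiny probe as the same user,
outside of Spark. Just a sketch (each idle thread still reserves a native
stack, so run it on a test box):

    // ThreadLimitProbe.java -- keep starting idle daemon threads until the
    // OS refuses, to see roughly how many native threads this user can get.
    public class ThreadLimitProbe {
        public static void main(String[] args) {
            int count = 0;
            try {
                while (true) {
                    Thread t = new Thread(() -> {
                        try { Thread.sleep(Long.MAX_VALUE); }
                        catch (InterruptedException ignored) { }
                    });
                    t.setDaemon(true);
                    t.start();
                    count++;
                }
            } catch (OutOfMemoryError e) {
                System.out.println("OS refused after " + count
                        + " threads: " + e.getMessage());
            }
        }
    }

If the probe dies well below ~32K, raising --driver-memory will not help;
the per-user nproc limit (or available native memory for thread stacks) is
the bottleneck.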

