spark-user mailing list archives

From Chan Chor Pang <>
Subject Re: java.lang.OutOfMemoryError: unable to create new native thread
Date Mon, 31 Oct 2016 00:22:11 GMT
You may want to check the process limit of the user who is responsible for
starting the JVM.
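This can be checked from the shell. On Linux, threads count against the per-user process limit (`nproc`, reported by `ulimit -u`), so a low limit for the user that launches the driver can trigger this error long before memory runs out. A minimal check, assuming a Linux host:

```shell
# Maximum processes (and thus native threads) the current user may create
ulimit -u

# System-wide ceiling on threads across all processes
cat /proc/sys/kernel/threads-max
```

To raise the per-user limit persistently, the usual place is /etc/security/limits.conf (a `nproc` soft/hard pair for the user running the driver), though the exact mechanism varies by distribution.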

On 10/29/16 4:47 AM, kant kodali wrote:
>  "dag-scheduler-event-loop" java.lang.OutOfMemoryError: unable to 
> create new native thread
>         at java.lang.Thread.start0(Native Method)
>         at java.lang.Thread.start(
>         at 
> scala.concurrent.forkjoin.ForkJoinPool.tryAddWorker(
>         at 
> scala.concurrent.forkjoin.ForkJoinPool.signalWork(
>         at 
> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.push(
>         at 
> scala.concurrent.forkjoin.ForkJoinTask.fork(
>         at scala.collection.parallel.ForkJoinTasks$WrappedTask$
> This is the error produced by the Spark driver program, which runs in 
> client mode by default. Some people say to just increase the heap size 
> by passing the --driver-memory 3g flag, but the message "unable to 
> create new native thread" really means that the JVM asked the OS to 
> create a new thread and the OS could not allocate one. The number of 
> threads a JVM can create by requesting the OS is platform dependent, 
> but it is typically around 32K threads on a 64-bit JVM. So I am 
> wondering why Spark is even creating so many threads, and how do I 
> control this number?
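Before tuning anything, it may help to confirm that the driver really is accumulating threads rather than running out of memory. On Linux, every JVM thread is a native thread visible under /proc/&lt;pid&gt;/task, so you can watch the count for the driver's PID grow. A sketch, using the current shell's PID as a stand-in for the driver JVM (an assumption; substitute the real PID in practice):

```shell
# Substitute the Spark driver JVM's PID for $$ when diagnosing for real
pid=$$

# Each entry under /proc/<pid>/task is one native thread of that process
ls /proc/"$pid"/task | wc -l
```

Sampling this in a loop while the job runs shows whether the thread count climbs toward the per-user limit, which distinguishes this failure from an ordinary heap exhaustion.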
