I had the same exception before, and the problem was fixed after I changed the nproc configuration.

> max user processes              (-u) 120242
↑ This setting looks fine.
Are you sure the user who runs ulimit -a is the same user who runs the Java process?
Depending on how you submit the job and on your settings, the Spark job may be executed by a different user.
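One way to check this (a sketch; the process name "java" is an assumption, yours may differ) is to compare your shell's user with the owner of the JVM's PID:

```shell
# The user running this shell
id -un
# Newest JVM process (assumes it is named "java"); falls back to this
# shell's own PID if no JVM is currently running
pid=$(pgrep -n java || echo $$)
# The user that owns that process
ps -o user= -p "$pid"
```

If the two users differ, the ulimit output above may not apply to the process that is actually failing.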

On 10/31/16 10:38 AM, kant kodali wrote:
When I ran this:

cat /proc/sys/kernel/pid_max 

I got 32768
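A side note (my understanding, worth double-checking): on Linux every thread consumes a task ID, and kernel.pid_max caps the total number of task IDs system-wide. So with pid_max = 32768, the whole machine can run at most ~32k threads regardless of the per-user limit of 120242. A sketch for reading and raising it (the value 4194303 is just an illustrative choice):

```shell
# Read the current system-wide task/thread ID limit
cat /proc/sys/kernel/pid_max
# Raising it requires root, e.g.:
# sudo sysctl -w kernel.pid_max=4194303
```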

On Sun, Oct 30, 2016 at 6:36 PM, kant kodali <kanth909@gmail.com> wrote:
I believe for Ubuntu it is unlimited, but I am not 100% sure (I just read that somewhere online). I ran ulimit -a and this is what I get:

core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 120242
max locked memory       (kbytes, -l) 64
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 120242
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited

On Sun, Oct 30, 2016 at 6:15 PM, Chan Chor Pang <chin-sh@indetail.co.jp> wrote:

Not sure about Ubuntu, but I think you can just create the file yourself;
the syntax is the same as in /etc/security/limits.conf.
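For example (a sketch; the filename and the user name "spark" are assumptions, and the value is only illustrative):

```
# /etc/security/limits.d/90-nproc.conf  (hypothetical name; any *.conf here works)
# <domain>  <type>  <item>  <value>
spark       soft    nproc   65535
spark       hard    nproc   65535
```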

nproc.conf limits not only the Java process but every process owned by the same user.

So even if the JVM process itself does nothing, if the corresponding user is busy in some other way,
the JVM will still be unable to create new threads.
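A quick way to see how close a user is to that limit (a sketch, assuming Linux procps ps; -L counts threads, which is what nproc actually limits):

```shell
# Per-user thread count vs. the nproc limit
limit=$(ulimit -u)
# -L lists every thread (LWP), not just processes
used=$(ps -u "$(id -un)" -L --no-headers | wc -l)
echo "threads in use: $used (limit: $limit)"
```

If "used" is anywhere near "limit", any process owned by that user, including an otherwise idle JVM, will fail to spawn new threads.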

By the way, the default limit on CentOS is 1024.

On 10/31/16 9:51 AM, kant kodali wrote:

On Sun, Oct 30, 2016 at 5:22 PM, Chan Chor Pang <chin-sh@indetail.co.jp> wrote:


I am using Ubuntu 16.04 LTS. I have the directory /etc/security/limits.d/, but I don't have any files underneath it. This error happens after running for 4 to 5 hours, so I wonder if this is a GC issue, and I am thinking about switching to CMS. I have also posted this on SO, since I haven't gotten much response to the question: http://stackoverflow.com/questions/40315589/dag-scheduler-event-loop-java-lang-outofmemoryerror-unable-to-create-new-native


Chan Chor Pang (陳 楚鵬)
E-mail: chin-sh@indetail.co.jp
URL: http://www.indetail.co.jp

TEL: 011-206-9235  FAX: 011-206-9236

Cross Office Mita, 5-29-20 Shiba, Minato-ku, Tokyo
TEL: 03-6809-6502  FAX: 03-6809-6504

NAYUTA BLD, 3-17-24 Marunouchi, Naka-ku, Nagoya, Aichi