spark-user mailing list archives

From Mohamed Nadjib MAMI <m...@iai.uni-bonn.de>
Subject Too many open files, why is changing ulimit not taking effect?
Date Fri, 05 Feb 2016 09:42:34 GMT
Hello all,

I'm getting the famous java.io.FileNotFoundException: ... (Too many
open files) exception. What seems to have helped other people has not
helped me. I tried to raise the limit via the command line with
"ulimit -n", then I tried to add the following lines to the
"/etc/security/limits.conf" file:

* - nofile 1000000
root soft nofile 1000000
root hard nofile 1000000
hduser soft nofile 1000000
hduser hard nofile 1000000
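
For reference, this is roughly how I verify the limits from a fresh
shell, for the user that actually starts Spark (I'm assuming that is
hduser here, since that's the account I configured above):

# Soft and hard limits for the current shell session
ulimit -Sn
ulimit -Hn
# Limit as seen by a fresh login shell of the Spark user
# (assumes hduser is the account that starts the daemons)
su - hduser -c 'ulimit -n'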

I then added the line "session required pam_limits.so" to the two
files "/etc/pam.d/common-session" and
"/etc/pam.d/common-session-noninteractive", and logged out and back in.
First I tried only the first line (* - nofile 1000000), then added the
2nd and 3rd lines (root...), then the last two (hduser...), with no
effect. Weirdly enough, when I check with the command "ulimit -n" it
returns the correct value of 1000000.
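
In case it helps, the value the shell reports does not necessarily
match what the already-running Spark JVMs got at start-up. A rough way
to check a running worker directly (assuming standalone deploy mode;
adjust the class name for your setup) is:

# Find a running standalone worker and read the limit it actually has.
# Executors show up as org.apache.spark.executor.CoarseGrainedExecutorBackend.
pid=$(pgrep -f org.apache.spark.deploy.worker.Worker | head -n 1)
grep "Max open files" /proc/$pid/limits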

Next, I added "ulimit -n 1000000" to "spark-env.sh" on the master and
on each of my workers, again with no effect.
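
If it matters, my understanding is that spark-env.sh is only sourced
when the daemons start, so a change there would only show up after a
restart, roughly along these lines (assuming the standard standalone
sbin scripts and that SPARK_HOME is set):

# Restart the standalone daemons so the edited spark-env.sh is sourced
$SPARK_HOME/sbin/stop-all.sh
$SPARK_HOME/sbin/start-all.sh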

What else could it be besides the ulimit setting? And if it is only
that, what could cause Spark to ignore it?

I'd appreciate any help. Thanks in advance.

-- 
PhD Student - EIS Group - Bonn University, Germany.
+49 1575 8482232

