spark-user mailing list archives

From: Larry Xiao <xia...@sjtu.edu.cn>
Subject: Re: spark.shuffle.consolidateFiles seems not working
Date: Wed, 30 Jul 2014 08:29:45 GMT
Hi Jianshi,

I've run into a similar situation before, and my fix was ulimit:

  ulimit -a          shows your current limits
  ulimit -n <limit>  sets the open-files limit
  (other limits can be set the same way)

I set mine to 10240 (ulimit -n 10240).
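
One caveat: on YARN, a limit raised in your login shell doesn't necessarily
apply to the executor JVMs. Here is a minimal sketch (not part of Spark; it
uses the JDK's com.sun.management extension, available on HotSpot/OpenJDK on
Unix) to check the limit a JVM actually sees:

  import java.lang.management.ManagementFactory
  import com.sun.management.UnixOperatingSystemMXBean

  // Print the file-descriptor counts as seen by this JVM; run it on the
  // driver or inside a task to verify the raised ulimit actually applies.
  ManagementFactory.getOperatingSystemMXBean match {
    case os: UnixOperatingSystemMXBean =>
      println(s"open fds: ${os.getOpenFileDescriptorCount}, " +
        s"max fds: ${os.getMaxFileDescriptorCount}")
    case _ =>
      println("file descriptor counts not available on this JVM/platform")
  }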

As far as I can tell, spark.shuffle.consolidateFiles helps by reusing open
files, so I don't know to what extent it helps on its own.
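
For reference, a minimal sketch of enabling it (assuming Spark 1.0.x with the
default hash-based shuffle; the property has to be in the SparkConf before the
SparkContext is created, or go in conf/spark-defaults.conf; the app name is
illustrative):

  import org.apache.spark.{SparkConf, SparkContext}

  // With consolidation on, map tasks running on the same core append to a
  // shared pool of shuffle files, so roughly cores * reducers files are
  // created instead of maps * reducers.
  val conf = new SparkConf()
    .setAppName("shuffle-consolidation-example")  // hypothetical app name
    .set("spark.shuffle.consolidateFiles", "true")
  val sc = new SparkContext(conf)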

Hope it helps.

Larry

On 7/30/14, 4:01 PM, Jianshi Huang wrote:
> I'm using Spark 1.0.1 on Yarn-Client mode.
>
> sortByKey always fails with a FileNotFoundException whose message says
> "too many open files".
>
> I already set spark.shuffle.consolidateFiles to true:
>
>   conf.set("spark.shuffle.consolidateFiles", "true")
>
> But it doesn't seem to be working. What are the other possible reasons,
> and how do I fix it?
>
> Jianshi
>
> -- 
> Jianshi Huang
>
> LinkedIn: jianshi
> Twitter: @jshuang
> Github & Blog: http://huangjs.github.com/

