spark-dev mailing list archives

From Niu Zhaojie <>
Subject pyspark.daemon exhausts a lot of memory
Date Tue, 10 Apr 2018 04:35:06 GMT
Hi All,

We are running spark 2.1.1 on Hadoop YARN 2.6.5.

We found that the pyspark.daemon process consumes more than 300 GB of memory.

However, according to [link], the daemon process shouldn't have this problem.
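For context, in Spark 2.1 the memory a Python worker may use before spilling during aggregation is governed by spark.python.worker.memory, which is a spill threshold rather than a hard cap, so the pyspark.daemon processes can still grow past it. A minimal spark-defaults.conf sketch (the values shown are illustrative, not recommendations):

```properties
# spark-defaults.conf (sketch; values are illustrative)
# Spill threshold per Python worker during aggregation -- a soft limit,
# not a hard cap on the pyspark.daemon process size.
spark.python.worker.memory            512m
# Extra off-heap allowance YARN reserves per executor container
# (Spark 2.1 property name; later versions use spark.executor.memoryOverhead).
spark.yarn.executor.memoryOverhead    1024
```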

Also, we found that the daemon process is forked by the container process. Since it clearly exceeds the container memory limit, why doesn't YARN kill this container?
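For reference, the NodeManager's physical-memory check walks a container's entire process tree, so forked children such as pyspark.daemon should be counted against the container limit. Whether that check runs is controlled by the following yarn-site.xml properties; the values shown are the Hadoop defaults:

```xml
<!-- yarn-site.xml: Hadoop defaults shown. If pmem-check-enabled has been
     set to false on the cluster, YARN will not kill containers that
     exceed their physical memory allocation. -->
<property>
  <name>yarn.nodemanager.pmem-check-enabled</name>
  <value>true</value>
</property>
<property>
  <name>yarn.nodemanager.vmem-check-enabled</name>
  <value>true</value>
</property>
<property>
  <name>yarn.nodemanager.vmem-pmem-ratio</name>
  <value>2.1</value>
</property>
```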
