spark-user mailing list archives

From Deepak Sharma <>
Subject Spark streaming filling the disk with logs
Date Thu, 14 Feb 2019 06:40:07 GMT
Hi All
I am running a Spark streaming job with the below configuration:

--conf "spark.executor.extraJavaOptions=-Droot.logger=WARN,console"

But it is still filling the disk with INFO logs.
If the logging level is set to WARN at the cluster level, then only WARN
logs get written, but that affects all the jobs.

Is there any way to suppress INFO-level logging for just this Spark
streaming job?
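For context, per-job log4j overrides are usually done by shipping a custom properties file with the job rather than setting `-Droot.logger`, which log4j does not read. A minimal sketch, assuming Spark 2.x with log4j 1.x; the file name, app class, and jar name are hypothetical:

```shell
# Hypothetical per-job log4j override (Spark 2.x / log4j 1.x assumed).
# Write a properties file that sets the root logger to WARN.
cat > log4j.properties <<'EOF'
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
EOF

# Ship the file to driver and executors with --files, then point both
# JVMs at it via -Dlog4j.configuration (file: URI is relative to the
# container working directory on the executors).
spark-submit \
  --files log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" \
  --class com.example.StreamingApp app.jar
```

For driver-side output only, `sparkContext.setLogLevel("WARN")` inside the application has a similar effect without touching the cluster-wide configuration.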
