spark-user mailing list archives

From "Jain, Abhishek 3. (Nokia - IN/Bangalore)" <>
Subject RE: Spark streaming filling the disk with logs
Date Thu, 14 Feb 2019 12:28:25 GMT
Hi Deepak,

Spark logging can be controlled for different purposes. For example, if you want to control the log level of the spark-submit/REPL output itself, you can set:

log4j.logger.org.apache.spark.repl.Main=<LEVEL>

Similarly, to control third-party logs (e.g. Jetty):

log4j.logger.org.spark_project.jetty=<LEVEL>

These properties can be set in the conf/log4j.properties file.
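A minimal conf/log4j.properties along these lines might look like the sketch below. The logger names are the ones shipped in Spark's log4j.properties.template; check the template for your Spark version before copying:

```properties
# Send everything to the console at WARN instead of INFO
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Quiet the spark-submit / REPL output
log4j.logger.org.apache.spark.repl.Main=WARN

# Quiet noisy third-party libraries
log4j.logger.org.spark_project.jetty=WARN
```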

Hope this helps! 😊

Abhishek Jain

From: Deepak Sharma <>
Sent: Thursday, February 14, 2019 12:10 PM
To: spark users <>
Subject: Spark streaming filling the disk with logs

Hi All
I am running a spark streaming job with the below configuration:

--conf "spark.executor.extraJavaOptions=-Droot.logger=WARN,console"

But it’s still filling the disk with INFO logs.
If the logging level is set to WARN at the cluster level, then only WARN logs get written, but that affects all jobs.

Is there any way to get rid of INFO-level logging at the spark streaming job level?
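One common way to do this per job, without touching the cluster-wide config, is to ship a job-specific log4j.properties with the job and point both the driver and the executors at it via -Dlog4j.configuration. The sketch below assumes log4j 1.x (as used by older Spark versions); the file path, class name, and jar name are placeholders:

```shell
# The --files option ships custom-log4j.properties into each container's
# working directory, so the bare file name resolves there.
spark-submit \
  --files /path/to/custom-log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:custom-log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:custom-log4j.properties" \
  --class com.example.StreamingApp \
  my-streaming-job.jar
```

Alternatively, the log level can be lowered programmatically inside the job itself with sc.setLogLevel("WARN"), which overrides the configured level for that SparkContext only.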

