spark-user mailing list archives

From WangTaoTheTonic <barneystin...@aliyun.com>
Subject Who manage the log4j appender while running spark on yarn?
Date Fri, 19 Dec 2014 08:37:05 GMT
Hi guys, 

I recently ran Spark on YARN and found that Spark doesn't set any log4j properties
file, either in configuration or in code. The log4j output was being written to the stderr
file under ${yarn.nodemanager.log-dirs}/application_${appid}.
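For what it's worth, a common way to take control of the appender yourself on YARN (a sketch, not verified against any particular Spark version; the app class and jar names are placeholders) is to ship your own log4j.properties to every container with --files and point both JVMs at it via the extraJavaOptions settings:

```shell
# A minimal log4j.properties that sends Spark's logging to the console
# (YARN captures console output into the container's stderr file):
cat > log4j.properties <<'EOF'
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
EOF

# Distribute the file to each YARN container and tell the driver and
# executor JVMs to use it instead of whatever is found on the classpath:
spark-submit \
  --master yarn \
  --files log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  --class com.example.MyApp myapp.jar   # placeholder class and jar
```

With this in place the appender is whatever your file configures, rather than whichever default Spark or Hadoop happens to pick up first on the container classpath.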

I want to know which side (Spark or Hadoop) controls the appender. I found a
related discussion here:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-logging-strategy-on-YARN-td8751.html,
but I think the Spark code has changed a lot since then.

Could anyone offer some guidance? Thanks.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Who-manage-the-log4j-appender-while-running-spark-on-yarn-tp20778.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

