1. To change log4j.properties on the name node, you can edit /home/hadoop/log4j.properties.

2. To change log4j.properties for the container logs, you need to change it inside the YARN node manager jar, since the containers load the file directly from the jar's resources (the path is hard-coded).

2.1 ssh to the slave (on EMR you can also simply add this as a bootstrap action, so you don't need to ssh to each of the nodes).

2.2 Override container-log4j.properties in the jar's resources (run this from the directory containing your replacement container-log4j.properties):

jar uf /home/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.2.0.jar container-log4j.properties
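If the `jar` tool is not convenient in a bootstrap script, the same effect can be achieved by rewriting the archive directly. A minimal sketch in Python of what `jar uf` does here (the function name and paths are illustrative, not part of the thread):

```python
import os
import shutil
import tempfile
import zipfile


def replace_jar_resource(jar_path, resource_name, new_content):
    """Rewrite jar_path so that resource_name contains new_content.

    Equivalent in effect to `jar uf <jar> <resource>`: every other
    entry is copied unchanged, and the target resource is replaced.
    """
    tmp_fd, tmp_path = tempfile.mkstemp(suffix=".jar")
    os.close(tmp_fd)
    with zipfile.ZipFile(jar_path) as src, \
         zipfile.ZipFile(tmp_path, "w", zipfile.ZIP_DEFLATED) as dst:
        for item in src.infolist():
            # Copy all entries except the one we are replacing.
            if item.filename != resource_name:
                dst.writestr(item, src.read(item.filename))
        # Write the replacement resource.
        dst.writestr(resource_name, new_content)
    shutil.move(tmp_path, jar_path)
```

Jar files are plain zip archives, so the standard zipfile module is enough; no Java tooling is needed on the node.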


On 8 September 2015 at 05:47, Yana Kadiyska <yana.kadiyska@gmail.com> wrote:
Hopefully someone will give you a more direct answer, but whenever I'm having issues with log4j I always try -Dlog4j.debug=true. This will tell you which log4j settings are getting picked up from where. I've spent countless hours due to typos in the file, for example.
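For a yarn-cluster job, that flag has to reach both the driver and executor JVMs, so it would go into the same extraJavaOptions used for the configuration file. A hedged sketch of how the submit command might look (paths and class names are placeholders):

```shell
spark-submit \
  --master yarn-cluster \
  --files /path/to/log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties -Dlog4j.debug=true" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties -Dlog4j.debug=true" \
  --class your.main.Class your-app.jar
```

With the debug flag set, log4j 1.x prints lines prefixed with "log4j:" to stderr, including which URL it loaded the configuration from, so you can check the driver and container stderr logs to see whose log4j.properties actually won.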

On Mon, Sep 7, 2015 at 11:47 AM, Jeetendra Gangele <gangele397@gmail.com> wrote:
I also tried placing my customized log4j.properties file under src/main/resources; still no luck.

Won't the above step modify the default YARN and Spark log4j.properties?

Anyhow, it's still taking log4j.properties from YARN.



On 7 September 2015 at 19:25, Jeetendra Gangele <gangele397@gmail.com> wrote:
Anybody here to help?



On 7 September 2015 at 17:53, Jeetendra Gangele <gangele397@gmail.com> wrote:
Hi All, I have been trying to send my application-related logs to a socket so that we can feed them to Logstash and check the application logs.

Here is my log4j.properties file:

main.logger=RFA,SA

log4j.appender.SA=org.apache.log4j.net.SocketAppender
log4j.appender.SA.Port=4560
log4j.appender.SA.RemoteHost=hadoop07.housing.com
log4j.appender.SA.ReconnectionDelay=10000
log4j.appender.SA.Application=NM-${user.dir}
# Ignore messages below warning level from Jetty, because it's a bit verbose
log4j.logger.org.spark-project.jetty=WARN
log4j.logger.org.apache.hadoop=WARN
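One thing worth checking (an assumption based on the usual Hadoop/Spark log4j layout, not something visible in the snippet above): `main.logger=RFA,SA` only defines a variable, and an appender fires only when a logger actually references it. If no other line in the file attaches SA, something like the following is needed:

```properties
# Assumption: RFA is defined elsewhere in the file. Without a line
# like this, the SA appender defined above is never attached to any
# logger, so nothing is ever sent to the socket.
log4j.rootLogger=INFO,${main.logger}
```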


I am launching my Spark job using the below command in yarn-cluster mode:

spark-submit --name data-ingestion --master yarn-cluster --conf spark.custom.configuration.file=hdfs://10.1.6.186/configuration/binning-dev.conf --files /usr/hdp/current/spark-client/Runnable/conf/log4j.properties --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties" --class com.housing.spark.streaming.Binning /usr/hdp/current/spark-client/Runnable/dsl-data-ingestion-all.jar
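Before digging into the log4j side, it may be worth ruling out a plain connectivity problem between the YARN nodes and the listener host from the config (hadoop07.housing.com:4560). As a hedged sketch, a throwaway listener like this confirms whether anything arrives at all; note that SocketAppender ships serialized Java LoggingEvent objects, so the payload will not be human-readable, and real decoding is what Logstash's log4j input does:

```python
import socket


def listen_once(host="0.0.0.0", port=4560):
    """Accept one connection and return (peer address, first chunk of bytes).

    The bytes from log4j's SocketAppender are serialized Java objects,
    so this does not decode them; a non-empty read simply proves that
    the driver/executors can reach this host:port.
    """
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen(1)
    conn, addr = srv.accept()   # blocks until one client connects
    data = conn.recv(4096)
    conn.close()
    srv.close()
    return addr, data
```

If this receives nothing while the job is running, the problem is network/firewall (or the appender never attached), not the log format.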


Can anybody please guide me on why I am not getting the logs on the socket?


I followed many of the pages listed below, without success.