spark-user mailing list archives

From Dave Jaffe <>
Subject Spark on Kubernetes - log4j.properties not read
Date Tue, 11 Jun 2019 01:15:14 GMT
I am using Spark on Kubernetes from the Spark 2.4.3 distribution. I have created a log4j.properties
file in my local spark/conf directory and modified it so that the console (or, in the case of
Kubernetes, the log) only shows warnings and higher (log4j.rootCategory=WARN, console). I then added
the line
COPY conf /opt/spark/conf
to /root/spark/kubernetes/dockerfiles/spark/Dockerfile and built a new container image.
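
For reference, the relevant pieces look roughly like this (the properties file is derived from
log4j.properties.template, so treat this as a sketch of my setup rather than the complete files):

  # conf/log4j.properties (copied from log4j.properties.template, root level changed)
  log4j.rootCategory=WARN, console
  log4j.appender.console=org.apache.log4j.ConsoleAppender
  log4j.appender.console.target=System.err
  log4j.appender.console.layout=org.apache.log4j.PatternLayout
  log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

  # line added to /root/spark/kubernetes/dockerfiles/spark/Dockerfile
  COPY conf /opt/spark/conf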

However, when I run that under Kubernetes, the program runs successfully but
/opt/spark/conf/log4j.properties is not used (I still see the INFO lines when I run
kubectl logs <driver pod>).

I have tried other things, such as explicitly adding a --properties-file to my spark-submit
command and even
--conf spark.driver.extraJavaOptions=-Dlog4j.configuration=file:///opt/spark/conf/log4j.properties
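
To be concrete, the submit command I am experimenting with looks roughly like the sketch below;
the API server address, image name, main class, and application jar are placeholders for my
actual values:

  spark-submit \
    --master k8s://https://<api-server>:<port> \
    --deploy-mode cluster \
    --name <app-name> \
    --class <main-class> \
    --conf spark.kubernetes.container.image=<my-custom-image> \
    --conf spark.driver.extraJavaOptions=-Dlog4j.configuration=file:///opt/spark/conf/log4j.properties \
    local:///opt/spark/examples/jars/<app-jar>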

My file is never seen.

How do I customize logging with Spark on Kubernetes?

Thanks, Dave Jaffe
