spark-issues mailing list archives

From Maciej Bryński (JIRA) <>
Subject [jira] [Created] (SPARK-15344) Unable to set default log level for PySpark
Date Mon, 16 May 2016 12:28:12 GMT
Maciej Bryński created SPARK-15344:

             Summary: Unable to set default log level for PySpark
                 Key: SPARK-15344
             Project: Spark
          Issue Type: Bug
          Components: PySpark
    Affects Versions: 2.0.0
            Reporter: Maciej Bryński
            Priority: Minor

After this patch:
I'm unable to set the default log level for PySpark.
It's always WARN.

The setting below doesn't work:
mbrynski@jupyter:~/spark$ cat conf/
# Set everything to be logged to the console
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Set the default spark-shell log level to WARN. When running the spark-shell, the
# log level for this class is used to overwrite the root logger's log level, so that
# the user can have different defaults for the shell and regular Spark apps.
log4j.logger.org.apache.spark.repl.Main=WARN

# Settings to quiet third party logs that are too verbose
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO

# SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR
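
As a side note, the level can still be changed at runtime from the driver (a minimal workaround sketch, assuming a plain local PySpark session; the app name is illustrative):

from pyspark import SparkContext

# Create a local context and force the log level at runtime,
# since the default from the log4j config above is ignored.
sc = SparkContext("local", "log-level-check")
sc.setLogLevel("INFO")

This only takes effect once the context is up, so everything logged before the setLogLevel() call still comes through at WARN.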

