spark-dev mailing list archives

From Paul Brown <...@mult.ifario.us>
Subject 0.9.0 forces log4j usage
Date Fri, 07 Feb 2014 07:41:02 GMT
We have a few applications that embed Spark. In 0.8.0 and 0.8.1 we were able
to use slf4j, but 0.9.0 broke that and now unintentionally forces direct use
of log4j as the logging backend.

The issue is here in the org.apache.spark.Logging trait:

https://github.com/apache/incubator-spark/blame/master/core/src/main/scala/org/apache/spark/Logging.scala#L107

log4j-over-slf4j *always* returns an empty enumeration for appenders to the
ROOT logger:

https://github.com/qos-ch/slf4j/blob/master/log4j-over-slf4j/src/main/java/org/apache/log4j/Category.java?source=c#L81

And this causes an infinite loop and an eventual stack overflow.
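To make the failure mode concrete, here is a minimal, self-contained sketch (in Java, with invented names; this is not Spark's actual code) of why an "initialize if the root logger has no appenders" check recurses forever when, as under log4j-over-slf4j, the appender enumeration can never become non-empty:

```java
import java.util.Collections;
import java.util.Enumeration;

// Hypothetical sketch of the failure mode; class and method names are
// illustrative stand-ins, not Spark's or slf4j's real API.
public class LoggingInitSketch {

    // Stand-in for log4j-over-slf4j's Category.getAllAppenders(), which
    // always returns an empty enumeration regardless of configuration.
    static Enumeration<Object> getRootAppenders() {
        return Collections.emptyEnumeration();
    }

    // Stand-in for the check in the Logging trait: "if the root logger has
    // no appenders, install a default configuration (and log that we did)".
    static void initializeIfNecessary() {
        if (!getRootAppenders().hasMoreElements()) {
            // The appender list never becomes non-empty, so logging here
            // re-enters the same check and the recursion is unbounded.
            log("initializing default log4j configuration");
        }
    }

    static void log(String msg) {
        initializeIfNecessary();
        System.out.println(msg);
    }

    // Returns true if the unbounded recursion overflows the stack.
    static boolean reproduce() {
        try {
            log("hello");
            return false;
        } catch (StackOverflowError e) {
            return true;
        }
    }

    public static void main(String[] args) {
        System.out.println("stack overflow reproduced: " + reproduce());
    }
}
```

With the real libraries the recursion is between the logging call and the appender check in the trait, but the shape of the loop is the same: the "have I initialized?" probe can never observe success.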

I'm happy to submit a Jira and a patch, but it would be a significant enough
reversal of recent changes that it's probably worth discussing before I sink
half an hour into it.  My suggestion would be that initialization (or not) be
left to the user, with reasonable default behavior supplied by the Spark
command-line tooling rather than forced on applications that embed Spark.

Thoughts/opinions?

-- Paul
—
prb@mult.ifario.us | Multifarious, Inc. | http://mult.ifario.us/
