I recall someone from the Spark team (TD?) saying that Spark 0.9.1 would change the logger, so the circular-loop error between slf4j and log4j would no longer show up.


Yet on Spark 0.9.1 I still get:

SLF4J: Detected both log4j-over-slf4j.jar AND slf4j-log4j12.jar on the class path, preempting StackOverflowError.

SLF4J: See also http://www.slf4j.org/codes.html#log4jDelegationLoop for more details.


Any solutions?
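For context, the message means both SLF4J/log4j bridges are on the class path at once: log4j-over-slf4j routes log4j calls to slf4j, while slf4j-log4j12 routes slf4j calls back to log4j, so SLF4J refuses to start rather than loop forever. One common workaround, assuming an sbt build (the dependency name below is a placeholder, not from the original post), is to exclude one of the two bridges from whichever dependency pulls it in transitively:

```scala
// build.sbt -- exclude the slf4j->log4j binding that arrives transitively.
// "com.example" % "some-library" is a hypothetical stand-in; identify the real
// culprit with a dependency-tree report from your build tool.
libraryDependencies += "com.example" % "some-library" % "1.0.0" exclude("org.slf4j", "slf4j-log4j12")
```

The same idea applies in Maven with an `<exclusions>` block on the offending dependency; the key point is that only one of log4j-over-slf4j and slf4j-log4j12 may be on the class path.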