spark-user mailing list archives

From Jakob Odersky <ja...@odersky.com>
Subject Re: How to Disable or do minimal Logging for apache spark client Driver program?
Date Thu, 06 Oct 2016 20:38:28 GMT
You can control which log messages are shown by
calling "context.setLogLevel(<level>)" with an appropriate level:
ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN.
See http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.SparkContext@setLogLevel(logLevel:String):Unit
for further details.
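
Note that setLogLevel only takes effect once the context exists; anything
logged during startup is governed by log4j configuration instead. As a
sketch, you could put a log4j.properties like the following on the driver's
classpath (the appender layout here is adapted from Spark's bundled
log4j.properties.template; adjust to taste):

```properties
# Show only ERROR and above from the root logger
log4j.rootCategory=ERROR, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```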

Just one nitpick: when you say "I am also using spark standalone
mode and I don't submit jobs through command line. I just invoke
public static void main() of my driver program." are you
referring to Spark local mode? It is also possible to run Spark
applications in "distributed mode" (i.e. standalone, YARN or
Mesos) from the command line, however that will require
using Spark's launcher interface and bundling your application in
a jar.

On Thu, Oct 6, 2016 at 9:27 AM, kant kodali <kanth909@gmail.com> wrote:
> How to Disable or do minimal Logging for apache spark client Driver program?
> I couldn't find this information on docs. By Driver program I mean the java
> program where I initialize spark context. It produces lot of INFO messages
> but I would like to know only when there is error or a Exception such as
> Nullpointer exception and so on. I am also using spark standalone mode and I
> don't submit jobs through command line. I just invoke public static void
> main() of my driver program.

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org

