spark-user mailing list archives

From Yana Kadiyska <yana.kadiy...@gmail.com>
Subject Re: Changing log level of spark
Date Tue, 01 Jul 2014 13:53:16 GMT
Are you looking at the driver log (e.g. the Shark driver)? I see a ton of
information at the INFO level on what query is being started, what
stage is starting, and which executor work is sent to. So I'm not sure
whether you're saying you see all that and need more, or that you're
not seeing this type of information at all. I cannot speak to the EC2
setup; I'm just pointing out that under 0.9.1 I see quite a bit of
scheduling information in the driver log.
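
For reference, the setup discussed throughout this thread amounts to a
log4j.properties file on Spark's classpath. A minimal sketch, using
standard log4j 1.x property names (the appender name "console" and the
jetty override are illustrative, not taken from the thread):

```properties
# Root logger: DEBUG level, writing to a console appender.
log4j.rootCategory=DEBUG, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
# Optionally quiet noisy third-party packages back down:
log4j.logger.org.eclipse.jetty=WARN
```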

On Tue, Jul 1, 2014 at 9:20 AM, Philip Limbeck <philiplimbeck@gmail.com> wrote:
> We changed the log level to DEBUG by replacing every INFO with DEBUG in
> /root/ephemeral-hdfs/conf/log4j.properties and propagating it to the
> cluster. Some DEBUG output is visible in both master and worker, but
> nothing really interesting regarding stages or scheduling. Since we
> expected a little more than that, there are two possibilities:
>   a) There is still some other, unknown way to set the log level to debug.
>   b) There is simply not much log output to expect here. However, I
> searched for "logDebug" (the log wrapper in Spark) on GitHub and got 84
> results, so I doubt that there is so little to expect.
>
> We actually just want a little more insight into the system's behavior,
> especially when using Shark, since we ran into some serious concurrency
> issues with blocking queries. That is the background on why this is
> important to us.
>
>
> On Thu, Jun 26, 2014 at 3:30 AM, Aaron Davidson <ilikerps@gmail.com> wrote:
>>
>> If you're using the spark-ec2 scripts, you may have to change
>> /root/ephemeral-hdfs/conf/log4j.properties or something like that, as that
>> is added to the classpath before Spark's own conf.
>>
>>
>> On Wed, Jun 25, 2014 at 6:10 PM, Tobias Pfeiffer <tgp@preferred.jp> wrote:
>>>
>>> I have a log4j.xml in src/main/resources with
>>>
>>> <?xml version="1.0" encoding="UTF-8" ?>
>>> <!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
>>> <log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/">
>>>     [...]
>>>     <root>
>>>         <priority value ="warn" />
>>>         <appender-ref ref="Console" />
>>>     </root>
>>> </log4j:configuration>
>>>
>>> and that is included in the jar I package with `sbt assembly`. That
>>> works fine for me, at least on the driver.
>>>
>>> Tobias
>>>
>>> On Wed, Jun 25, 2014 at 2:25 PM, Philip Limbeck <philiplimbeck@gmail.com>
>>> wrote:
>>> > Hi!
>>> >
>>> > According to
>>> > https://spark.apache.org/docs/0.9.0/configuration.html#configuring-logging,
>>> > changing the log level is just a matter of creating a log4j.properties
>>> > file (which is on the classpath of Spark) and changing the log level
>>> > there for the root logger. I did these steps on every node in the
>>> > cluster (master and worker nodes). However, after a restart there is
>>> > still no debug output as desired, only the default INFO log level.
>>
>>
>

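Tying the spark-ec2 advice together: since the ephemeral-hdfs
log4j.properties is picked up ahead of Spark's own conf, the
INFO-to-DEBUG swap described above is a one-line sed. A hedged sketch,
demonstrated on a temporary copy rather than the real
/root/ephemeral-hdfs/conf path, with an illustrative single-line file:

```shell
# Work on a temporary copy so the sketch is safe to run anywhere;
# on a real spark-ec2 cluster the file would be
# /root/ephemeral-hdfs/conf/log4j.properties (and would then need
# to be propagated to the rest of the cluster).
conf=$(mktemp)
printf 'log4j.rootCategory=INFO, console\n' > "$conf"

# Flip every INFO to DEBUG, as described earlier in the thread.
sed -i 's/INFO/DEBUG/g' "$conf"

cat "$conf"
rm -f "$conf"
```

Note that `sed -i` with no suffix is GNU sed syntax; BSD sed (e.g. on
macOS) needs `sed -i ''`.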