spark-user mailing list archives

From Diana Carroll <>
Subject logging in pyspark
Date Tue, 06 May 2014 19:31:50 GMT
What should I do if I want to log something as part of a task?

This is what I tried.  To set up a logger, I followed the advice here:

import logging
logger = logging.getLogger("py4j")

This works fine when I call it from my driver (i.e. pyspark):

But I want to try logging within a distributed task, so I did this:

def logTestMap(a):"test")
    return a

and got:
PicklingError: Can't pickle 'lock' object

So it's trying to serialize my function and can't, because of a lock object
used in the logger, presumably for thread safety.  But how would I do
it?  Or is this just a really bad idea?
