spark-user mailing list archives

From Dmitriy Lyubimov <dlie...@gmail.com>
Subject Re: Checking which RDDs still might be cached?
Date Thu, 12 Sep 2013 01:50:31 GMT
Thank you.

Any temporary Spark hack will do, actually (even if I need to add
some code in a Spark class); it is only for debugging my use of
unpersist().
I think Matei's solution will do what I need. Thank you.

On Wed, Sep 11, 2013 at 5:43 PM, Matei Zaharia <matei.zaharia@gmail.com> wrote:
> You can actually do SparkContext.getExecutorStorageStatus to get a list of stored blocks.
> These have a special name when they belong to an RDD, using that RDD's id field. But
> unfortunately there's no way to get this info from the RDD itself.
>
> Matei
>
> On Sep 11, 2013, at 4:52 PM, Dmitriy Lyubimov <dlieu.7@gmail.com> wrote:
>
>> Hello,
>>
>> is there's any way to interrogate block manager as to what RDDs might
>> still be cached in the session (spark 0.8)?
>>
>> thanks in advance.
>
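
For reference, a minimal sketch of the approach Matei describes above
(Scala, run in the spark-shell where sc is the SparkContext). It assumes
the 0.8 block-naming convention "rdd_<rddId>_<partitionIndex>" and matches
block ids as strings, since the exact key type of StorageStatus.blocks may
differ between versions:

// List the ids of RDDs that still have cached blocks on any executor.
// RDD block names are assumed to look like "rdd_<rddId>_<partitionIndex>".
val cachedRddIds = sc.getExecutorStorageStatus
  .flatMap(_.blocks.keys)        // block ids from every executor
  .map(_.toString)               // compare as strings regardless of key type
  .collect { case name if name.startsWith("rdd_") => name.split('_')(1).toInt }
  .toSet

println("RDD ids with cached blocks: " + cachedRddIds.toSeq.sorted.mkString(", "))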
