spark-user mailing list archives

From Mayur Rustagi <mayur.rust...@gmail.com>
Subject Re: Doubts regarding Shark
Date Tue, 13 May 2014 11:45:24 GMT
The table will be cached, but 10 GB of it (most likely more) will sit on disk.
You can check this in the Storage tab of the Shark application's web UI.
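
For reference, here is a minimal Scala sketch of that behavior, assuming the
MEMORY_AND_DISK storage level your table was created with (the input path and
app name below are hypothetical):

    import org.apache.spark.SparkContext
    import org.apache.spark.storage.StorageLevel

    val sc = new SparkContext("local", "cache-demo")
    // Hypothetical input; stands in for the 60 GB table's data.
    val data = sc.textFile("hdfs://namenode/path/to/table")

    // MEMORY_AND_DISK keeps partitions in memory where they fit and
    // spills the remainder to disk, so caching "succeeds" even when
    // the data is larger than the available memory.
    data.persist(StorageLevel.MEMORY_AND_DISK)
    data.count()  // forces the cache to materialize

After the count, the Storage tab shows how many partitions landed in memory
versus on disk.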

A Java out-of-memory error can occur because your worker memory is too low or
because the memory allocated to Shark is too low.
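
As a rough sketch of the knobs involved (the values here are hypothetical,
and Shark deployments may set these through shark-env.sh instead):

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setMaster("local")              // hypothetical master
      .setAppName("shark-memory-demo")
      // Heap available to each worker JVM; queries still need headroom
      // beyond what the cached tables occupy.
      .set("spark.executor.memory", "8g")
      // Fraction of that heap reserved for cached blocks; the rest is
      // left for query execution (joins, aggregations, shuffles).
      .set("spark.storage.memoryFraction", "0.6")
    val sc = new SparkContext(conf)

If cached tables consume nearly the whole heap, a single large query can push
the workers over the edge even though the caching step itself succeeded.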


Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi <https://twitter.com/mayur_rustagi>



On Thu, May 8, 2014 at 12:42 AM, vinay Bajaj <vbajaj2610@gmail.com> wrote:

> Hello,
>
> I have a few questions regarding Shark.
>
> 1) I have a table of 60 GB and a total memory of 50 GB, but when I try to
> cache the table, it gets cached successfully. How does Shark cache the
> table when there is not enough memory to hold it all? And how do the cache
> eviction policies (FIFO and LRU) work while caching the table? While
> creating tables, I am using the cache type property MEMORY (storage level:
> memory and disk).
>
> 2) Sometimes while running queries I get a JavaOutOfMemory exception, even
> though all tables are cached successfully. Can you tell me the cases, or
> give an example, in which that error can occur?
>
> Regards,
> Vinay Bajaj
>
