spark-user mailing list archives

From vinay Bajaj <>
Subject Fwd: Doubts regarding Shark
Date Thu, 08 May 2014 06:46:42 GMT

I have a few questions regarding Shark.

1) I have a table of 60 GB, but only 50 GB of total memory. Yet when I try
to cache the table, it gets cached successfully. How does Shark cache the
table when there is not enough memory to hold it entirely in memory? And
how do the cache eviction policies (FIFO and LRU) work while caching the
table? While creating tables I am using the cache-type property MEMORY
(storage level: memory and disk).
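For context, the setup described above would be declared with Shark's `shark.cache` table property; a minimal sketch (the table name and columns are hypothetical, and the property value MEMORY is taken from the description above, which maps to Spark's memory-and-disk storage level):

```sql
-- Hypothetical cached table; "shark.cache" = "MEMORY" corresponds to a
-- memory-and-disk storage level, under which partitions that do not fit
-- in memory are spilled to disk rather than failing the cache operation.
CREATE TABLE logs_cached (ts STRING, msg STRING)
TBLPROPERTIES ("shark.cache" = "MEMORY");
```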

2) Sometimes while running queries I get a java.lang.OutOfMemoryError, even
though all tables are cached successfully. Can you describe the cases, or
give an example, in which that error can occur?

Vinay Bajaj
