spark-user mailing list archives

From charles li <charles.up...@gmail.com>
Subject spark.executor.memory: is it used only for caching RDDs, or for both cached RDDs and task execution on workers?
Date Fri, 05 Feb 2016 05:26:26 GMT
If I set spark.executor.memory = 2G for each worker [10 in total],

does that mean I can cache 20G of RDDs in memory? If so, how much memory
is left for the code running in each process on each worker?
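For context, here is a rough sketch of the arithmetic under Spark 1.6's unified memory manager. The 300 MB reserved overhead and the spark.memory.fraction = 0.75 default are assumptions to verify against your Spark version's configuration docs; the key point is that storage (cached RDDs) and execution (shuffles, joins, sorts) share one pool, so the full 2 GB heap is never available for caching alone:

```python
# Back-of-the-envelope estimate of cache capacity, assuming Spark 1.6's
# unified memory manager and its documented defaults (assumptions to
# verify against your version's docs).

EXECUTOR_MEMORY_MB = 2048   # spark.executor.memory = 2g
RESERVED_MB = 300           # fixed reservation for Spark internals
MEMORY_FRACTION = 0.75      # spark.memory.fraction default in Spark 1.6
NUM_EXECUTORS = 10

# Unified pool shared by storage (cached RDDs) and execution
# (shuffle/join/sort buffers); each side can borrow from the other.
unified_pool_mb = (EXECUTOR_MEMORY_MB - RESERVED_MB) * MEMORY_FRACTION
cluster_cache_mb = unified_pool_mb * NUM_EXECUTORS

print(f"per-executor unified pool: {unified_pool_mb:.0f} MB")
print(f"cluster-wide upper bound for caching: {cluster_cache_mb / 1024:.1f} GB")
# per-executor unified pool: 1311 MB
# cluster-wide upper bound for caching: 12.8 GB
```

So with these defaults the cluster-wide ceiling for cached RDDs is closer to ~13 GB than 20 GB, and in practice it is lower still whenever running tasks need execution memory at the same time.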

Thanks.


--
Also, are there any materials on memory management or resource management
in Spark? I want to put Spark into production, but I know little about
resource management in Spark. Great thanks again.


-- 
*--------------------------------------*
a spark lover, a quant, a developer and a good man.

http://github.com/litaotao
