spark-user mailing list archives

From İbrahim Rıza HALLAÇ <ibrahimrizahal...@live.com>
Subject spark setting maximum available memory
Date Thu, 22 May 2014 14:23:26 GMT
In my situation each slave has 8 GB of memory. I want to use the maximum memory that I can: .set("spark.executor.memory", "?g"). How can I determine the amount of memory I should set? It fails when I set it to 8 GB.
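For reference, a minimal sketch of the setting being asked about, using Spark's standard SparkConf API in Scala. The "7g" value is purely illustrative (not a tuned answer): setting the executor to the full 8 GB typically fails because the OS, the Spark worker daemon, and JVM overhead also need memory on the slave, so the executor heap must be set below the machine's physical total.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch of the configuration under discussion. On an 8 GB slave the full
// 8g cannot go to the executor JVM; some headroom must be left for the OS
// and the Spark worker daemon, so a smaller value is used here
// (illustrative only, not a recommendation).
val conf = new SparkConf()
  .setAppName("memory-example")
  .set("spark.executor.memory", "7g")
val sc = new SparkContext(conf)
```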