spark-user mailing list archives

From FreePeter <wenlei....@gmail.com>
Subject Running Spark in Local Mode
Date Sun, 29 Mar 2015 20:21:42 GMT
Hi,

I am trying to use Spark for my own applications. I am currently
profiling performance in local mode, and I have a couple of questions:

1. When I set spark.master to local[N], Spark uses up to N worker
*threads* on a single machine. Is this equivalent to saying there are N
worker *nodes* as described in
http://spark.apache.org/docs/latest/cluster-overview.html?
(That is, is each worker thread viewed as a separate node that can have
its own executor for each application?)
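For concreteness, the setup being asked about might look like the following sketch (the application class and jar names are hypothetical placeholders):

```shell
# Run an application in local mode with up to 4 worker threads.
# In local mode, driver and executor all run inside a single JVM.
spark-submit \
  --master local[4] \
  --class com.example.MyApp \
  myapp.jar
```

Here local[4] bounds the number of task-execution threads, not the number of separate worker processes.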

2. Is there any way to set the maximum memory used by each worker
thread/node? I can only find a setting for the memory of each executor
(spark.executor.memory).
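One point worth noting (an assumption about the questioner's local-mode setup, not part of the original message): in local mode the executor runs inside the driver JVM, so the heap is bounded by the driver memory setting rather than by spark.executor.memory. A hedged sketch, again with a hypothetical class and jar name:

```shell
# In local mode the executor shares the driver JVM,
# so cap the heap with the driver memory setting:
spark-submit \
  --master local[4] \
  --driver-memory 4g \
  --class com.example.MyApp \
  myapp.jar

# On a real cluster, per-executor memory is set with
# --executor-memory 4g (equivalently, spark.executor.memory=4g).
```

There is no per-thread memory limit: all worker threads in local mode share the one JVM heap.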

Thank you!

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Running-Spark-in-Local-Mode-tp22279.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

