spark-user mailing list archives

From FreePeter <>
Subject Running Spark in Local Mode
Date Sun, 29 Mar 2015 20:21:42 GMT

I am trying to use Spark for my own applications, and I am currently
profiling the performance with local mode, and I have a couple of questions:

1. When I set spark.master to local[N], Spark will use up to N worker
*threads* on the single machine. Is this equivalent to saying there are N
worker *nodes* as described in the documentation?
(That is, is each worker node/thread viewed separately, so that it can have
its own executor for each application?)
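For reference, this is how I am setting the master URL (a sketch assuming the stock spark-shell/spark-submit launchers on the path; N=4 is an arbitrary choice):

```shell
# Run in local mode with up to 4 worker threads;
# local[*] would instead use one thread per available core.
./bin/spark-shell --master local[4]
```

Equivalently, the master can be set in code via SparkConf's setMaster("local[4]") before the SparkContext is created.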

2. Is there any way to set the maximum memory used by each worker thread/node?
I can only find a setting for each executor's memory (spark.executor.memory).
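For concreteness, this is the kind of invocation I have been profiling with (a sketch; MyApp.jar is a placeholder name, and I am assuming the standard spark-submit flags):

```shell
# In local mode everything runs inside a single JVM, so I have been
# experimenting with --driver-memory to bound the overall heap.
# MyApp.jar is a placeholder for my application jar.
./bin/spark-submit --master local[4] --driver-memory 4g MyApp.jar
```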

Thank you!
