spark-user mailing list archives

From Takeshi Yamamuro <linguin....@gmail.com>
Subject Re: Running Spark in local mode
Date Sun, 19 Jun 2016 08:52:15 GMT
Hi,

In local mode, Spark runs in a single JVM that hosts both the driver (acting
as the master) and one executor with `k` threads.
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/local/LocalSchedulerBackend.scala#L94
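For illustration, the `local[k]` master URL is resolved into a thread count roughly as follows. This is only a sketch of the parsing logic in the linked `LocalSchedulerBackend` line, not Spark's actual code; the function name `parse_local_master` and the `available_cores` parameter are invented for the example.

```python
import re

def parse_local_master(master, available_cores=8):
    """Sketch of how a Spark-style local master URL maps to a thread count.

    'local'     -> 1 executor thread
    'local[k]'  -> k executor threads
    'local[*]'  -> one thread per available core (here a plain parameter;
                   Spark itself asks the JVM for the core count)
    """
    if master == "local":
        return 1
    m = re.match(r"local\[(\d+|\*)\]$", master)
    if not m:
        raise ValueError(f"unsupported master URL: {master}")
    return available_cores if m.group(1) == "*" else int(m.group(1))
```

So `--master local[4]` gives you one JVM with a driver and a single executor running 4 task threads, and `local[*]` sizes that thread pool to the machine's cores.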

// maropu


On Sun, Jun 19, 2016 at 5:39 PM, Ashok Kumar <ashok34668@yahoo.com.invalid>
wrote:

> Hi,
>
> I have been told Spark in local mode is simplest for testing. The Spark
> documentation says little about local mode beyond the number of cores used
> in --master local[k].
>
> Where are the driver program, the executor, and the resources? Do I need to
> start worker threads, and how many applications can I run safely without
> exceeding the allocated memory?
>
> Thanking you


-- 
---
Takeshi Yamamuro
