spark-user mailing list archives

From Ashok Kumar <ashok34...@yahoo.com.INVALID>
Subject Re: Running Spark in local mode
Date Sun, 19 Jun 2016 09:14:39 GMT
thank you.
What are the main differences between local mode and standalone mode? I understand local
mode does not support a cluster. Is that the only difference?
 

    On Sunday, 19 June 2016, 9:52, Takeshi Yamamuro <linguin.m.s@gmail.com> wrote:
 

Hi,
In local mode, Spark runs in a single JVM that hosts both the driver (acting as master) and one executor with `k` threads:
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/local/LocalSchedulerBackend.scala#L94
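To make this concrete, here is a minimal sketch of starting Spark in local mode from Scala. The app name and thread count are arbitrary examples, and it assumes Spark is already on the classpath:

```scala
// Minimal sketch (illustrative names): running Spark in local mode.
// "local[4]" starts one JVM containing the driver and a single
// executor backed by 4 task threads; "local[*]" would use all cores.
import org.apache.spark.{SparkConf, SparkContext}

object LocalModeExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("LocalModeExample")   // hypothetical app name
      .setMaster("local[4]")            // k = 4 task threads

    val sc = new SparkContext(conf)

    // Everything below runs inside this same JVM -- there is no
    // separate worker process to start, unlike standalone mode.
    val sum = sc.parallelize(1 to 100).reduce(_ + _)
    println(s"sum = $sum")

    sc.stop()
  }
}
```

Because the executor shares the driver's JVM here, its memory is effectively the driver heap, which you set at launch time (e.g. `--driver-memory 2g` with spark-submit). In standalone mode, by contrast, separate master and worker daemons run in their own JVMs and you point `setMaster` at a `spark://host:port` URL.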

// maropu

On Sun, Jun 19, 2016 at 5:39 PM, Ashok Kumar <ashok34668@yahoo.com.invalid> wrote:

Hi,
I have been told Spark in local mode is the simplest setup for testing. The Spark documentation covers little
on local mode beyond the number of cores used in --master local[k].
Where do the driver program, executor, and resources live? Do I need to start worker threads,
and how many applications can I run safely without exceeding the allocated memory, etc.?
Thanking you





-- 
---
Takeshi Yamamuro


  