spark-user mailing list archives

From Takeshi Yamamuro <linguin....@gmail.com>
Subject Re: Running Spark in local mode
Date Sun, 19 Jun 2016 09:39:37 GMT
There are many technical differences internally, but the two modes are used in
almost the same way.
Yes, in standalone mode Spark runs as a cluster; see
http://spark.apache.org/docs/1.6.1/cluster-overview.html
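
The same application code can target either mode just by changing the master URL. A minimal Scala sketch (the app name and "master-host:7077" are placeholders, not values from this thread):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Local mode: master and a single executor with 4 worker threads, all inside this JVM
val localConf = new SparkConf().setAppName("demo").setMaster("local[4]")

// Standalone mode: identical application code, pointed at a standalone cluster master
// ("spark://master-host:7077" is a placeholder for a real master URL)
val standaloneConf = new SparkConf().setAppName("demo").setMaster("spark://master-host:7077")

// Everything else (job submission, RDD operations) is the same in both modes
val sc = new SparkContext(localConf)
println(sc.parallelize(1 to 100).sum())  // computed on the local threads
sc.stop()
```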

// maropu

On Sun, Jun 19, 2016 at 6:14 PM, Ashok Kumar <ashok34668@yahoo.com> wrote:

> thank you
>
> What are the main differences between local mode and standalone mode? I
> understand that local mode does not run on a cluster. Is that the only
> difference?
>
>
>
> On Sunday, 19 June 2016, 9:52, Takeshi Yamamuro <linguin.m.s@gmail.com>
> wrote:
>
>
> Hi,
>
> In a local mode, spark runs in a single JVM that has a master and one
> executor with `k` threads.
>
> https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/local/LocalSchedulerBackend.scala#L94
>
> // maropu
>
>
> On Sun, Jun 19, 2016 at 5:39 PM, Ashok Kumar <ashok34668@yahoo.com.invalid
> > wrote:
>
> Hi,
>
> I have been told that Spark in local mode is the simplest setup for testing.
> The Spark documentation says little about local mode beyond the number of
> cores used in --master local[k].
>
> Where are the driver program, the executor, and the resources? Do I need to
> start worker threads, and how many applications can I run safely without
> exceeding the allocated memory?
>
> Thanking you
>
>
>
>
>
> --
> ---
> Takeshi Yamamuro
>
>
>
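
To make the single-JVM picture above concrete, here is a minimal local-mode session (a sketch, assuming the Spark dependency is on the classpath; note that since the driver, master, and executor share one JVM, memory is bounded by that JVM's heap, which must be set at launch, e.g. via --driver-memory):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Driver, master, and the single executor all live in this one JVM;
// "local[2]" gives the executor 2 worker threads.
val sc = new SparkContext(new SparkConf().setAppName("local-demo").setMaster("local[2]"))
println(sc.parallelize(1 to 10).sum())  // executed entirely by the local threads
sc.stop()
```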


-- 
---
Takeshi Yamamuro
