spark-user mailing list archives

From Sean Owen <so...@cloudera.com>
Subject Re: Configuring Spark Memory
Date Wed, 23 Jul 2014 15:22:45 GMT
On Wed, Jul 23, 2014 at 4:19 PM, Andrew Ash <andrew@andrewash.com> wrote:
>
>
> In standalone mode, each SparkContext you initialize gets its own set of
> executors across the cluster.  So for example if you have two shells open,
> they'll each get two JVMs on each worker machine in the cluster.
>

Dumb question offline -- do you mean they'll each get one JVM on each
worker? Or, if it's two, what drives the two per worker?
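(For context: in standalone mode the worker-side knobs that control how many executor JVMs appear per machine, and how much memory they may use, live in `conf/spark-env.sh`. A minimal sketch, using standalone-mode properties from the Spark docs; the values shown here are illustrative, not recommendations:)

```shell
# conf/spark-env.sh on each worker machine (standalone mode)

# Total memory this worker may hand out to executors on this machine.
export SPARK_WORKER_MEMORY=16g

# Number of worker JVMs to launch per machine (default is 1).
# With the default of 1, each SparkContext gets ONE executor JVM per
# machine; setting this to 2 is what would produce two JVMs per machine
# for each running application.
export SPARK_WORKER_INSTANCES=2

# Per-application executor size is set separately, e.g. in
# conf/spark-defaults.conf:
#   spark.executor.memory  4g
```

So with the default `SPARK_WORKER_INSTANCES=1`, two open shells would mean two executor JVMs per machine in total (one per application), not two per application.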
