spark-user mailing list archives

From Soumya Simanta <soumya.sima...@gmail.com>
Subject Re: run multiple spark applications in parallel
Date Wed, 29 Oct 2014 00:31:28 GMT
Maybe changing --master yarn-cluster to --master yarn-client will help.
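For example, a minimal sketch of submitting two applications concurrently in yarn-client mode, reusing the placeholder CLASSNAME / UBER.JAR / ZK address from the command quoted below (my-consumer-group2 is an assumed second consumer-group name; the spark-submit flags are the standard ones):

```shell
#!/bin/sh
# In yarn-client mode the drivers run locally, so YARN only needs to
# allocate containers for the executors, not for a second ApplicationMaster
# plus driver per app. Launch both in the background and wait.

./bin/spark-submit --class CLASSNAME --master yarn-client \
  --driver-memory 1g --executor-memory 1g --executor-cores 1 \
  UBER.JAR "${ZK_PORT_2181_TCP_ADDR}" my-consumer-group1 1 &

./bin/spark-submit --class CLASSNAME --master yarn-client \
  --driver-memory 1g --executor-memory 1g --executor-cores 1 \
  UBER.JAR "${ZK_PORT_2181_TCP_ADDR}" my-consumer-group2 1 &

wait
```

If the second application still sits in ACCEPTED state, it is worth checking the YARN queue settings as well; with the CapacityScheduler, yarn.scheduler.capacity.maximum-am-resource-percent caps how much of the queue ApplicationMasters may use, which can block a second app even when the box has spare cores and memory.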


On Tue, Oct 28, 2014 at 7:25 PM, Josh J <joshjdevl@gmail.com> wrote:

> Sorry, I should've included some stats with my email
>
> I execute each job in the following manner
>
> ./bin/spark-submit --class CLASSNAME --master yarn-cluster --driver-memory
> 1g --executor-memory 1g --executor-cores 1 UBER.JAR
> ${ZK_PORT_2181_TCP_ADDR} my-consumer-group1 1
>
>
> The box has
>
> 24 CPUs, Intel(R) Xeon(R) CPU E5-2420 v2 @ 2.20GHz
>
> 32 GB RAM
>
>
> Thanks,
>
> Josh
>
> On Tue, Oct 28, 2014 at 4:15 PM, Soumya Simanta <soumya.simanta@gmail.com>
> wrote:
>
>> Try reducing the resources (cores and memory) of each application.
>>
>>
>>
>> > On Oct 28, 2014, at 7:05 PM, Josh J <joshjdevl@gmail.com> wrote:
>> >
>> > Hi,
>> >
>> > How do I run multiple Spark applications in parallel? I tried to run them on a
>> yarn cluster, but the second application submitted does not run.
>> >
>> > Thanks,
>> > Josh
>>
>
>
