spark-user mailing list archives

From Rachana Srivastava <Rachana.Srivast...@markmonitor.com>
Subject RE: Unable to increase Active Tasks of a Spark Streaming Process in Yarn
Date Wed, 22 Jun 2016 18:01:01 GMT
Not sure why the number of active jobs is always 1, regardless of the number of partitions, executors, etc.  Can anyone please guide me on what drives this Active Jobs count?

[attachment: image003.png]

From: Rachana Srivastava
Sent: Wednesday, June 22, 2016 10:33 AM
To: 'user@spark.apache.org'; 'dev@spark.apache.org'
Subject: RE: Unable to increase Active Tasks of a Spark Streaming Process in Yarn

Here are some more details; any pointer is really appreciated:

I have configured the number of partitions at the Kafka level as 40 and the number of repartitions for Spark Streaming as 40.  I have disabled dynamic allocation for Spark.
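
For reference, here is a minimal sketch of what a setup like the one described above might look like, assuming the Spark 1.x direct Kafka stream API; the broker list, topic name, batch interval, and application name are placeholders, not taken from the original job.

    import kafka.serializer.StringDecoder
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    object KafkaStreamingSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("kafka-streaming-sketch") // placeholder name
        val ssc  = new StreamingContext(conf, Seconds(10))              // batch interval is assumed

        // Direct stream: Spark creates one RDD partition per Kafka partition (40 in this case).
        val kafkaParams = Map("metadata.broker.list" -> "broker1:9092") // placeholder brokers
        val topics      = Set("events")                                 // placeholder topic, assumed to have 40 partitions
        val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
          ssc, kafkaParams, topics)

        // Explicit repartition to 40, as described in the mail; each resulting
        // partition becomes one task in the downstream stage.
        stream.repartition(40).foreachRDD { rdd =>
          println(s"partitions: ${rdd.partitions.length}, events: ${rdd.count()}")
        }

        ssc.start()
        ssc.awaitTermination()
      }
    }

With this kind of setup, a stage processing a 40-partition RDD should be able to run up to 40 tasks in parallel, provided the executors together expose at least that many cores.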

From: Rachana Srivastava
Sent: Wednesday, June 22, 2016 8:44 AM
To: 'user@spark.apache.org'; 'dev@spark.apache.org'
Subject: Unable to increase Active Tasks of a Spark Streaming Process in Yarn

Hello all,

I am running a Spark Streaming process where I receive a batch of 6000 events, but when I look at the executors, only one active task is running.  I tried dynamic allocation as well as setting the number of executors, etc.  Even with 15 executors, only one active task is running at a
time.  Can anyone please guide me on what I am doing wrong here?

[attachment: image004.png]

[attachment: image005.png]
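
As a point of comparison, here is a sketch of the two executor-sizing approaches mentioned above (fixed executor count versus dynamic allocation) on YARN; the values and application names are illustrative, and only the configuration keys themselves are standard Spark settings. The snippet can be pasted into spark-shell.

    import org.apache.spark.SparkConf

    // Fixed sizing: dynamic allocation off, explicit executor count.
    // The number of tasks that can run concurrently is bounded by
    // (executors) x (cores per executor), and within a single stage also
    // by the number of partitions of the RDD being processed.
    val fixedConf = new SparkConf()
      .setAppName("streaming-fixed-executors")        // placeholder app name
      .set("spark.dynamicAllocation.enabled", "false")
      .set("spark.executor.instances", "15")           // 15 executors, as in the mail
      .set("spark.executor.cores", "2")                // assumed value

    // Dynamic allocation: requires the external shuffle service on YARN.
    val dynamicConf = new SparkConf()
      .setAppName("streaming-dynamic-allocation")      // placeholder app name
      .set("spark.dynamicAllocation.enabled", "true")
      .set("spark.shuffle.service.enabled", "true")
      .set("spark.dynamicAllocation.minExecutors", "1")
      .set("spark.dynamicAllocation.maxExecutors", "15")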
