spark-user mailing list archives

From Vadim Semenov <va...@datadoghq.com>
Subject Re: Spark on YARN, HowTo kill executor or individual task?
Date Tue, 12 Feb 2019 18:39:01 GMT
Yeah, then the easiest would be to fork Spark and run using the forked
version; in the case of YARN it should be pretty easy to do.

git clone https://github.com/apache/spark.git

cd spark

export MAVEN_OPTS="-Xmx4g -XX:ReservedCodeCacheSize=512m"

./build/mvn -DskipTests clean package

./dev/make-distribution.sh --name custom-spark --tgz -Phadoop-2.7 -Phive -Pyarn

ls -la spark-2.4.0-SNAPSHOT-bin-custom-spark.tgz

scp spark-2.4.0-SNAPSHOT-bin-custom-spark.tgz cluster:/tmp

export SPARK_HOME="/tmp/spark-2.3.0-SNAPSHOT-bin-custom-spark"

cd $SPARK_HOME
mv conf conf.new
ln -s /etc/spark/conf conf

echo $SPARK_HOME
spark-submit --version
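
Then a quick sanity check that YARN actually picks up the forked build,
e.g. with the bundled SparkPi example (master and deploy-mode here are
just an illustration):

$SPARK_HOME/bin/spark-submit \
  --master yarn \
  --deploy-mode client \
  --class org.apache.spark.examples.SparkPi \
  $SPARK_HOME/examples/jars/spark-examples_*.jar 100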

On Tue, Feb 12, 2019 at 6:40 AM Serega Sheypak <serega.sheypak@gmail.com> wrote:
>
> I tried a similar approach; it works well for user functions, but I need
> to crash tasks or executors when the Spark application runs "repartition". I
> didn't find any way to inject a "poison pill" into the repartition call :(
>
> On Mon, Feb 11, 2019 at 21:19, Vadim Semenov <vadim@datadoghq.com> wrote:
>>
>> something like this:
>>
>> import org.apache.spark.TaskContext
>> ds.map(r => {
>>   val taskContext = TaskContext.get()
>>   if (taskContext.partitionId == 1000) {
>>     throw new RuntimeException
>>   }
>>   r
>> })
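>>
>> And if you want to take down the whole executor rather than a single
>> task, the same trick can call System.exit instead (a sketch: exiting
>> the JVM from a task kills the executor process, not just that task):
>>
>> import org.apache.spark.TaskContext
>> ds.map(r => {
>>   if (TaskContext.get().partitionId == 1000) {
>>     System.exit(1) // assumption: this brings down the whole executor JVM
>>   }
>>   r
>> })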
>>
>> On Mon, Feb 11, 2019 at 8:41 AM Serega Sheypak <serega.sheypak@gmail.com> wrote:
>> >
>> > I need to crash a task which does the repartition.
>> >
>> > On Mon, Feb 11, 2019 at 10:37, Gabor Somogyi <gabor.g.somogyi@gmail.com> wrote:
>> >>
>> >> What blocks you from putting if conditions inside the mentioned map function?
>> >>
>> >> On Mon, Feb 11, 2019 at 10:31 AM Serega Sheypak <serega.sheypak@gmail.com> wrote:
>> >>>
>> >>> Yeah, but I don't need to crash the entire app; I want to fail several
>> >>> tasks or executors and then wait for completion.
>> >>>
>> >>> On Sun, Feb 10, 2019 at 21:49, Gabor Somogyi <gabor.g.somogyi@gmail.com> wrote:
>> >>>>
>> >>>> Another approach is adding an artificial exception to the
>> >>>> application's source code, like this:
>> >>>>
>> >>>> val query = input.toDS.map(_ / 0).writeStream.format("console").start()
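>> >>>>
>> >>>> The same idea works in a plain batch job, e.g. (a sketch, assuming a
>> >>>> SparkSession named spark; the division by zero throws an
>> >>>> ArithmeticException inside whichever task processes the data):
>> >>>>
>> >>>> import spark.implicits._
>> >>>> spark.range(100).map(_ / 0).count()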
>> >>>>
>> >>>> G
>> >>>>
>> >>>>
>> >>>> On Sun, Feb 10, 2019 at 9:36 PM Serega Sheypak <serega.sheypak@gmail.com> wrote:
>> >>>>>
>> >>>>> Hi BR,
>> >>>>> thanks for your reply. I want to mimic the issue and kill tasks at
>> >>>>> a certain stage. Killing an executor is also an option for me.
>> >>>>> I'm curious: how do core Spark contributors test Spark's fault
>> >>>>> tolerance?
>> >>>>>
>> >>>>>
>> >>>>> On Sun, Feb 10, 2019 at 16:57, Gabor Somogyi <gabor.g.somogyi@gmail.com> wrote:
>> >>>>>>
>> >>>>>> Hi Serega,
>> >>>>>>
>> >>>>>> If I understand your problem correctly, you would like to kill
>> >>>>>> one executor only and leave the rest of the app untouched.
>> >>>>>> If that's true, yarn application -kill is not what you want,
>> >>>>>> because it stops the whole application.
>> >>>>>>
>> >>>>>> I've done a similar thing when testing Spark's HA features:
>> >>>>>> - jps -vlm | grep "org.apache.spark.executor.CoarseGrainedExecutorBackend.*applicationid"
>> >>>>>> - kill -9 pidofoneexecutor
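>> >>>>>>
>> >>>>>> Or as a one-liner (a sketch; it assumes the grep pattern matches
>> >>>>>> exactly one executor process on the node):
>> >>>>>>
>> >>>>>> jps -vlm | grep "CoarseGrainedExecutorBackend.*applicationid" | awk '{print $1}' | xargs kill -9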
>> >>>>>>
>> >>>>>> Be aware that on a multi-node cluster you should check whether at
>> >>>>>> least one executor process runs on the node you're on (it's not
>> >>>>>> required to).
>> >>>>>> Happy killing...
>> >>>>>>
>> >>>>>> BR,
>> >>>>>> G
>> >>>>>>
>> >>>>>>
>> >>>>>> On Sun, Feb 10, 2019 at 4:19 PM Jörn Franke <jornfranke@gmail.com> wrote:
>> >>>>>>>
>> >>>>>>> yarn application -kill applicationid ?
>> >>>>>>>
>> >>>>>>> > On Feb 10, 2019, at 13:30, Serega Sheypak <serega.sheypak@gmail.com> wrote:
>> >>>>>>> >
>> >>>>>>> > Hi there!
>> >>>>>>> > I have a weird issue that appears only when tasks fail at a
>> >>>>>>> > specific stage. I would like to imitate the failure on my own.
>> >>>>>>> > The plan is to run the problematic app and then kill an entire
>> >>>>>>> > executor or some tasks when execution reaches a certain stage.
>> >>>>>>> >
>> >>>>>>> > Is it doable?
>> >>>>>>>
>> >>>>>>>
---------------------------------------------------------------------
>> >>>>>>> To unsubscribe e-mail: user-unsubscribe@spark.apache.org
>> >>>>>>>
>>
>>
>> --
>> Sent from my iPhone



-- 
Sent from my iPhone
