spark-user mailing list archives

From Serega Sheypak <serega.shey...@gmail.com>
Subject Kill spark executor when spark runs specific stage
Date Wed, 04 Jul 2018 17:04:33 GMT
Hi, I'm running Spark on YARN. My code is very simple. I want to kill one
executor while "data.repartition(10)" is being executed. How can I do that in
an easy way?


import java.nio.ByteBuffer

import org.apache.avro.mapred.AvroKey
import org.apache.avro.mapreduce.AvroKeyOutputFormat
import org.apache.hadoop.io.{BytesWritable, NullWritable}

// Read the sequence file and deserialize each record (Data is our own class).
val data = sc.sequenceFile[NullWritable, BytesWritable](inputPath)
  .map { case (_, value) => Data.fromBytes(value) }

// Kill one executor here, while the repartition stage is running.
val process = data.repartition(10)

process
  .map { d =>
    val bytes = d.toByteArray
    (new AvroKey(ByteBuffer.wrap(bytes)), NullWritable.get())
  }
  .saveAsNewAPIHadoopFile[AvroKeyOutputFormat[ByteBuffer]](outputPath)
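
Would something along these lines be reasonable? It's only a rough sketch I put
together, not something I've verified: it assumes the stage name contains
"repartition" (from the call site), that SparkContext.killExecutor works against
our YARN backend, and the KillOneExecutorListener class name is just for
illustration.

import org.apache.spark.SparkContext
import org.apache.spark.scheduler._
import scala.collection.concurrent.TrieMap

// Sketch: remember executor ids as they register and, the first time a stage
// whose name mentions "repartition" is submitted, ask the cluster manager to
// kill one of them. killExecutor is a developer API, so this may need adjusting.
class KillOneExecutorListener(sc: SparkContext) extends SparkListener {
  private val executorIds = TrieMap.empty[String, Unit]
  @volatile private var killed = false

  // Executors that registered before this listener was added won't be tracked.
  override def onExecutorAdded(e: SparkListenerExecutorAdded): Unit =
    executorIds.put(e.executorId, ())

  override def onExecutorRemoved(e: SparkListenerExecutorRemoved): Unit =
    executorIds.remove(e.executorId)

  override def onStageSubmitted(s: SparkListenerStageSubmitted): Unit =
    if (!killed && s.stageInfo.name.contains("repartition")) {
      executorIds.keys.headOption.foreach { id =>
        killed = true
        sc.killExecutor(id) // returns false if the backend refuses the request
      }
    }
}

// Registered before running the job:
// sc.addSparkListener(new KillOneExecutorListener(sc))

Or is there a simpler trick people use for this kind of failure testing?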
