spark-user mailing list archives

From Burak Yavuz <brk...@gmail.com>
Subject Re: How to cause a stage to fail (using spark-shell)?
Date Sun, 19 Jun 2016 03:25:35 GMT
Hi Jacek,

Can't you simply have a mapPartitions task throw an exception or something?
Are you trying to do something more esoteric?
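A minimal sketch of that idea in spark-shell (assuming the predefined `sc` SparkContext; the RDD contents, partition count, and exception message are arbitrary):

```scala
// Make every task in the stage throw, so the stage fails
// once the tasks exhaust their retries.
sc.parallelize(1 to 100, numSlices = 4)
  .mapPartitions[Int] { _ =>
    throw new RuntimeException("deliberate task failure")
  }
  .count()
```

Each task fails, Spark retries it up to spark.task.maxFailures times (4 by default), and the stage and its job are then marked as failed, which shows up under Failed Stages in the web UI.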

Best,
Burak

On Sat, Jun 18, 2016 at 5:35 AM, Jacek Laskowski <jacek@japila.pl> wrote:

> Hi,
>
> Following up on this question, is a stage considered failed only when
> there is a FetchFailed exception? Can I have a failed stage with only
> a single-stage job?
>
> Appreciate any help on this...(as my family doesn't like me spending
> the weekend with Spark :))
>
> Pozdrawiam,
> Jacek Laskowski
> ----
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Sat, Jun 18, 2016 at 11:53 AM, Jacek Laskowski <jacek@japila.pl> wrote:
> > Hi,
> >
> > I'm trying to see some stats about failing stages in the web UI and want
> > to "create" a few failed stages. Is this possible using spark-shell at
> > all? Which setup of Spark/spark-shell would allow for such a scenario?
> >
> > I could write Scala code if that's the only way to have failing stages.
> >
> > Please guide. Thanks.
> >
> > /me on to reviewing the Spark code...
> >
> > Pozdrawiam,
> > Jacek Laskowski
> > ----
> > https://medium.com/@jaceklaskowski/
> > Mastering Apache Spark http://bit.ly/mastering-apache-spark
> > Follow me at https://twitter.com/jaceklaskowski
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>
>
