spark-user mailing list archives

From Davies Liu <dav...@databricks.com>
Subject Re: Spark 1.2. loses often all executors
Date Fri, 20 Mar 2015 18:19:00 GMT
This may be related to a bug in 1.2 [1]. It is fixed in 1.2.2 (not yet
released); could you check out the 1.2 branch and verify that?

[1] https://issues.apache.org/jira/browse/SPARK-5788
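To try the unreleased fix, one approach is to build Spark from the 1.2 maintenance branch yourself. The commands below are a sketch, assuming a JDK, git, and Maven are installed; `branch-1.2` is the Apache Spark maintenance branch name for the 1.2.x line.

```shell
# Fetch the Apache Spark source and switch to the 1.2 maintenance branch,
# which contains fixes slated for the (unreleased) 1.2.2.
git clone https://github.com/apache/spark.git
cd spark
git checkout branch-1.2

# Build the distribution, skipping tests for speed.
mvn -DskipTests clean package
```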

On Fri, Mar 20, 2015 at 3:21 AM, mrm <maria@skimlinks.com> wrote:
> Hi,
>
> I recently changed from Spark 1.1 to Spark 1.2, and I noticed that it
> loses all executors whenever my Python code has a bug (such as looking up
> a key in a dictionary that does not exist). In earlier versions, it would
> raise an exception but would not lose all the executors.
>
> Anybody with a similar problem?
>
>
>
> --
> View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-1-2-loses-often-all-executors-tp22162.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>
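For reference, the failure mode described above can be reproduced with an ordinary Python `KeyError`; a quick workaround on the user side is to guard the lookup with `dict.get`. The sketch below is plain Python (no Spark required) with an illustrative, made-up lookup table; inside a PySpark `map()` the fragile version would raise on the executor.

```python
# Illustrative lookup table (hypothetical data, not from the thread).
lookup = {"a": 1, "b": 2}

def fragile(key):
    # Raises KeyError for unseen keys; in a Spark task this fails the task.
    return lookup[key]

def safe(key, default=0):
    # Returns a default instead of raising, so the task can complete.
    return lookup.get(key, default)

records = ["a", "b", "c"]
results = [safe(k) for k in records]
print(results)  # [1, 2, 0]
```

Guarding the lookup avoids the crash regardless of Spark version, though the 1.2 executor-loss behavior itself is what SPARK-5788 addresses.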


