spark-user mailing list archives

From Grega Kešpret <gr...@celtra.com>
Subject Re: sbt run with spark.ContextCleaner ERROR
Date Wed, 01 Oct 2014 07:03:40 GMT
We have the same problem. We call sc.stop() just before the end of the
application on the driver, and we get:

ERROR: nio.AbstractNioSelector: Interrupted while wait for resources to be
released

This only happens when running the application through sbt. Is there a JIRA
for this already? I'd be happy to help with this, but I would need some
initial direction on where to look.

We are using Spark 1.1.

AFAIK this didn't happen when we were using Spark 0.9.
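
For context, our shutdown path is nothing special. A stripped-down sketch of
the pattern we use (the object, app and master names below are placeholders,
not our actual code):

import org.apache.spark.{SparkConf, SparkContext}

// Minimal driver: run a trivial job, then stop the context right before exit.
object CleanerRepro {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("context-cleaner-repro")
      .setMaster("local[2]")
    val sc = new SparkContext(conf)

    // A trivial job so the ContextCleaner has something to track.
    val sum = sc.parallelize(1 to 1000).map(_ * 2).reduce(_ + _)
    println("sum = " + sum)

    // Calling stop() just before the end of main() is where we see the
    // nio.AbstractNioSelector error, and only when launched via `sbt run`.
    sc.stop()
  }
}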


Grega
--
*Grega Kešpret*
Senior Software Engineer, Analytics

M: +386.40.831.938 | Skype: gregakespret
celtra.com <http://www.celtra.com/> | @celtramobile
<http://www.twitter.com/celtramobile>

On Thu, May 8, 2014 at 12:13 AM, Nan Zhu <zhunanmcgill@gmail.com> wrote:

>  same problem +1,
>
> though it does not change the program result
>
> --
> Nan Zhu
>
> On Tuesday, May 6, 2014 at 11:58 PM, Tathagata Das wrote:
>
> Okay, this needs to be fixed. Thanks for reporting this!
>
>
>
> On Mon, May 5, 2014 at 11:00 PM, wxhsdp <wxhsdp@gmail.com> wrote:
>
> Hi, TD
>
> I tried on v1.0.0-rc3 and still got the error.
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/sbt-run-with-spark-ContextCleaner-ERROR-tp5304p5421.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
