spark-user mailing list archives

From Jacek Laskowski <ja...@japila.pl>
Subject Re: Application Timeout
Date Thu, 21 Jan 2021 11:39:43 GMT
Hi Brett,

No idea why it happens, but I got curious about the "Cores" column showing 0.
Is this always the case?

Pozdrawiam,
Jacek Laskowski
----
https://about.me/JacekLaskowski
"The Internals Of" Online Books <https://books.japila.pl/>
Follow me on https://twitter.com/jaceklaskowski

On Tue, Jan 19, 2021 at 11:27 PM Brett Spark <blarsonspark@gmail.com> wrote:

> Hello!
> When using Spark Standalone with Spark 2.4.4 / 3.0.0, we are seeing our
> standalone Spark "applications" time out and show as "Finished" after
> around an hour.
>
> Here is a screenshot from the Spark master before it's marked as finished.
> [image: image.png]
> Here is a screenshot from the Spark master after it's marked as finished.
> (After over an hour of idle time).
> [image: image.png]
> Here are the logs from the Spark Master / Worker:
>
> spark-master-2d733568b2a7e82de7b2b09b6daa17e9-7cd4cfcddb-f84q7 master
> 2021-01-19 21:55:47,282 INFO master.Master: 172.32.3.66:34570 got
> disassociated, removing it.
> spark-master-2d733568b2a7e82de7b2b09b6daa17e9-7cd4cfcddb-f84q7 master
> 2021-01-19 21:55:52,095 INFO master.Master: 172.32.115.115:36556 got
> disassociated, removing it.
> spark-master-2d733568b2a7e82de7b2b09b6daa17e9-7cd4cfcddb-f84q7 master
> 2021-01-19 21:55:52,095 INFO master.Master: 172.32.115.115:37305 got
> disassociated, removing it.
> spark-master-2d733568b2a7e82de7b2b09b6daa17e9-7cd4cfcddb-f84q7 master
> 2021-01-19 21:55:52,096 INFO master.Master: Removing app
> app-20210119204911-0000
> spark-worker-2d733568b2a7e82de7b2b09b6daa17e9-7bbb75f9b6-8mv2b worker
> 2021-01-19 21:55:52,112 INFO shuffle.ExternalShuffleBlockResolver:
> Application app-20210119204911-0000 removed, cleanupLocalDirs = true
>
> Is there a setting that causes an application to time out after an hour of
> the Spark application or Spark worker being idle?
>
> I would like to keep our Spark applications alive as long as possible.
>
> I haven't been able to find a setting in the Spark configuration
> documentation that corresponds to this, so I'm wondering if it's something
> that's hard-coded.
>
> Please let me know,
> Thank you!
>
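A minimal sketch of the standalone timeout-related properties worth ruling out
first. The master URL below is a placeholder and the values are illustrative;
none of these keys defaults to one hour, so this is a diagnostic starting point
rather than a confirmed cause of the behaviour above:

  import org.apache.spark.sql.SparkSession

  // Raise the general RPC idle timeout (default 120s) and keep the executor
  // heartbeat interval (default 10s) well below it, as the docs require.
  val spark = SparkSession.builder()
    .master("spark://<master-host>:7077")              // placeholder master URL
    .appName("keep-alive-check")
    .config("spark.network.timeout", "600s")
    .config("spark.executor.heartbeatInterval", "30s")
    .getOrCreate()

  // Note: spark.worker.timeout (default 60s) is read by the master, so it must
  // be set in the master's spark-defaults.conf or SPARK_MASTER_OPTS, not on
  // the application's SparkConf.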
