spark-dev mailing list archives

From Sean Owen <sro...@gmail.com>
Subject Re: adding shutdownmanagerhook to spark.
Date Mon, 13 May 2019 23:16:01 GMT
Spark just adds a hook to the mechanism that Hadoop exposes. You can
do the same. You shouldn't use Spark's.
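[A minimal sketch of the approach Sean describes, not from the thread: registering a hook directly on the JVM mechanism that Hadoop's ShutdownHookManager itself builds on. The hook body and the priority value are illustrative only.]

```scala
object ShutdownHookExample {
  def main(args: Array[String]): Unit = {
    // Plain JVM shutdown hook -- runs when the JVM exits normally
    // or on SIGTERM/SIGINT. No Spark-private API needed.
    val hook = new Thread(() => println("cleaning up resources"))
    Runtime.getRuntime.addShutdownHook(hook)

    // If hadoop-common is on the classpath, Hadoop's manager wraps the
    // same mechanism and adds priority ordering (higher runs earlier):
    //   org.apache.hadoop.util.ShutdownHookManager.get()
    //     .addShutdownHook(() => println("cleaning up"), 50)
  }
}
```

A hook registered this way can also be deregistered with `Runtime.getRuntime.removeShutdownHook(hook)` before shutdown begins, which is useful if the cleanup is only needed while a resource is actually held.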

On Mon, May 13, 2019 at 6:11 PM Nasrulla Khan Haris
<Nasrulla.Khan@microsoft.com.invalid> wrote:
>
> HI All,
>
>
>
> I am trying to add a shutdown hook, but it looks like the ShutdownHookManager object is restricted to the spark package only. Is there any other API that can help me do this?
>
> https://github.com/apache/spark/blob/v2.4.0/core/src/main/scala/org/apache/spark/util/ShutdownHookManager.scala
>
>
>
> I can see that HiveServer2 in https://github.com/apache/spark/commit/515708d5f33d5acdb4206c626192d1838f8e691f
uses ShutdownHookManager from a different package, but it expects me to have the package name
“org.apache.spark”.
>
>
>
> Thanks,
>
> Nasrulla
>
>

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org

