spark-user mailing list archives

From Nirav Patel <npa...@xactlycorp.com>
Subject Sharing spark executor pool across multiple long running spark applications
Date Tue, 06 Feb 2018 20:00:16 GMT
Currently a SparkContext and its executor pool are not shareable. Each
SparkContext gets its own executor pool for the entire life of an application.
So what is the best way to share cluster resources across multiple long-running
Spark applications?

The only option I see is Spark dynamic allocation, but it has high latency
when it comes to real-time applications.
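For reference, this is the kind of spark-defaults.conf setup I mean by dynamic allocation (the numeric values here are illustrative, not tuned recommendations). The schedulerBacklogTimeout setting is what governs how quickly extra executors are requested once tasks start queuing, which is where the latency shows up:

```properties
# Enable dynamic allocation; the external shuffle service is required
# so executors can be removed without losing shuffle data.
spark.dynamicAllocation.enabled              true
spark.shuffle.service.enabled                true

# Bounds on the executor pool (example values).
spark.dynamicAllocation.minExecutors         2
spark.dynamicAllocation.maxExecutors         20

# Release executors idle for this long.
spark.dynamicAllocation.executorIdleTimeout  60s

# Request new executors after tasks have been backlogged this long --
# lowering this reduces scale-up latency at the cost of churn.
spark.dynamicAllocation.schedulerBacklogTimeout  1s
```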

