spark-user mailing list archives

From "Varshney, Vaibhav" <>
Subject RE: [Spark 3.0 Kubernetes] Does Spark 3.0 support production deployment
Date Thu, 09 Jul 2020 20:30:11 GMT
Thanks for the response. We have tried it in a dev environment. For production, if Spark 3.0 is not
leveraging the K8s scheduler, would the Spark cluster in K8s be "static"?
As per the doc, it seems this is still a blocker for
production workloads?
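For context, the cluster need not be static: Spark 3.0 added shuffle tracking (SPARK-27963), which lets dynamic allocation work on K8s without an external shuffle service, so executors can scale with load. A minimal sketch of a submission enabling it; the API server host, namespace, image name, and jar path are placeholders, not values from this thread:

```shell
# Sketch: submit the SparkPi example to a Kubernetes cluster with
# dynamic allocation enabled via shuffle tracking (Spark 3.0+).
# <k8s-apiserver-host>, the namespace, and the image are assumptions.
spark-submit \
  --master k8s://https://<k8s-apiserver-host>:6443 \
  --deploy-mode cluster \
  --name spark-pi \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.kubernetes.namespace=spark \
  --conf spark.kubernetes.container.image=<your-registry>/spark:3.0.0 \
  --conf spark.executor.instances=2 \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.shuffleTracking.enabled=true \
  local:///opt/spark/examples/jars/spark-examples_2.12-3.0.0.jar
```

With these two `spark.dynamicAllocation.*` settings, Spark requests and releases executor pods as stage demand changes, rather than holding a fixed pool for the job's lifetime.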

Vaibhav V

-----Original Message-----
From: Sean Owen <> 
Sent: Thursday, July 9, 2020 3:20 PM
To: Varshney, Vaibhav (DI SW CAS MP AFC ARC) <>
Cc: Ramani, Sai (DI SW CAS MP AFC ARC) <>
Subject: Re: [Spark 3.0 Kubernetes] Does Spark 3.0 support production deployment

I haven't used the K8S scheduler personally, but, just based on that comment I wouldn't worry
too much. It's been around for several versions and AFAIK works fine in general. We sometimes
aren't so great about removing "experimental" labels. That said I know there are still some
things that could be added to it and more work going on, and maybe people closer to that work
can comment. But yeah you shouldn't be afraid to try it.

On Thu, Jul 9, 2020 at 3:18 PM Varshney, Vaibhav <> wrote:
> Hi Spark Experts,
> We are trying to deploy spark on Kubernetes.
> As per the doc, it looks like K8s deployment is experimental:
> "The Kubernetes scheduler is currently experimental."
> Does Spark 3.0 not support production deployment using the K8s scheduler?
> What's the plan for full support of the K8s scheduler?
> Thanks,
> Vaibhav V