spark-user mailing list archives

Subject Can I run Spark executors in a Hadoop cluster from a Kubernetes container
Date Thu, 16 Apr 2020 12:26:49 GMT
I want to deploy the Spark client in a Kubernetes container. From that container, I then want to run the Spark job on a Hadoop cluster, so that the Hadoop cluster's resources (YARN) are leveraged while the submission itself comes from the K8S container. My question is whether this mode of implementation is possible?
Do let me know please.
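For reference, the setup being asked about can be sketched as a `spark-submit` invocation run from inside the container, pointed at the external YARN cluster. This is a minimal sketch, not a confirmed answer: it assumes a Spark client and the Hadoop cluster's config files (core-site.xml, yarn-site.xml) are mounted into the pod, and all paths, the class name, and the jar location are hypothetical.

```shell
# Hypothetical paths: Hadoop client configs mounted into the pod.
# Spark reads these to locate the YARN ResourceManager and HDFS NameNode.
export HADOOP_CONF_DIR=/opt/hadoop/conf
export YARN_CONF_DIR=/opt/hadoop/conf

# Submit to YARN in cluster mode: the driver and executors all run on
# the Hadoop cluster, so the container only needs outbound network
# connectivity to the ResourceManager and NameNode.
# --class and the jar path are placeholders for illustration.
/opt/spark/bin/spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 4 \
  --executor-memory 2g \
  --class com.example.MyJob \
  hdfs:///apps/my-job.jar
```

Note that in `client` deploy mode the driver would run inside the pod itself, so the YARN executors must be able to connect back to it over the network; `cluster` mode avoids that requirement, which is usually simpler from inside Kubernetes.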

Sent from my iPhone
