spark-user mailing list archives

From: mailford...@gmail.com
Subject: Can I run Spark executors in a Hadoop cluster from a Kubernetes container
Date: Thu, 16 Apr 2020 12:26:49 GMT
Hi,
I want to deploy the Spark client in a Kubernetes container, but run the Spark job on a Hadoop
cluster (meaning the Hadoop cluster's resources will be leveraged), submitting it from the K8s
container. My question is whether this mode of implementation is possible.
Do let me know, please.
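
For concreteness, below is roughly what I have in mind, as a minimal PySpark sketch (the app
name and executor count are just placeholders, and I'm assuming the container has
HADOOP_CONF_DIR/YARN_CONF_DIR pointing at the cluster's configuration files):

    from pyspark.sql import SparkSession

    # Driver runs here, inside the Kubernetes container (client deploy mode);
    # executors are requested from the Hadoop cluster's YARN ResourceManager.
    # Assumes HADOOP_CONF_DIR / YARN_CONF_DIR in the container point at the
    # cluster's config files and the container can reach the ResourceManager,
    # the NodeManagers and HDFS over the network.
    spark = (
        SparkSession.builder
        .appName("k8s-client-to-yarn")                # placeholder app name
        .master("yarn")                               # ask YARN for executors
        .config("spark.submit.deployMode", "client")  # keep the driver in the container
        .config("spark.executor.instances", "2")      # illustrative sizing only
        .getOrCreate()
    )

    # Trivial job just to confirm that executors on the Hadoop cluster do the work.
    print(spark.sparkContext.parallelize(range(100)).sum())

    spark.stop()

I assume the cluster nodes would also need to be able to connect back to the driver in the
container, since this is client mode.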
Thanks,
Debu

Sent from my iPhone
---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org

