spark-user mailing list archives

From ZHANG Wei <wezh...@outlook.com>
Subject Re: Can I run Spark executors in a Hadoop cluster from a Kubernetes container
Date Mon, 20 Apr 2020 11:54:37 GMT
It looks like you'd like to submit a Spark job from outside the Spark cluster. Apache Livy [https://livy.incubator.apache.org/]
is worth a try: it provides a REST service for submitting Spark jobs to a Hadoop cluster.
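As a rough illustration of the Livy approach, the sketch below builds the JSON body for Livy's POST /batches endpoint and submits it over plain HTTP. The Livy URL, jar path, and class name are placeholders for illustration, not values from this thread; Livy's default port is 8998.

```python
# Sketch: submitting a batch job to Apache Livy's REST API (POST /batches).
# The host, jar path, and class name below are assumed placeholders.
import json
from urllib import request


def build_batch_payload(file, class_name, args=None):
    """Build the JSON body for Livy's POST /batches endpoint."""
    payload = {"file": file, "className": class_name}
    if args:
        payload["args"] = args
    return payload


def submit_batch(livy_url, payload):
    """POST the batch to Livy and return the parsed JSON response."""
    req = request.Request(
        livy_url + "/batches",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())


if __name__ == "__main__":
    payload = build_batch_payload(
        "hdfs:///jobs/spark-examples.jar",        # placeholder jar location
        "org.apache.spark.examples.SparkPi",
        args=["100"],
    )
    # Uncomment with a real Livy endpoint reachable from the K8S container:
    # print(submit_batch("http://livy-host:8998", payload))
    print(payload)
```

Because the container only needs HTTP access to the Livy server, this avoids shipping Hadoop client configuration into the Kubernetes image.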

Cheers,
-z

________________________________________
From: mailfordebu@gmail.com <mailfordebu@gmail.com>
Sent: Thursday, April 16, 2020 20:26
To: user
Subject: Can I run Spark executors in a Hadoop cluster from a Kubernetes container

Hi,
I want to deploy the Spark client in a Kubernetes container. Further on, I want to run the Spark
job in a Hadoop cluster (meaning the resources of the Hadoop cluster will be leveraged) but
trigger it from the K8S container. My question is whether this mode of implementation is possible.
Do let me know please.
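For reference, the setup described above can also be driven directly with spark-submit against YARN, provided the container has the Hadoop cluster's client configuration mounted in. The paths and class name below are assumptions for illustration; the command is printed as a dry run rather than executed.

```shell
# Sketch: the spark-submit invocation a K8S container would run against an
# external YARN cluster. HADOOP_CONF_DIR must point at the cluster's
# core-site.xml / yarn-site.xml mounted into the container (path assumed).
export HADOOP_CONF_DIR=/etc/hadoop/conf

# Assembled as a string and echoed as a dry run; run it directly once the
# container can reach the YARN ResourceManager.
SUBMIT_CMD="spark-submit --master yarn --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
  /opt/spark/examples/jars/spark-examples.jar 100"
echo "$SUBMIT_CMD"
```

With --deploy-mode cluster the driver runs inside YARN, so the container only acts as the submitting client.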
Thanks,
Debu

Sent from my iPhone
---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org


