spark-issues mailing list archives

From "Yinan Li (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-25194) Kubernetes - Define cpu and memory limit to init container
Date Wed, 22 Aug 2018 19:30:00 GMT

    [ https://issues.apache.org/jira/browse/SPARK-25194?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16589283#comment-16589283 ]

Yinan Li commented on SPARK-25194:
----------------------------------

The upcoming Spark 2.4 gets rid of the init-container and switches to running {{spark-submit}} in
client mode in the driver to download remote dependencies. Given that 2.4 is coming soon,
I would suggest waiting for it and using that instead.
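
For illustration, here is a rough sketch of a 2.4-style submission, where the driver itself
fetches the remote jar and the driver/executor resource limits are set through spark-submit
properties (the class name, image, URLs, and resource values are placeholders, not a tested
command):

    spark-submit \
      --master k8s://https://<k8s-apiserver>:6443 \
      --deploy-mode cluster \
      --class com.example.Main \
      --jars https://repo.example.com/libs/extra-dep.jar \
      --conf spark.kubernetes.container.image=<spark-2.4-image> \
      --conf spark.driver.memory=2g \
      --conf spark.executor.memory=4g \
      --conf spark.kubernetes.driver.limit.cores=1 \
      --conf spark.kubernetes.executor.limit.cores=2 \
      https://repo.example.com/apps/my-app.jar

Since there is no init container in this flow, only the driver and executor containers need
limits, which the existing properties already cover.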

> Kubernetes - Define cpu and memory limit to init container
> ----------------------------------------------------------
>
>                 Key: SPARK-25194
>                 URL: https://issues.apache.org/jira/browse/SPARK-25194
>             Project: Spark
>          Issue Type: New Feature
>          Components: Kubernetes
>    Affects Versions: 2.3.1
>            Reporter: Daniel Majano
>            Priority: Major
>              Labels: features
>
> Hi,
>  
> Recently I have started to work with Spark on Kubernetes. All of our Kubernetes clusters
> have resource quotas, so to deploy anything you need to define container CPU and memory
> limits.
>  
> For the driver and executors this is fine, because you can define those limits through
> spark-submit properties. But today, for one of my projects, I need to load an external
> dependency. I defined the dependency with --jars and an https link, so the init container
> is created; there is no way to define limits for it, and the submission fails because the
> pod with the driver + init container cannot be started.
>  
>  
> Thanks.
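
For context, the driver and executor limits mentioned in the report can already be expressed
through spark-submit properties in 2.3.x, roughly as below (values are illustrative); as far
as I can tell there is no equivalent limit property for the init container, which is the gap
this issue describes:

    --conf spark.kubernetes.driver.limit.cores=1 \
    --conf spark.driver.memory=2g \
    --conf spark.kubernetes.executor.limit.cores=2 \
    --conf spark.executor.memory=4g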



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

