spark-issues mailing list archives

From "Marcelo Vanzin (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (SPARK-8622) Spark 1.3.1 and 1.4.0 doesn't put executor working directory on executor classpath
Date Wed, 13 Feb 2019 21:18:00 GMT

     [ https://issues.apache.org/jira/browse/SPARK-8622?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin resolved SPARK-8622.
-----------------------------------
    Resolution: Not A Problem

This works as designed. {{--jars}} entries are added to the Spark class loader (which is different from the system classpath).
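A minimal sketch of what this distinction means for user code: because {{--jars}} entries are served by Spark's own class loader rather than placed on the JVM system classpath, classes shipped that way should be resolved through the thread context class loader, not the system loader. The sketch below uses {{java.util.ArrayList}} as a stand-in for a class that would normally arrive via {{--jars}}; {{com.example.MyUdf}} in the comment is a hypothetical example name.

```java
public class ContextLoaderDemo {
    public static void main(String[] args) throws Exception {
        // On a Spark executor, the thread context class loader is the one
        // that knows about --jars entries; the system class loader is not.
        ClassLoader ctx = Thread.currentThread().getContextClassLoader();

        // Resolve the class through the context loader explicitly. In a real
        // job this would be a user class such as "com.example.MyUdf".
        Class<?> cls = Class.forName("java.util.ArrayList", true, ctx);
        System.out.println(cls.getName()); // prints java.util.ArrayList
    }
}
```

Code that calls {{Class.forName(name)}} with no explicit loader uses the caller's defining loader, which on an executor may not see {{--jars}} entries; passing the context loader avoids that.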

> Spark 1.3.1 and 1.4.0 doesn't put executor working directory on executor classpath
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-8622
>                 URL: https://issues.apache.org/jira/browse/SPARK-8622
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 1.3.1, 1.4.0
>            Reporter: Baswaraj
>            Priority: Major
>
> I ran into an issue where the executor is not able to pick up my configs/functions from my custom jar in standalone (client/cluster) deploy mode. I used the spark-submit --jars option to specify all the jars and configs to be used by the executors.
> All these files are placed in the executor's working directory, but not on the executor classpath. Also, the executor working directory itself is not on the executor classpath.
> I expect the executor to find all files specified in the spark-submit --jars option.
> In Spark 1.3.0 the executor working directory is on the executor classpath, so the app runs successfully.
> To run my application successfully with Spark 1.3.1+, I have to set the following option (conf/spark-defaults.conf):
> spark.executor.extraClassPath   .
> Please advise.
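The reporter's workaround can be sketched as follows. The jar names, class name, and master URL are placeholders; {{--jars}} takes a comma-separated list, and {{spark.executor.extraClassPath}} set to {{.}} puts each executor's working directory (where {{--jars}} files are copied) back on its classpath.

```shell
# conf/spark-defaults.conf: add the executor working directory (".") to the
# executor classpath, restoring the behaviour seen in Spark 1.3.0.
echo 'spark.executor.extraClassPath   .' >> conf/spark-defaults.conf

# Submit with the extra jars; they are copied into each executor's working
# directory, which the setting above has placed on its classpath.
./bin/spark-submit \
  --master spark://master:7077 \
  --jars my-configs.jar,my-functions.jar \
  --class com.example.MyApp \
  my-app.jar
```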



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

