spark-user mailing list archives

From: Jacek Laskowski <ja...@japila.pl>
Subject: Re: Does Apache Spark use any Dependency Injection framework?
Date: Mon, 03 Apr 2017 08:41:06 GMT
Hi,

Answering the question in the subject line (which seems different from the one
in the email body), and leaving the other part, how to do it with a DI
framework, to others.

Spark does not use any DI framework internally and wires components itself.

Jacek

On 2 Apr 2017 3:29 p.m., "kant kodali" <kanth909@gmail.com> wrote:

> Hi All,
>
> I am wondering if I can get a SparkConf
> <https://spark.apache.org/docs/2.1.0/api/java/org/apache/spark/SparkConf.html>
> object through Dependency Injection? I currently use the HOCON
> <https://github.com/typesafehub/config/blob/master/HOCON.md> library to
> store all the key/value pairs required to construct a SparkConf. The problem
> is that I have created multiple client jars (by client jars I mean the ones
> we supply to spark-submit to run our app), each of which requires its own
> config. It would be nice to have the SparkConf created by the DI framework
> depending on which client jar we want to run. I am assuming someone must
> have done this?
>
> Thanks!
>
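
P.S. For the HOCON part of the question, a minimal sketch (in Scala) of turning
a Typesafe Config into a SparkConf could look like the following. The resource
name and the top-level "spark" block in the file are assumptions on my side,
not something from the message above:

import com.typesafe.config.{Config, ConfigFactory}
import org.apache.spark.SparkConf
import scala.collection.JavaConverters._

// Build a SparkConf from a HOCON resource. Assumes the Spark settings
// live under a top-level "spark" block in that file.
def sparkConfFrom(resource: String): SparkConf = {
  val spark: Config = ConfigFactory.load(resource).getConfig("spark")
  val pairs = spark.entrySet().asScala.toSeq.map { e =>
    s"spark.${e.getKey}" -> spark.getString(e.getKey)
  }
  new SparkConf().setAll(pairs)
}

// Each client jar could then pick its own resource, e.g.:
val conf = sparkConfFrom("analytics-job.conf")  // hypothetical file name

Each client jar could ship its own .conf resource (or its own DI module that
calls something like the above), so the right SparkConf is built per jar; the
choice of DI framework is a separate question.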
