Answering the question from the subject line (which seems different from the one in the email body) and leaving the other part, how to do it with a DI framework, to others:

Spark does not use any DI framework internally; it wires its components together itself.


On 2 Apr 2017 3:29 p.m., "kant kodali" <kanth909@gmail.com> wrote:
Hi All,

I am wondering if I can get a SparkConf object through Dependency Injection. I currently use the HOCON library to store all the key/value pairs required to construct a SparkConf. The problem is that I have created multiple client jars (by client jars I mean the ones we supply to spark-submit to run our app), and each of them requires its own config. It would be nice to have the SparkConf created by the DI framework depending on the client jar we want to run. I am assuming someone must have done this?
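One way the setup described above could be sketched with Guice: a module whose provider loads the HOCON config from the current client jar's classpath and builds a SparkConf from it. This is only an illustration under stated assumptions (Guice, the Typesafe Config library, and spark-core on the classpath); the module and key names here are hypothetical, not anything Spark itself ships.

```scala
// Sketch only: assumes Guice, Typesafe Config (HOCON), and spark-core
// are on the classpath. "SparkConfModule" and the "spark" config path
// are made-up names for illustration.
import com.google.inject.{AbstractModule, Guice, Provides, Singleton}
import com.typesafe.config.ConfigFactory
import org.apache.spark.SparkConf
import scala.collection.JavaConverters._

class SparkConfModule extends AbstractModule {
  override def configure(): Unit = ()

  @Provides @Singleton
  def provideSparkConf(): SparkConf = {
    // ConfigFactory.load() picks up the application.conf bundled in
    // whichever client jar is on the classpath, so each jar can ship
    // its own settings without code changes.
    val sparkSection = ConfigFactory.load().getConfig("spark")
    val conf = new SparkConf()
    sparkSection.entrySet().asScala.foreach { entry =>
      conf.set(s"spark.${entry.getKey}", sparkSection.getString(entry.getKey))
    }
    conf
  }
}

// Usage: the injector hands out the SparkConf built from this jar's config.
object Example {
  def main(args: Array[String]): Unit = {
    val conf = Guice.createInjector(new SparkConfModule)
      .getInstance(classOf[SparkConf])
    println(conf.toDebugString)
  }
}
```

Because the provider is a `@Singleton`, every component that injects `SparkConf` sees the same instance, which matches how a single SparkContext per JVM is normally configured.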