spark-dev mailing list archives

From Chetan Khatri <chetan.opensou...@gmail.com>
Subject Re: Dependency Injection and Microservice development with Spark
Date Wed, 04 Jan 2017 11:34:48 GMT
Lars,

Thank you. I want to use DI to configure all of the properties (wiring) for
the architectural approach below.

Oracle -> Kafka Batch (Event Queuing) -> Spark Jobs (Incremental load from
HBase -> Hive with Transformation) -> Spark Transformation -> PostgreSQL
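For wiring like the above, plain constructor injection is often enough in Scala, with no DI framework. A minimal sketch follows; all names (JobConfig, IncrementalLoadJob, TransformJob) are hypothetical, and the stages stand in for the real Kafka/HBase/Hive/PostgreSQL integrations:

```scala
// Sketch of manual constructor injection ("wiring") in Scala.
// Configuration is read once at startup (e.g. from a yml file)
// and passed down as a plain value.
final case class JobConfig(
  kafkaBootstrap: String,
  hbaseTable: String,
  postgresUrl: String
)

// Each stage declares its dependencies as constructor parameters,
// so all wiring happens in exactly one place (the main below).
class IncrementalLoadJob(config: JobConfig) {
  def run(): List[String] =
    List(s"loaded from ${config.hbaseTable}")
}

class TransformJob(config: JobConfig, upstream: IncrementalLoadJob) {
  def run(): List[String] =
    upstream.run().map(row => s"$row -> written to ${config.postgresUrl}")
}

object Wiring {
  def main(args: Array[String]): Unit = {
    val config =
      JobConfig("broker:9092", "events_table", "jdbc:postgresql://db/dw")
    // The whole object graph is assembled here; no container is required.
    val pipeline = new TransformJob(config, new IncrementalLoadJob(config))
    pipeline.run().foreach(println)
  }
}
```

The trade-off versus a framework like Spring Boot is that the wiring is explicit and compile-checked, at the cost of writing the assembly code yourself.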

Thanks.

On Thu, Dec 29, 2016 at 3:25 AM, Lars Albertsson <lalle@mapflat.com> wrote:

> Do you really need dependency injection?
>
> DI is often used for testing purposes. Data processing jobs are easy
> to test without DI, however, due to their functional and synchronous
> nature. Hence, DI is often unnecessary for testing data processing
> jobs, whether they are batch or streaming jobs.
>
> Or do you want to use DI for other reasons?
>
>
> Lars Albertsson
> Data engineering consultant
> www.mapflat.com
> https://twitter.com/lalleal
> +46 70 7687109
> Calendar: https://goo.gl/6FBtlS, https://freebusy.io/lalle@mapflat.com
>
>
> On Fri, Dec 23, 2016 at 11:56 AM, Chetan Khatri
> <chetan.opensource@gmail.com> wrote:
> > Hello Community,
> >
> > The current approach I am using for Spark job development is Scala + SBT
> > and an uber jar, with a yml properties file to pass configuration
> > parameters. But if I would like to use Dependency Injection and
> > microservice development, like the Spring Boot features, in Scala, then
> > what would be the standard approach?
> >
> > Thanks
> >
> > Chetan
>
