spark-user mailing list archives

From "Sambit Tripathy (RBEI/EDS1)" <Sambit.Tripa...@in.bosch.com>
Subject RE: Using Java spring injection with spark
Date Tue, 02 Feb 2016 07:24:06 GMT
Hi Harsh,

I still do not understand your problem completely. Is this what you are talking about? http://stackoverflow.com/questions/30053449/use-spring-together-with-spark



Best regards
Sambit Tripathy


From: HARSH TAKKAR [mailto:takkarharsh@gmail.com]
Sent: Monday, February 01, 2016 10:28 PM
To: Sambit Tripathy (RBEI/EDS1) <Sambit.Tripathy@in.bosch.com>; user@spark.apache.org
Subject: Re: Using Java spring injection with spark

Hi Sambit
My app is basically a cron which checks the db for jobs that are scheduled and need to be
executed, and submits them to Spark using the Spark Java API. This app is written with the
Spring framework as its core.
Each job has a set of tasks which need to be executed in order.
> We have implemented a chain of responsibility pattern to do so, and we persist the status
of each snapshot step in MySQL after persisting the step result in a CSV file, so that if the job
breaks in between we know which file to pick up and continue processing from.
> I basically wanted to inject the snapshot steps with the JDBC layer and other dependencies
through Spring autowiring.
Will using Spring in this use case adversely affect performance, and, as you mentioned, will
it cause serialization errors?
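To make the design concrete, one snapshot step in the chain would look roughly like the sketch
below. Class and bean names (SnapshotStep, StepStatusDao) are simplified placeholders, not our
actual code.

    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.stereotype.Component;

    // Placeholder interface standing in for our JDBC layer.
    interface StepStatusDao {
        void markCompleted(String jobId, String stepName);
    }

    // One step in the chain of responsibility. The JDBC layer is autowired so the
    // step can persist its status in MySQL after writing the step result.
    @Component
    public class SnapshotStep {

        @Autowired
        private StepStatusDao statusDao;

        private SnapshotStep next;          // next step in the chain

        public void setNext(SnapshotStep next) {
            this.next = next;
        }

        public void execute(JavaSparkContext sc, String jobId) {
            // The Spark closure itself only touches serializable values.
            JavaRDD<String> input = sc.textFile("/data/" + jobId + "/input");
            input.map(line -> line.trim())
                 .saveAsTextFile("/data/" + jobId + "/step-result");

            // Runs on the driver, after the Spark action has finished.
            statusDao.markCompleted(jobId, getClass().getSimpleName());

            if (next != null) {
                next.execute(sc, jobId);
            }
        }
    }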


On Tue, Feb 2, 2016 at 1:16 AM Sambit Tripathy (RBEI/EDS1) <Sambit.Tripathy@in.bosch.com>
wrote:


1.      It depends on what you want to do. Don't worry about singletons and wiring the beans,
as that is pretty much taken care of by Spark itself. In fact, doing so, you will run
into issues like serialization errors (a rough sketch of why is below point 2).



2.      You can write your code in Scala or Python using the spark shell or a notebook like
IPython or Zeppelin, or, if you have written an application in Scala/Java using the Spark API,
you can create a jar and run it using spark-submit (example command below).
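To make point 1 a bit more concrete, here is a rough sketch of the kind of thing that runs into
serialization trouble, and a plainer alternative. RowCleaner, TrimLine and AuditDao are just
made-up names for illustration, not anything from your mails.

    import org.apache.spark.api.java.function.Function;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.stereotype.Component;

    // Made-up driver-side bean; typical JDBC/Spring beans like this are not Serializable.
    interface AuditDao {
        void record(String line);
    }

    // Problematic: Spark serializes the Function to ship it to the executors, and the
    // autowired DAO gets dragged along, typically ending in a NotSerializableException.
    @Component
    class RowCleaner implements Function<String, String> {

        @Autowired
        private AuditDao auditDao;

        @Override
        public String call(String line) throws Exception {
            auditDao.record(line);          // would have to run on an executor
            return line.trim();
        }
    }

    // Safer: keep the Function plain and stateless, and do the JDBC/Spring work on the
    // driver before or after the Spark action.
    class TrimLine implements Function<String, String> {
        @Override
        public String call(String line) {
            return line.trim();
        }
    }

For point 2, running a packaged application would look something like this (class and jar names
are only examples):

    spark-submit --class com.example.MyJob --master yarn --deploy-mode cluster my-job.jar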
From: HARSH TAKKAR [mailto:takkarharsh@gmail.com]
Sent: Monday, February 01, 2016 10:00 AM
To: user@spark.apache.org
Subject: Re: Using Java spring injection with spark


Hi

Can anyone please reply to this?

On Mon, 1 Feb 2016, 4:28 p.m. HARSH TAKKAR <takkarharsh@gmail.com>
wrote:
Hi
I am new to Apache Spark and big data analytics. Before starting to code against Spark DataFrames
and RDDs, I just wanted to confirm the following:
1. Can we create an implementation of java.api.Function as a singleton bean using the Spring
framework, and can it be injected into other classes using autowiring? (A rough sketch of what I
mean is below.)
2. What is the best way to submit jobs to Spark: using the API or using the shell script?
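To make (1) concrete, what I have in mind is roughly the following. ParseRecord and RecordSchema
are just made-up names for illustration.

    import org.apache.spark.api.java.function.Function;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.stereotype.Component;

    // Made-up dependency that I would like Spring to inject.
    interface RecordSchema {
        String delimiter();
    }

    // The idea: a singleton Spring bean that implements Spark's Java Function interface
    // and is autowired into the classes that build the RDD pipeline.
    @Component
    public class ParseRecord implements Function<String, String[]> {

        @Autowired
        private RecordSchema schema;

        @Override
        public String[] call(String line) throws Exception {
            return line.split(schema.delimiter());
        }
    }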
Looking forward to your help,
Kind Regards
Harsh