spark-dev mailing list archives

From Yana Kadiyska <yana.kadiy...@gmail.com>
Subject Re: Problem with version compatibility
Date Thu, 25 Jun 2015 15:25:53 GMT
Jim, I do something similar to you. I mark all dependencies as provided and
then make sure to drop the same version of spark-assembly in my war as I
have on the executors. I don't remember if dropping in server/lib works, I
think I ran into an issue with that. Would love to know "best practices"
when it comes to Tomcat and Spark
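[For reference, the "provided" marking Yana describes could be expressed in an sbt build roughly like this; the version number and module list here are illustrative, not taken from the thread:]

```scala
// Hypothetical sbt fragment: Spark artifacts are available at compile time
// but excluded from the packaged war. The matching spark-assembly jar must
// then be supplied at runtime (e.g. dropped into the war, as described above),
// and its version must match the one on the executors.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.4.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "1.4.0" % "provided"
)
```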

On Thu, Jun 25, 2015 at 11:23 AM, Sean Owen <sowen@cloudera.com> wrote:

> Try putting your same Mesos assembly on the classpath of your client,
> then, to emulate what spark-submit does. You don't merely want to put it
> on the classpath; you also want to make sure nothing else from Spark is
> coming in from your app.
>
> In 1.4 there is the 'launcher' API, which makes programmatic access a
> lot more feasible, but it still kinda needs you to get Spark code to
> your driver program, and if it's not the same as on your cluster you'd
> still risk some incompatibilities.
>
> On Thu, Jun 25, 2015 at 6:05 PM, jimfcarroll <jimfcarroll@gmail.com>
> wrote:
> > Ah. I've avoided using spark-submit primarily because our use of Spark
> > is as part of an analytics library that's meant to be embedded in other
> > applications with their own lifecycle management.
> >
> > One of those applications is a REST app running in Tomcat, which will
> > make the use of spark-submit difficult (if not impossible).
> >
> > Also, we're trying to avoid sending jars over the wire per-job, so we
> > install our library (minus the Spark dependencies) on the Mesos workers
> > and refer to it in the Spark configuration using
> > spark.executor.extraClassPath. If I'm reading SparkSubmit.scala
> > correctly, it looks like the user's assembly ends up sent to the
> > cluster (at least in the case of YARN), though I could be wrong on
> > this.
> >
> > Is there a standard way of running an app that's in control of its own
> > runtime lifecycle without spark-submit?
> >
> > Thanks again.
> > Jim
> >
> >
> >
> >
> > --
> > View this message in context:
> > http://apache-spark-developers-list.1001551.n3.nabble.com/Problem-with-version-compatibility-tp12861p12894.html
> > Sent from the Apache Spark Developers List mailing list archive at
> > Nabble.com.
> >
> > ---------------------------------------------------------------------
> > To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> > For additional commands, e-mail: dev-help@spark.apache.org
> >
>
>
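[The 1.4 'launcher' API Sean mentions lives in org.apache.spark.launcher. A minimal sketch of launching from an embedding app such as Jim's might look like the following; the paths, master URL, and class names are placeholders, and the spark-launcher artifact must be on the embedding app's classpath:]

```java
import org.apache.spark.launcher.SparkLauncher;

public class LaunchFromEmbeddingApp {
  public static void main(String[] args) throws Exception {
    // All paths and names below are hypothetical; adjust to your install.
    Process spark = new SparkLauncher()
        .setSparkHome("/opt/spark")               // must match cluster version
        .setAppResource("/opt/app/analytics.jar") // the app jar, not spark-assembly
        .setMainClass("com.example.AnalyticsJob")
        .setMaster("mesos://host:5050")
        .setConf("spark.executor.extraClassPath", "/opt/app/analytics.jar")
        .launch();                                // spawns a child JVM
    spark.waitFor();
  }
}
```

[Note this still spawns a separate driver process rather than running Spark inside the Tomcat JVM, so it sidesteps rather than solves the embedded-lifecycle question.]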
