spark-user mailing list archives

From Donald Szeto <don...@prediction.io>
Subject Re: Spark and Play
Date Wed, 12 Nov 2014 20:17:18 GMT
Hi Akshat,

If your application is to serve results directly from a SparkContext, you
may want to take a look at http://prediction.io. It integrates Spark with
spray.io (another REST/web toolkit by Typesafe). Some heavy lifting is done
here:
https://github.com/PredictionIO/PredictionIO/blob/develop/core/src/main/scala/workflow/CreateServer.scala

Regards,
Donald

On Tue, Nov 11, 2014 at 11:35 PM, John Meehan <jnmeehan@gmail.com> wrote:

> You can also build a Play 2.2.x + Spark 1.1.0 fat jar with sbt-assembly
> for, e.g., yarn-client support, or for use with spark-shell when debugging:
>
> play.Project.playScalaSettings
>
> libraryDependencies ~= { _ map {
>   case m if m.organization == "com.typesafe.play" =>
>     m.exclude("commons-logging", "commons-logging")
>   case m => m
> }}
>
> assemblySettings
>
> test in assembly := {}
>
> mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
>   {
>     case m if m.toLowerCase.endsWith("manifest.mf") => MergeStrategy.discard
>     case m if m.startsWith("META-INF") => MergeStrategy.discard
>     case PathList("javax", "servlet", xs @ _*) => MergeStrategy.first
>     case PathList("org", "apache", xs @ _*) => MergeStrategy.first
>     case PathList("org", "jboss", xs @ _*) => MergeStrategy.first
>     case PathList("org", "slf4j", xs @ _*) => MergeStrategy.discard
>     case "about.html" => MergeStrategy.rename
>     case "reference.conf" => MergeStrategy.concat
>     case _ => MergeStrategy.first
>   }
> }
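A related tweak, as an aside (this is an assumption on my part, not part of the recipe above): when the assembly is launched through spark-submit or spark-shell, Spark's own classes are already on the classpath, so the Spark dependency can be marked "provided" to keep the fat jar smaller and avoid duplicate-class conflicts.

```scala
// Hypothetical build.sbt fragment: mark Spark as "provided" so that
// sbt-assembly leaves Spark's classes out of the fat jar. Only do this
// when the jar is run via spark-submit or spark-shell, which supply
// Spark on the classpath themselves.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0" % "provided"
```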
>
> On Tue, Nov 11, 2014 at 3:04 PM, Mohammed Guller <mohammed@glassbeam.com>
> wrote:
>
>> Actually, it is possible to integrate Spark 1.1.0 with Play 2.2.x.
>>
>> Here is a sample build.sbt file:
>>
>> name := """xyz"""
>>
>> version := "0.1"
>>
>> scalaVersion := "2.10.4"
>>
>> libraryDependencies ++= Seq(
>>   jdbc,
>>   anorm,
>>   cache,
>>   "org.apache.spark" %% "spark-core" % "1.1.0",
>>   "com.typesafe.akka" %% "akka-actor" % "2.2.3",
>>   "com.typesafe.akka" %% "akka-slf4j" % "2.2.3",
>>   "org.apache.spark" %% "spark-sql" % "1.1.0"
>> )
>>
>> play.Project.playScalaSettings
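
To make the integration concrete, a minimal controller that serves results from a shared SparkContext might look like the sketch below. The names (Application, count) and the local master URL are illustrative assumptions, not taken from the build above; the one firm point is that a SparkContext is heavyweight and there should be only one per JVM.

```scala
// Hypothetical Play 2.2.x controller sharing one long-lived SparkContext.
package controllers

import org.apache.spark.{SparkConf, SparkContext}
import play.api.mvc.{Action, Controller}

object Application extends Controller {
  // One SparkContext per JVM; creating one per request would fail.
  lazy val sc = new SparkContext(
    new SparkConf().setMaster("local[*]").setAppName("play-spark"))

  // GET route returning a small computed result from the cluster.
  def count = Action {
    val n = sc.parallelize(1 to 1000).filter(_ % 2 == 0).count()
    Ok(s"even count: $n")
  }
}
```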
>>
>>
>> Mohammed
>>
>> -----Original Message-----
>> From: Patrick Wendell [mailto:pwendell@gmail.com]
>> Sent: Tuesday, November 11, 2014 2:06 PM
>> To: Akshat Aranya
>> Cc: user@spark.apache.org
>> Subject: Re: Spark and Play
>>
>> Hi There,
>>
>> Because Akka versions are not binary compatible with one another, it
>> might not be possible to integrate Play with Spark 1.1.0.
>>
>> - Patrick
>>
>> On Tue, Nov 11, 2014 at 8:21 AM, Akshat Aranya <aaranya@gmail.com> wrote:
>> > Hi,
>> >
>> > Sorry if this has been asked before; I didn't find a satisfactory
>> > answer when searching.  How can I integrate a Play application with
>> > Spark?  I'm running into issues with akka-actor versions.  Play 2.2.x
>> > uses akka-actor 2.0, whereas Play 2.3.x uses akka-actor 2.3.4, neither
>> > of which works with Spark 1.1.0.  Is there something I should do
>> > with libraryDependencies in my build.sbt to make it work?
>> >
>> > Thanks,
>> > Akshat
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>> For additional commands, e-mail: user-help@spark.apache.org
>>
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>> For additional commands, e-mail: user-help@spark.apache.org
>>
>>
>


-- 
Donald Szeto
PredictionIO
