spark-user mailing list archives

From Robert C Senkbeil <rcsen...@us.ibm.com>
Subject Re: Creating Apache Spark-powered “As Service” applications
Date Fri, 16 Jan 2015 20:20:31 GMT
Hi,

You can take a look at the Spark Kernel project:
https://github.com/ibm-et/spark-kernel

The Spark Kernel's goal is to serve as the foundation for interactive
applications. The project provides a client library in Scala that abstracts
connecting to the kernel (which contains a Spark Context), and that client
can be embedded into a web application. We demonstrated this at StrataConf,
where we embedded the Spark Kernel client into a Play application to provide
an interactive web application that communicates with Spark via the Spark
Kernel (which hosts the Spark Context).

A getting started section can be found here:
https://github.com/ibm-et/spark-kernel/wiki/Getting-Started-with-the-Spark-Kernel
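
As for the toy example asked about in the original question below, the code
that ultimately runs is just ordinary Spark Scala. Here is a minimal,
self-contained sketch (the class name and the "input.txt" path are only
placeholders, and it assumes a Spark 1.x local-mode setup) that creates a
local Spark Context, reads a text file, and prints basic stats:

import org.apache.spark.{SparkConf, SparkContext}

object TextFileStats {
  def main(args: Array[String]): Unit = {
    // Run Spark locally, using all available cores on this machine.
    val conf = new SparkConf().setAppName("TextFileStats").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Placeholder input path; replace with a real file.
    val lines = sc.textFile("input.txt")

    // Basic stats: line count, word count, and average line length.
    val lineCount = lines.count()
    val wordCount = lines.flatMap(_.split("\\s+")).filter(_.nonEmpty).count()
    val totalChars = lines.map(_.length.toLong).fold(0L)(_ + _)
    val avgLineLength = if (lineCount > 0) totalChars.toDouble / lineCount else 0.0

    println(s"lines: $lineCount, words: $wordCount, avg line length: $avgLineLength")

    sc.stop()
  }
}

When the same logic runs through the Spark Kernel, the Spark Context already
exists on the kernel side, so only the textFile/stats portion would need to
be submitted by the client.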

If you have any other questions, feel free to email me or reach out on our
mailing list:

spark-kernel@googlegroups.com

https://groups.google.com/forum/#!forum/spark-kernel

Signed,
Chip Senkbeil
IBM Emerging Technology Software Engineer



From:	olegshirokikh <oleg@solver.com>
To:	user@spark.apache.org
Date:	01/16/2015 01:32 PM
Subject:	Creating Apache Spark-powered “As Service” applications



The question is about ways to create a Windows desktop-based and/or
web-based client application that can connect to and talk to a server
running a Spark application (either a local or an on-premise cloud
distribution) at run time.

Any language/architecture may work. So far, I've seen two things that might
help, but I'm not yet sure whether they would be the best alternatives or
how they work:

Spark Job Server - https://github.com/spark-jobserver/spark-jobserver -
defines a REST API for Spark

Hue -
http://gethue.com/get-started-with-spark-deploy-spark-server-and-compute-pi-from-your-web-browser/
- builds on the Spark Job Server above

Any advice would be appreciated. A simple toy example program (or steps)
showing, e.g., how to build such a client that simply creates a Spark
Context on a local machine, reads a text file, and returns basic stats would
be the ideal answer!



--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Creating-Apache-Spark-powered-As-Service-applications-tp21193.html

Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org
