spark-user mailing list archives

From: Robert C Senkbeil <rcsen...@us.ibm.com>
Subject: Re: Web Service + Spark
Date: Mon, 12 Jan 2015 20:45:45 GMT
If you would like to work with an API, you can use the Spark Kernel found
here: https://github.com/ibm-et/spark-kernel

The kernel provides an API following the IPython message protocol as well
as a client library that can be used with Scala applications.

The kernel can also be plugged into the latest development version of
IPython 3.0 in case you want to do more visual exploration.
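
As a rough sketch of what driving the kernel from a Scala application might
look like (the client class, constructor, and method names below are
assumptions for illustration, not the library's documented interface):

// Hypothetical sketch only: SparkKernelClient, execute, onResult and the
// package name are assumed for illustration, not taken from the project docs.
import com.ibm.spark.client.SparkKernelClient

object KernelClientExample extends App {
  // Connect to a running Spark Kernel (host/port assumed)
  val client = new SparkKernelClient("127.0.0.1", 8888)

  // Ship a snippet of Spark code to the kernel for remote execution
  val execution = client.execute(
    """val rdd = sc.parallelize(1 to 100)
      |rdd.sum()""".stripMargin)

  // Results come back asynchronously over the IPython message protocol
  execution.onResult(result => println(s"Result: $result"))
}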

Signed,
Chip Senkbeil
IBM Emerging Technology Software Engineer



From: Raghavendra Pandey <raghavendra.pandey@gmail.com>
To: Cui Lin <cui.lin@hds.com>, gtinside <gtinside@gmail.com>, Corey Nolet <cjnolet@gmail.com>
Cc: "user@spark.apache.org" <user@spark.apache.org>
Date: 01/11/2015 02:06 AM
Subject: Re: Web Service + Spark



You can take a look at http://zeppelin.incubator.apache.org. It is a
notebook with built-in graphical visualization.



On Sun, Jan 11, 2015, 01:45 Cui Lin <cui.lin@hds.com> wrote:
  Thanks, Gaurav and Corey,

  Probably I didn’t make myself clear. I am looking for a Spark best practice
  similar to Shiny for R, where analysis/visualization results can easily be
  published to a web server and viewed in a web browser. Or is there any
  dashboard for Spark?

  Best regards,

  Cui Lin

  From: gtinside <gtinside@gmail.com>
  Date: Friday, January 9, 2015 at 7:45 PM
  To: Corey Nolet <cjnolet@gmail.com>
  Cc: Cui Lin <Cui.Lin@hds.com>, "user@spark.apache.org" <user@spark.apache.org>
  Subject: Re: Web Service + Spark

  You can also look at Spark Job Server
  https://github.com/spark-jobserver/spark-jobserver
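
  A job submitted to Spark Job Server implements its SparkJob trait; a minimal
  sketch, assuming the project's 0.x-era spark.jobserver API (package and trait
  names as I recall them, so treat them as approximate):

  // Minimal Spark Job Server job sketch, assuming the spark.jobserver
  // SparkJob trait (validate/runJob); names approximate.
  import com.typesafe.config.Config
  import org.apache.spark.SparkContext
  import spark.jobserver.{SparkJob, SparkJobValid, SparkJobValidation}

  object WordCountJob extends SparkJob {
    // Reject bad input before the job runs
    override def validate(sc: SparkContext, config: Config): SparkJobValidation =
      SparkJobValid

    // The actual work; the return value is sent back to the REST caller
    override def runJob(sc: SparkContext, config: Config): Any = {
      val words = sc.parallelize(Seq("web", "service", "spark", "web"))
      words.countByValue()
    }
  }

  The jar is uploaded and jobs are triggered over the server's REST API, which
  is what makes it a reasonable fit for wrapping Spark in a web service.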

  - Gaurav

  On Jan 9, 2015, at 10:25 PM, Corey Nolet <cjnolet@gmail.com> wrote:

        Cui Lin,

        The solution largely depends on how you want your services deployed
        (Java web container, Spray framework, etc.) and whether you are using
        a cluster manager like YARN or Mesos vs. just firing up your own
        executors and master.

        I recently worked on an example for deploying Spark services inside
        of Jetty using Yarn as the cluster manager. It forced me to learn
        how Spark wires up the dependencies/classpaths. If it helps, the
        example that resulted from my tinkering is located at [1].


        [1] https://github.com/calrissian/spark-jetty-server
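
        As a rough illustration of that setup (this is not the code from [1];
        the servlet and endpoint here are made up), a long-lived SparkContext
        shared by an embedded Jetty servlet might look like:

        // Illustrative sketch, not the code from [1]: one SparkContext shared
        // by an embedded Jetty servlet. Names like SparkQueryServlet are made up.
        import javax.servlet.http.{HttpServlet, HttpServletRequest, HttpServletResponse}
        import org.apache.spark.{SparkConf, SparkContext}
        import org.eclipse.jetty.server.Server
        import org.eclipse.jetty.servlet.{ServletContextHandler, ServletHolder}

        object SparkJettyExample {
          // One long-lived SparkContext for the whole web application
          lazy val sc = new SparkContext(
            new SparkConf()
              .setAppName("spark-web-service")
              .setMaster("yarn-client")) // or "local[*]" without a cluster manager

          class SparkQueryServlet extends HttpServlet {
            override def doGet(req: HttpServletRequest,
                               resp: HttpServletResponse): Unit = {
              // Run a small Spark computation per request and return the result
              val sum = sc.parallelize(1 to 1000).sum()
              resp.setContentType("text/plain")
              resp.getWriter.println(s"sum = $sum")
            }
          }

          def main(args: Array[String]): Unit = {
            val server = new Server(8080)
            val handler = new ServletContextHandler()
            handler.addServlet(new ServletHolder(new SparkQueryServlet), "/sum")
            server.setHandler(handler)
            server.start()
            server.join()
          }
        }

        Running this under YARN additionally requires getting the Spark and
        Hadoop dependencies/classpaths right inside the web container, which
        is the part the example in [1] works through.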

        On Fri, Jan 9, 2015 at 9:33 PM, Cui Lin <cui.lin@hds.com> wrote:
         Hello, All,

         What’s the best practice for deploying/publishing Spark-based
         scientific applications as a web service? Something similar to Shiny
         for R. Thanks!

         Best regards,

         Cui Lin