spark-user mailing list archives

From Robert C Senkbeil <>
Subject Re: IBM open-sources Spark Kernel
Date Fri, 12 Dec 2014 23:29:15 GMT

Hi Sam,

We developed the Spark Kernel with a focus on the newest version of the
IPython message protocol (5.0) for the upcoming IPython 3.0 release.
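As a concrete sketch of what that protocol carries on the wire, here is a minimal execute_request message assembled in Python, using the field names from the IPython messaging spec; the code string and session values are illustrative, not taken from a real kernel session:

```python
import uuid
import datetime

# A minimal sketch of an IPython-protocol message, based on the field names
# in the IPython messaging spec (protocol version "5.0"); the Spark Kernel
# speaks this format over ZeroMQ sockets.
def make_execute_request(code):
    """Build the dict a frontend would serialize and send to a kernel."""
    return {
        "header": {
            "msg_id": str(uuid.uuid4()),
            "username": "user",
            "session": str(uuid.uuid4()),
            "date": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "msg_type": "execute_request",
            "version": "5.0",          # the protocol version the kernel targets
        },
        "parent_header": {},           # empty for a fresh request
        "metadata": {},
        "content": {
            "code": code,              # e.g. a line of Scala for the Spark Kernel
            "silent": False,
            "store_history": True,
            "user_expressions": {},
            "allow_stdin": False,
        },
    }

msg = make_execute_request("sc.parallelize(1 to 10).sum()")
```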

We are building around Apache Spark's REPL, which is used in the current
Spark Shell implementation.

The Spark Kernel was designed to be extensible through magics,
providing functionality that might be needed outside the Scala interpreter.

Finally, a big part of our focus is on application development. Because of
this, we are providing a client library for applications to connect to the
Spark Kernel without needing to implement the ZeroMQ protocol.
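To illustrate one chore the client library hides, here is a sketch of the HMAC signing that the IPython wire protocol requires on every message, written from the messaging spec using only the Python standard library; the key and message parts are made-up values, not real session data:

```python
import hashlib
import hmac
import json

# One piece of implementing the wire protocol by hand: every message is
# HMAC-signed with the session key from the kernel's connection file.
# A hand-rolled ZeroMQ client would have to reproduce this scheme itself.
def sign(key, header, parent_header, metadata, content):
    """Compute the hex signature over the four serialized message parts."""
    auth = hmac.new(key.encode("utf-8"), digestmod=hashlib.sha256)
    for part in (header, parent_header, metadata, content):
        auth.update(json.dumps(part).encode("utf-8"))
    return auth.hexdigest()

sig = sign("my-session-key", {"msg_type": "execute_request"}, {}, {}, {"code": "1 + 1"})
```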

Chip Senkbeil

From:	Sam Bessalah <>
To:	Robert C Senkbeil/Austin/IBM@IBMUS
Date:	12/12/2014 04:20 PM
Subject:	Re: IBM open-sources Spark Kernel

Wow. Thanks. Can't wait to try this out.
Great job.
How is it different from IScala or ISpark?

On Dec 12, 2014 11:17 PM, "Robert C Senkbeil" <> wrote:

  We are happy to announce a developer preview of the Spark Kernel which
  enables remote applications to dynamically interact with Spark. You can
  think of the Spark Kernel as a remote Spark Shell that uses the IPython
  notebook interface to provide a common entrypoint for any application.
  The Spark Kernel obviates the need to submit jars using spark-submit, and
  can replace the existing Spark Shell.

  You can try out the Spark Kernel today by installing it from our github
  repo. To help you get a demo environment up and running quickly, the
  repository also includes a Dockerfile and a Vagrantfile to build a Spark
  Kernel container and connect to it from an IPython notebook.

  We have included a number of documents with the project to help explain
  and provide how-to information:

  * A high-level overview of the Spark Kernel and its client library.

  * README -
  building and testing the kernel, and deployment options including
  the Docker container and packaging the kernel.

  * IPython instructions - setting up the development version of IPython and
  connecting a Spark Kernel.

  * Client library tutorial -
  building and using the client library to connect to a Spark Kernel.

  * Magics documentation - the
  magics in the kernel and how to write your own.

  We think the Spark Kernel will be useful for developing applications for
  Spark, and we are making it available with the intention of improving
  capabilities within the context of the Spark community. We will continue to
  develop the codebase and welcome your comments and suggestions.


  Chip Senkbeil
  IBM Emerging Technology Software Engineer