spark-user mailing list archives

From "skaarthik oss" <skaarthik....@gmail.com>
Subject RE: Spark Integration Patterns
Date Mon, 29 Feb 2016 20:54:50 GMT
Check out http://toree.incubator.apache.org/. It might help with your needs.

 

From: moshir mikael [mailto:moshir.mikael@gmail.com] 
Sent: Monday, February 29, 2016 5:58 AM
To: Alex Dzhagriev <dzhagr@gmail.com>
Cc: user <user@spark.apache.org>
Subject: Re: Spark Integration Patterns

 

Thanks, I will check that too. However, I just want to use the Spark core RDD API and standard data sources.

 

On Mon, Feb 29, 2016 at 2:54 PM, Alex Dzhagriev <dzhagr@gmail.com> wrote:

Hi Moshir,

 

Regarding streaming, you can take a look at Spark Streaming, the micro-batching framework.
If it satisfies your needs, it comes with a number of integrations, so the source for the
jobs could be Kafka, Flume, or Akka.
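
To give a sense of what that looks like, here is a minimal PySpark Streaming sketch that
reads from Kafka (the broker address and topic name are placeholders, and it assumes the
spark-streaming-kafka package is on the classpath):

    from pyspark import SparkContext
    from pyspark.streaming import StreamingContext
    from pyspark.streaming.kafka import KafkaUtils

    sc = SparkContext(appName="KafkaSketch")
    ssc = StreamingContext(sc, 10)  # 10-second micro-batches

    # Subscribe directly to a Kafka topic; "broker1:9092" and "events" are placeholders.
    stream = KafkaUtils.createDirectStream(
        ssc, ["events"], {"metadata.broker.list": "broker1:9092"})

    # Each record arrives as a (key, value) pair; print the record count per batch.
    stream.map(lambda kv: kv[1]).count().pprint()

    ssc.start()
    ssc.awaitTermination()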

 

Cheers, Alex.

 

On Mon, Feb 29, 2016 at 2:48 PM, moshir mikael <moshir.mikael@gmail.com> wrote:

Hi Alex,

Thanks for the link. I will check it.

Does anyone know of a more streamlined approach?

On Mon, Feb 29, 2016 at 10:28 AM, Alex Dzhagriev <dzhagr@gmail.com> wrote:

Hi Moshir,

 

I think you can use the REST API provided with Spark: https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/rest/RestSubmissionServer.scala

 

Unfortunately, I haven't found any documentation for it, but it looks fine.
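
For what it's worth, here is a rough sketch of what a submission through that endpoint could
look like from Python, assuming a standalone master with its REST server on the default port
6066; the host, jar path, and main class are placeholders:

    import requests

    payload = {
        "action": "CreateSubmissionRequest",
        "appResource": "hdfs:///apps/my-app.jar",        # placeholder jar location
        "mainClass": "com.example.MyApp",                # placeholder main class
        "appArgs": [],
        "clientSparkVersion": "1.6.0",
        "environmentVariables": {"SPARK_ENV_LOADED": "1"},
        "sparkProperties": {
            "spark.app.name": "MyApp",
            "spark.master": "spark://spark-master:6066",  # placeholder master URL
            "spark.submit.deployMode": "cluster",
            "spark.jars": "hdfs:///apps/my-app.jar",
            "spark.driver.supervise": "false",
        },
    }

    # POST to the master's REST submission endpoint; on success the response carries a
    # submissionId that can be polled later via /v1/submissions/status/<submissionId>.
    resp = requests.post("http://spark-master:6066/v1/submissions/create", json=payload)
    print(resp.json())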

Thanks, Alex.

 

On Sun, Feb 28, 2016 at 3:25 PM, mms <moshir.mikael@gmail.com> wrote:

Hi, I cannot find a simple example showing how a typical application can 'connect' to a remote
Spark cluster and interact with it. Let's say I have a Python web application hosted somewhere
outside a Spark cluster, with just Python installed on it. How can I talk to Spark without
using a notebook, or without using SSH to connect to a cluster master node? I know of
spark-submit and spark-shell; however, forking a process on a remote host to execute a shell
script seems like a lot of effort. What are the recommended ways to connect to and query Spark
from a remote client? Thanks!
