spark-user mailing list archives

From ayan guha <guha.a...@gmail.com>
Subject Re: How to run multiple jobs in one sparkcontext from separate threads in pyspark?
Date Mon, 18 May 2015 07:34:46 GMT
Hi

So to be clear, do you want to run one operation in multiple threads within
a function, or do you want to run multiple jobs using multiple threads? I am
wondering why the Python threading module can't be used. Or have you already
given it a try?
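
For reference, here is a minimal sketch of that approach: one shared
SparkContext, with two jobs submitted from separate Python threads using the
standard threading module. The function run_job and the example RDDs are
illustrative names, not something from this thread.

import threading
from pyspark import SparkContext

sc = SparkContext(appName="multi-job-demo")

def run_job(rdd):
    # Each action submits an independent job to the shared SparkContext;
    # jobs submitted from separate threads can be scheduled concurrently.
    print(rdd.count())

rdd1 = sc.parallelize(range(100000))
rdd2 = sc.parallelize(range(100000))

threads = [threading.Thread(target=run_job, args=(r,)) for r in (rdd1, rdd2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

sc.stop()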
On 18 May 2015 16:39, "MEETHU MATHEW" <meethu2006@yahoo.co.in> wrote:

> Hi Akhil,
>
> The Python wrapper for Spark Job Server did not help me. What I actually
> need is a pyspark code sample showing how I can call a function from two
> threads and execute it simultaneously.
>
> Thanks & Regards,
> Meethu M
>
>
>
>   On Thursday, 14 May 2015 12:38 PM, Akhil Das <akhil@sigmoidanalytics.com>
> wrote:
>
>
> Did you happen to have a look at the Spark Job Server?
> <https://github.com/ooyala/spark-jobserver> Someone wrote a Python wrapper
> <https://github.com/wangqiang8511/spark_job_manager> around it; give it a
> try.
>
> Thanks
> Best Regards
>
> On Thu, May 14, 2015 at 11:10 AM, MEETHU MATHEW <meethu2006@yahoo.co.in>
> wrote:
>
> Hi all,
>
> Quote:
> "Inside a given Spark application (SparkContext instance), multiple
> parallel jobs can run simultaneously if they were submitted from separate
> threads."
>
> How do I run multiple jobs in one SparkContext using separate threads in
> pyspark? I found some examples in Scala and Java, but couldn't find Python
> code. Can anyone help me with a pyspark example?
>
> Thanks & Regards,
> Meethu M
>
