spark-user mailing list archives

From Davies Liu <dav...@databricks.com>
Subject Re: How to run multiple jobs in one sparkcontext from separate threads in pyspark?
Date Mon, 18 May 2015 21:12:52 GMT
SparkContext can be used from multiple threads (Spark Streaming works
with multiple threads), for example:

import threading
import time

def show(x):
    time.sleep(1)
    print(x)

def job():
    # sc is the driver's existing, shared SparkContext
    sc.parallelize(range(100)).foreach(show)

threading.Thread(target=job).start()

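To launch several jobs concurrently and collect their results, the same
pattern extends to a thread pool. A minimal sketch, runnable without Spark:
run_job is a stand-in for an action on the shared SparkContext (in a real
driver it would be something like sc.parallelize(range(n)).count()); here it
just sums a range so the threading pattern itself can be demonstrated.

```python
# Sketch: submit several jobs from separate threads in one driver program.
# run_job is a placeholder for a Spark action on the shared context.
from concurrent.futures import ThreadPoolExecutor

def run_job(n):
    # In a real pyspark driver: return sc.parallelize(range(n)).count()
    return sum(range(n))

with ThreadPoolExecutor(max_workers=2) as pool:
    # Each submit() runs run_job in its own thread; with a real
    # SparkContext, each call would become a separate Spark job.
    futures = [pool.submit(run_job, n) for n in (100, 200)]
    results = [f.result() for f in futures]  # blocks until both finish

print(results)  # [4950, 19900]
```

Results come back in submission order because f.result() is called on the
futures in the order they were created, regardless of which thread finished
first.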

On Mon, May 18, 2015 at 12:34 AM, ayan guha <guha.ayan@gmail.com> wrote:
> Hi
>
> So to be clear, do you want to run one operation in multiple threads within
> a function, or do you want to run multiple jobs using multiple threads? I am
> wondering why the Python threading module can't be used. Or have you already
> given it a try?
>
> On 18 May 2015 16:39, "MEETHU MATHEW" <meethu2006@yahoo.co.in> wrote:
>>
>> Hi Akhil,
>>
>> The python wrapper for Spark Job Server did not help me. What I actually
>> need is a pyspark code sample showing how I can call a function from 2
>> threads and execute it simultaneously.
>>
>> Thanks & Regards,
>> Meethu M
>>
>>
>>
>> On Thursday, 14 May 2015 12:38 PM, Akhil Das <akhil@sigmoidanalytics.com>
>> wrote:
>>
>>
>> Did you happen to have a look at the Spark Job Server? Someone wrote a
>> python wrapper around it; give it a try.
>>
>> Thanks
>> Best Regards
>>
>> On Thu, May 14, 2015 at 11:10 AM, MEETHU MATHEW <meethu2006@yahoo.co.in>
>> wrote:
>>
>> Hi all,
>>
>>  Quote
>>  "Inside a given Spark application (SparkContext instance), multiple
>> parallel jobs can run simultaneously if they were submitted from separate
>> threads. "
>>
>> How can I run multiple jobs in one SparkContext using separate threads in
>> pyspark? I found some examples in Scala and Java, but couldn't find Python
>> code. Can anyone help me with a pyspark example?
>>
>> Thanks & Regards,
>> Meethu M
>>
>>
>>
>>
>

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

