spark-user mailing list archives

From MEETHU MATHEW <meethu2...@yahoo.co.in>
Subject Re: How to run multiple jobs in one sparkcontext from separate threads in pyspark?
Date Mon, 18 May 2015 06:35:41 GMT
Hi Akhil,

The python wrapper for Spark Job Server did not help me. What I actually need is a pyspark code sample showing how to call a function from two threads and execute them simultaneously.

Thanks & Regards,
Meethu M
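A minimal sketch of the kind of pyspark snippet being asked for, assuming Python's standard threading module; the app name, job bodies, and function names below are illustrative, not from this thread:

    import threading
    from pyspark import SparkContext

    # One shared SparkContext; jobs submitted to it from separate
    # threads can be scheduled by Spark concurrently.
    sc = SparkContext(appName="multi-job-demo")  # app name is illustrative

    def count_evens():
        # Each action (here, count) submits its own Spark job.
        n = sc.parallelize(range(1000000)).filter(lambda x: x % 2 == 0).count()
        print("evens:", n)

    def total():
        s = sc.parallelize(range(1000000)).sum()
        print("sum:", s)

    t1 = threading.Thread(target=count_evens)
    t2 = threading.Thread(target=total)
    t1.start(); t2.start()  # both jobs are submitted from separate threads
    t1.join(); t2.join()

    sc.stop()

Note that the scheduler runs jobs FIFO by default; setting spark.scheduler.mode to FAIR lets concurrently submitted jobs share cluster resources more evenly.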


On Thursday, 14 May 2015 12:38 PM, Akhil Das <akhil@sigmoidanalytics.com> wrote:

Did you happen to have a look at the Spark Job Server? Someone wrote a python wrapper around it; give it a try.

Thanks
Best Regards
On Thu, May 14, 2015 at 11:10 AM, MEETHU MATHEW <meethu2006@yahoo.co.in> wrote:

Hi all,
 Quote "Inside a given Spark application (SparkContext instance), multiple parallel jobs
can run simultaneously if they were submitted from separate threads. " 
How to run multiple jobs in one SPARKCONTEXT using separate threads in pyspark? I found some
examples in scala and java, but couldn't find python code. Can anyone help me with a pyspark
example? 
Thanks & Regards,
Meethu M



  