So to be clear, do you want to run one operation in multiple threads within a function, or do you want to run multiple jobs using multiple threads? I am wondering why Python's threading module can't be used. Or have you already given it a try?
Hi Akhil,

The Python wrapper for Spark Job Server did not help me. What I actually need is a PySpark code sample that shows how I can call a function from two threads and execute it simultaneously.

Thanks & Regards,
Meethu M

On Thursday, 14 May 2015 12:38 PM, Akhil Das <email@example.com> wrote:
Thanks
Best Regards

On Thu, May 14, 2015 at 11:10 AM, MEETHU MATHEW <firstname.lastname@example.org> wrote:

Hi all,

Quote: "Inside a given Spark application (SparkContext instance), multiple parallel jobs can run simultaneously if they were submitted from separate threads."

How do I run multiple jobs in one SparkContext using separate threads in PySpark? I found some examples in Scala and Java, but couldn't find Python code. Can anyone help me with a PySpark example?

Thanks & Regards,
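A minimal sketch of one way to do this in PySpark, assuming a single SparkContext shared by driver-side threads and Python's standard threading module (the function name run_job and the app name are only illustrative):

    import threading

    from pyspark import SparkConf, SparkContext

    # One SparkContext shared by all driver-side threads.
    conf = SparkConf().setAppName("parallel-jobs-example")
    sc = SparkContext(conf=conf)

    def run_job(job_id, data):
        # Each action (here, sum()) submitted from this thread becomes a
        # separate Spark job inside the shared SparkContext.
        rdd = sc.parallelize(data)
        result = rdd.map(lambda x: x * x).sum()
        print("job %d finished with result %s" % (job_id, result))

    # Launch two jobs from two separate threads.
    threads = [threading.Thread(target=run_job, args=(i, range(1000)))
               for i in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    sc.stop()

Each action submitted from its own thread is scheduled as an independent job; the GIL only serializes the driver-side submission, while the actual computation runs on the executors. By default the jobs are scheduled FIFO within the application; setting spark.scheduler.mode to FAIR lets concurrent jobs share cluster resources more evenly.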