spark-user mailing list archives

Subject multiple pyspark instances simultaneously (same time)
Date Thu, 15 Oct 2015 18:01:29 GMT
I am having issues trying to set up Spark to run jobs simultaneously.

I thought I wanted FAIR scheduling?

I used the templated fairscheduler.xml as is. When I start pyspark I see the
three expected pools:
production, test, and default
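For reference, this is roughly what I believe the shipped conf/fairscheduler.xml.template declares (the weights and minShares below are my recollection, so treat them as an assumption) — only two pools are declared, and "default" is created implicitly by Spark. A quick stdlib check of what gets parsed:

```python
import xml.etree.ElementTree as ET

# Assumed contents of conf/fairscheduler.xml.template (values may differ
# from the copy shipped with your Spark version):
TEMPLATE = """<?xml version="1.0"?>
<allocations>
  <pool name="production">
    <schedulingMode>FAIR</schedulingMode>
    <weight>1</weight>
    <minShare>2</minShare>
  </pool>
  <pool name="test">
    <schedulingMode>FIFO</schedulingMode>
    <weight>2</weight>
    <minShare>3</minShare>
  </pool>
</allocations>
"""

def pool_names(xml_text):
    """Return the pool names declared in a fair-scheduler allocations file."""
    root = ET.fromstring(xml_text)
    return [pool.get("name") for pool in root.findall("pool")]

print(pool_names(TEMPLATE))  # "default" does not appear: Spark adds it itself
```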

When I log in as a second user and run pyspark,
I see the expected pools as that user as well.

When I open a web browser to http://master:8080

I see that my first user's application state is RUNNING and my second user's state is WAITING.

So I try putting them both in the production pool, which uses FAIR scheduling.
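In case it matters, this is how I'm assigning jobs to the pool. sc.setLocalProperty is the documented way to pick a fair-scheduler pool for jobs submitted from the current thread; the helper below is just my own wrapper, and the usage lines assume an interactive pyspark shell where `sc` already exists:

```python
# "spark.scheduler.pool" is the documented local property Spark reads
# to decide which fair-scheduler pool a job goes into.
POOL_PROPERTY = "spark.scheduler.pool"

def use_pool(sc, name="production"):
    """Route jobs submitted from this thread into the given pool."""
    sc.setLocalProperty(POOL_PROPERTY, name)

# In the pyspark shell (where `sc` is predefined) I then run, e.g.:
#   use_pool(sc, "production")
#   sc.parallelize(range(1000)).sum()
```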

When I refresh http://master:8080,

the second user's status is still WAITING.

If I try to run something as the second user I get

"Initial job has not accepted any resources"

Maybe fair scheduling is not what I want?

I'm starting pyspark as follows:

pyspark --master spark://master:7077
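Since the second application just sits in WAITING, I'm wondering if the real problem is the first shell grabbing every core on the standalone master, and whether I should be capping each shell's resources instead. Something like this (spark.cores.max and spark.executor.memory are documented Spark configs, but the values here are guesses for my cluster):

```shell
# Cap each interactive shell so two applications can hold executors at once.
pyspark --master spark://master:7077 \
  --conf spark.cores.max=4 \
  --conf spark.executor.memory=2g
```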

I started Spark as follows
