spark-user mailing list archives

From John Yost <>
Subject two spark-shells spark on mesos not working
Date Tue, 22 Nov 2016 12:52:44 GMT
Hi Everyone,

There is probably an obvious answer to this, but I'm not sure what it is. :)

I am attempting to launch 2..n spark shells using Mesos as the master (this
is to support 1..n researchers running pyspark jobs on our data). I can
launch two or more spark shells without any problem, but when I attempt
any operation that requires a Spark executor outside the driver
program, such as:

val numbers = Range(1, 1000)
val pNumbers = sc.parallelize(numbers)

I get the dreaded message:
TaskSchedulerImpl: Initial job has not accepted any resources; check your
cluster UI to ensure that workers are registered and have sufficient resources

I confirmed that both spark shells are listed as separate, uniquely-named
Mesos frameworks and that there are plenty of CPU cores and memory
available on our cluster.
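[Editor's note, not part of the original message: one common cause of this symptom is that, in coarse-grained Mesos mode, the first spark-shell can reserve every core it is offered, leaving nothing for later frameworks. Whether that is the cause here is an assumption; if so, capping each shell with `spark.cores.max` is a typical mitigation. A minimal sketch, with placeholder master URL and resource values:]

```shell
# Hypothetical launch command -- the master URL, core cap, and memory
# values below are placeholders, not taken from the original message.
# Capping spark.cores.max keeps one shell from holding all offered cores,
# so several spark-shell frameworks can run on the same Mesos cluster.
./bin/spark-shell \
  --master mesos://zk://mesos-master:2181/mesos \
  --conf spark.cores.max=4 \
  --conf spark.executor.memory=2g
```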

I am using Spark 2.0.1 on Mesos 0.28.1. Any ideas that y'all may have would
be very much appreciated.

Thanks! :)
