spark-user mailing list archives

From 林武康 <vboylin1...@gmail.com>
Subject Can SparkContext be shared across nodes/drivers
Date Sun, 21 Sep 2014 16:21:12 GMT
Hi all,
As far as I know, a SparkContext instance takes charge of the cluster resources the master assigns to it, and it can hardly be shared between different SparkContexts. Meanwhile, scheduling between applications is not easy either.
To address this without introducing an extra resource scheduling system such as YARN/Mesos, I propose creating a special SparkContext that can be shared across nodes/drivers, that is, jobs could be submitted from different nodes while sharing the same RDD definitions and task scheduler.
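To make the idea concrete, here is a rough sketch of what I have in mind (just an illustration, not working cross-node code): a single long-running driver owns one SparkContext, and callers, simulated here as local threads, submit jobs against the same cached RDD. The RPC/REST layer that would let other nodes reach this driver is left out and is purely my assumption of how it could be wired up.

import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: one long-running driver JVM owns a single SparkContext.
// In the real proposal, clients on other nodes would send job requests
// to this driver over some RPC layer instead of creating their own
// contexts; here the "clients" are just local threads for illustration.
object SharedContextServer {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("shared-context-sketch")
    val sc   = new SparkContext(conf)

    // An RDD definition that all submitted jobs reuse.
    val sharedRdd = sc.parallelize(1 to 1000000).cache()

    // SparkContext accepts job submissions from multiple threads, so the
    // scheduler can interleave these jobs. Cross-node sharing would need
    // a network front end (e.g. a REST endpoint) calling into this JVM.
    val clients = (1 to 4).map { id =>
      new Thread(new Runnable {
        def run(): Unit = {
          val result = sharedRdd.map(_ * id).sum()
          println(s"client $id got $result")
        }
      })
    }
    clients.foreach(_.start())
    clients.foreach(_.join())

    sc.stop()
  }
}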
Is this idea valuable? Is it possible to implement, or is it worth nothing?
 
Thanks for any advice.
 
lin wukang