spark-dev mailing list archives

From Niranda Perera <niranda.per...@gmail.com>
Subject Deploying master and worker programmatically in Java
Date Tue, 03 Mar 2015 10:43:13 GMT
Hi,

I want to start a Spark standalone cluster programmatically in Java.

I have been checking these classes,
- org.apache.spark.deploy.master.Master
- org.apache.spark.deploy.worker.Worker

I successfully started a master with this simple main class:

import org.apache.spark.SparkConf;
import org.apache.spark.deploy.master.Master;

public static void main(String[] args) {
    SparkConf conf = new SparkConf();
    Master.startSystemAndActor("localhost", 4500, 8080, conf);
}


But I'm finding it hard to carry out a similar approach for the worker.

Can anyone give an example of how to pass a value for the workerNumber
parameter of the Worker.startSystemAndActor method (from Java)?
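For context, here is a sketch of what I have tried so far. I am assuming the Spark 1.x Scala signature Worker.startSystemAndActor(host, port, webUiPort, cores, memory, masterUrls, workDir, workerNumber: Option[Int], conf), where workerNumber is a Scala Option[Int]; the port numbers, core/memory values, and work directory below are placeholders, and the Scala default arguments are not available from Java, so every parameter is passed explicitly:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.deploy.worker.Worker;
import scala.Option;

public class StartWorker {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf();

        // workerNumber is Option[Int] on the Scala side. From Java it
        // appears as Option<Object> (Int erases to Object), so wrap the
        // value with scala.Option.apply, or pass Option.empty() for None.
        Option<Object> workerNumber = Option.apply(1);

        // URL of the master started above (placeholder host/port).
        String[] masterUrls = {"spark://localhost:4500"};

        Worker.startSystemAndActor(
                "localhost",   // host
                4501,          // worker port (placeholder)
                8081,          // web UI port (placeholder)
                4,             // cores
                1024,          // memory in MB
                masterUrls,
                "work-dir",    // work directory (placeholder)
                workerNumber,
                conf);
    }
}
```

The Option.apply / Option.empty() static forwarders on scala.Option are the usual way to construct a Scala Option from Java, but I am not certain this argument order matches the exact Spark version in use.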

Cheers
-- 
Niranda
