spark-user mailing list archives

From N B <nb.nos...@gmail.com>
Subject Re: Weird worker usage
Date Sat, 26 Sep 2015 03:20:04 GMT
Bryan,

By any chance, are you calling SparkConf.setMaster("local[*]") inside your
application code?
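
A master set explicitly in application code via SparkConf.setMaster overrides the spark.master value supplied on the launch command (-Dspark.master=...). A minimal, dependency-free sketch of that precedence (MasterPrecedence and resolveMaster are illustrative names, not Spark API):

```scala
// Sketch of SparkConf-style master resolution, without a Spark dependency:
// a master hard-coded in application code wins over the -Dspark.master
// system property passed by the driver launch command.
object MasterPrecedence {
  def resolveMaster(explicitMaster: Option[String]): String =
    explicitMaster.getOrElse(              // setMaster(...) in code wins
      sys.props.getOrElse("spark.master",  // else -Dspark.master from launcher
        "local[*]"))                       // else a local fallback

  def main(args: Array[String]): Unit = {
    sys.props("spark.master") = "spark://sparkserver:7077"
    // A hard-coded local[*] shadows the system property:
    println(resolveMaster(Some("local[*]")))
    // Omitting setMaster lets -Dspark.master take effect:
    println(resolveMaster(None))
  }
}
```

If the application does call setMaster("local[*]"), removing that call should let the spark.master value from the launch command through instead.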

Nikunj

On Fri, Sep 25, 2015 at 9:56 AM, Bryan Jeffrey <bryan.jeffrey@gmail.com>
wrote:

> Looking at this further, it appears that my SparkContext is not correctly
> setting the master.  I see the following in the logs:
>
> 15/09/25 16:45:42 INFO DriverRunner: Launch Command:
> "/usr/lib/jvm/java-7-openjdk-amd64/jre/bin/java" "-cp"
> "/spark/spark-1.4.1/sbin/../conf/:/spark/spark-1.4.1/assembly/target/scala-2.10/spark-assembly-1.4.1-hadoop2.2.0.jar:/spark/spark-1.4.1/lib_managed/jars/datanucleus-api-jdo-3.2.6.jar:/spark/spark-1.4.1/lib_managed/jars/datanucleus-rdbms-3.2.9.jar:/spark/spark-1.4.1/lib_managed/jars/datanucleus-core-3.2.10.jar"
> "-Xms512M" "-Xmx512M" "-Dakka.loglevel=WARNING"
> "-Dspark.default.parallelism=6" "-Dspark.rpc.askTimeout=10"
> "-Dspark.app.name=MainClass" "-Dspark.master=spark://sparkserver:7077"
> "-Dspark.driver.supervise=true" "-Dspark.logConf=true"
> "-Dspark.jars=file:/tmp/MainClass-1.0-SNAPSHOT-jar-with-dependencies.jar"
> "-Dspark.streaming.receiver.maxRate=500" "-XX:MaxPermSize=256m"
> "org.apache.spark.deploy.worker.DriverWrapper"
> "akka.tcp://sparkWorker@10.0.0.6:48077/user/Worker"
> "/spark/spark-1.4.1/work/driver-20150925164617-0000/MainClass-1.0-SNAPSHOT-jar-with-dependencies.jar"
> "MainClass" "--checkpoint" "/tmp/sparkcheckpoint" "--broker"
> "kafkaBroker:9092" "--topic" "test" "--numStreams" "9"
> "--threadParallelism" "9"
> 15/09/25 16:45:43 WARN NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
> 15/09/25 16:45:43 INFO SecurityManager: Changing view acls to: root
> 15/09/25 16:45:43 INFO SecurityManager: Changing modify acls to: root
> 15/09/25 16:45:43 INFO SecurityManager: SecurityManager: authentication
> disabled; ui acls disabled; users with view permissions: Set(root); users
> with modify permissions: Set(root)
> 15/09/25 16:45:44 INFO Slf4jLogger: Slf4jLogger started
> 15/09/25 16:45:45 INFO Utils: Successfully started service 'Driver' on
> port 59670.
> 15/09/25 16:45:45 INFO WorkerWatcher: Connecting to worker
> akka.tcp://sparkWorker@10.0.0.6:48077/user/Worker
> 15/09/25 16:45:45 INFO MainClass: MainClass - Setup Logger
> 15/09/25 16:45:45 INFO WorkerWatcher: Successfully connected to
> akka.tcp://sparkWorker@10.0.0.6:48077/user/Worker
> 15/09/25 16:45:45 INFO Checkpoint: Checkpoint directory
> /tmp/sparkcheckpoint does not exist
> 15/09/25 16:45:45 INFO MainClass: Setting up streaming context with
> configuration: org.apache.spark.SparkConf@56057cbf and time window 2000 ms
> 15/09/25 16:45:45 INFO SparkContext: Running Spark version 1.4.1
> 15/09/25 16:45:45 INFO SparkContext: Spark configuration:
> spark.app.name=MainClass
> spark.default.parallelism=6
> spark.driver.supervise=true
> spark.jars=file:/tmp/OinkSpark-1.0-SNAPSHOT-jar-with-dependencies.jar
> spark.logConf=true
> spark.master=local[*]
> spark.rpc.askTimeout=10
> spark.streaming.receiver.maxRate=500
>
> As you can see, despite -Dspark.master=spark://sparkserver:7077 on the
> launch command, the streaming context still registers the master as
> local[*].  Any idea why?
>
> Thank you,
>
> Bryan Jeffrey
>
>
>
