spark-user mailing list archives

From Kyounghyun Park <kyounghyun.p...@gmail.com>
Subject [No Subject]
Date Sat, 17 Jan 2015 12:12:45 GMT
Hi,

I'm running Spark 1.2 in yarn-client mode (using Hadoop 2.6.0).
On a VirtualBox VM, I can run "spark-shell --master yarn-client" without any error.
However, on a physical machine, I get the error below.

Does anyone know why this happens?
Any help would be appreciated.
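In case it helps, this is exactly how I launch the shell, plus the standard way to pull the application master's own log afterwards (the application id below is just a placeholder, not from my run):

```shell
# Launch the REPL against YARN in client mode (same command on both machines)
spark-shell --master yarn-client

# After the failure, fetch the application master's log to see why the
# YARN app exited with state FINISHED; the id here is a placeholder.
yarn logs -applicationId application_1421000000000_0001
```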


Thanks,
Kyounghyun

--------------------------------------------------------------------------------------------------------------------
15/01/17 19:34:42 INFO netty.NettyBlockTransferService: Server created on 49709
15/01/17 19:34:42 INFO storage.BlockManagerMaster: Trying to register BlockManager
15/01/17 19:34:42 INFO storage.BlockManagerMasterActor: Registering block manager janus:49709 with 265.1 MB RAM, BlockManagerId(<driver>, janus, 49709)
15/01/17 19:34:42 INFO storage.BlockManagerMaster: Registered BlockManager
15/01/17 19:34:47 WARN remote.ReliableDeliverySupervisor: Association with remote system [akka.tcp://sparkYarnAM@192.168.123.178:60626] has failed, address is now gated for [5000] ms. Reason is: [Disassociated].
15/01/17 19:34:47 ERROR cluster.YarnClientSchedulerBackend: Yarn application has already exited with state FINISHED!
15/01/17 19:34:47 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
15/01/17 19:34:47 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/,null}
15/01/17 19:34:47 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static,null}
15/01/17 19:34:47 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/01/17 19:34:47 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
15/01/17 19:34:47 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/json,null}
15/01/17 19:34:47 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors,null}
15/01/17 19:34:47 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment/json,null}
15/01/17 19:34:47 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment,null}
15/01/17 19:34:47 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
15/01/17 19:34:47 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
15/01/17 19:34:47 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/json,null}
15/01/17 19:34:47 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage,null}
15/01/17 19:34:47 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
15/01/17 19:34:47 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
15/01/17 19:34:47 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
15/01/17 19:34:47 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
15/01/17 19:34:47 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/json,null}
15/01/17 19:34:47 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages,null}
15/01/17 19:34:47 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
15/01/17 19:34:47 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
15/01/17 19:34:47 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
15/01/17 19:34:47 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs,null}
15/01/17 19:34:47 INFO ui.SparkUI: Stopped Spark web UI at http://janus:4040
15/01/17 19:34:47 INFO scheduler.DAGScheduler: Stopping DAGScheduler
15/01/17 19:34:47 INFO cluster.YarnClientSchedulerBackend: Shutting down all executors
15/01/17 19:34:47 INFO cluster.YarnClientSchedulerBackend: Asking each executor to shut down
15/01/17 19:34:47 INFO cluster.YarnClientSchedulerBackend: Stopped
15/01/17 19:34:48 INFO spark.MapOutputTrackerMasterActor: MapOutputTrackerActor stopped!
15/01/17 19:34:48 INFO storage.MemoryStore: MemoryStore cleared
15/01/17 19:34:48 INFO storage.BlockManager: BlockManager stopped
15/01/17 19:34:48 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
15/01/17 19:34:48 INFO spark.SparkContext: Successfully stopped SparkContext
15/01/17 19:34:48 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
15/01/17 19:34:48 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
15/01/17 19:34:48 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
15/01/17 19:35:07 INFO cluster.YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after waiting maxRegisteredResourcesWaitingTime: 30000(ms)
java.lang.NullPointerException
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:497)
	at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:986)
	at $iwC$$iwC.<init>(<console>:9)
	at $iwC.<init>(<console>:18)
	at <init>(<console>:20)
	at .<init>(<console>:24)
	at .<clinit>(<console>)
	at .<init>(<console>:7)
	at .<clinit>(<console>)
	at $print(<console>)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)
	at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)
	at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)
	at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)
	at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)
	at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
	at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
	at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
	at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:123)
	at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:122)
	at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:270)
	at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:122)
	at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:60)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:945)
	at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:147)
	at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:60)
	at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
	at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:60)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:962)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
	at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
	at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
	at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
	at org.apache.spark.repl.Main$.main(Main.scala:31)
	at org.apache.spark.repl.Main.main(Main.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
