spark-user mailing list archives

From kundan kumar <iitr.kun...@gmail.com>
Subject ERROR UserGroupInformation: PriviledgedActionException in standalone mode
Date Fri, 16 Jan 2015 08:55:33 GMT
Hi,

I am new to Spark and am trying to set up Spark in standalone mode.

When I start the Spark shell using

./bin/spark-shell --master spark://192.168.1.225:7077

I get the error shown in the logs below.

I am using Spark version 1.2.0.
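
As a side note, one quick sanity check I can think of is to confirm that the master port is reachable from the machine where the shell and executors run (assuming netcat is available there):

nc -zv 192.168.1.225 7077

If that reports a successful connection, a firewall blocking port 7077 can at least be ruled out.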

My spark-env.sh file looks like this:

export STANDALONE_SPARK_MASTER_HOST=essex-spark1
export SPARK_MASTER_IP=192.168.1.225
export SPARK_MASTER_PORT=7077
export MASTER=spark://${SPARK_MASTER_IP}:${SPARK_MASTER_PORT}
export SPARK_LOCAL_IP=192.168.1.225
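
For reference, a minimal way to double-check how those variables expand (assuming bash, and that the file sits under conf/ of the Spark installation):

source conf/spark-env.sh
echo "$MASTER"    # expected output: spark://192.168.1.225:7077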


Please help me fix this error.

The master UI looks like this:

[Inline image: Spark master web UI]


The error logs are

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/01/16 14:08:38 INFO CoarseGrainedExecutorBackend: Registered signal handlers for [TERM, HUP, INT]
15/01/16 14:08:38 INFO SecurityManager: Changing view acls to: spuser
15/01/16 14:08:38 INFO SecurityManager: Changing modify acls to: spuser
15/01/16 14:08:38 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(spuser); users with modify permissions: Set(spuser)
15/01/16 14:08:38 INFO Slf4jLogger: Slf4jLogger started
15/01/16 14:08:38 INFO Remoting: Starting remoting
15/01/16 14:08:38 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://driverPropsFetcher@essex-spark2:54017]
15/01/16 14:08:38 INFO Utils: Successfully started service 'driverPropsFetcher' on port 54017.
15/01/16 14:08:38 WARN Remoting: Tried to associate with unreachable remote address [akka.tcp://sparkDriver@essex-spark1:52856]. Address is now gated for 5000 ms, all messages to this address will be delivered to dead letters. Reason: Connection refused: essex-spark1/192.168.1.225:52856
15/01/16 14:09:08 ERROR UserGroupInformation: PriviledgedActionException as:spuser cause:java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
Exception in thread "main" java.lang.reflect.UndeclaredThrowableException: Unknown exception in doAs
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1134)
	at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:59)
	at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:115)
	at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:163)
	at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
Caused by: java.security.PrivilegedActionException: java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
	... 4 more
Caused by: java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
	at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
	at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
	at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
	at scala.concurrent.Await$.result(package.scala:107)
	at org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$run$1.apply$mcV$sp(CoarseGrainedExecutorBackend.scala:127)
	at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:60)
	at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:59)
	... 7 more
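
Since the warning above shows a refused connection back to sparkDriver@essex-spark1, perhaps it is also worth checking from the worker machine (essex-spark2) that the hostname resolves to the address I expect; a simple check with standard Linux tools would be:

getent hosts essex-spark1
ping -c 1 essex-spark1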
