hadoop-mapreduce-dev mailing list archives

From Haoyuan Li <haoyuan...@gmail.com>
Subject Re: Can't find main class when run ResourceManager or NodeManager
Date Fri, 12 Aug 2011 05:31:26 GMT
Praveen,

Thank you for your reply. The CLASSPATH generated by
'/trunk/hadoop-common/src/main/bin/hadoop' and 'hadoop-config.sh' on my
machine is a mess.

From my understanding, 'hadoop-config.sh' still generates the CLASSPATH
based on the build structure from ant, while we are now using mvn. Could
you please tell me the location of your 'hadoop' and 'hadoop-config.sh'
scripts? Or maybe you could send me a copy of your hadoop-config.sh, which
I suppose should be the same as mine. I am not sure whether this could be
the reason or not...
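[For what it's worth, one way to spot stale ant-era paths in the generated
CLASSPATH is to split it on ':' and check each entry on disk. This is only a
sketch for a POSIX shell; the helper name 'classpath_missing' is my own.]

```shell
# classpath_missing: print each entry of a colon-separated classpath
# that does not exist on disk. Stale ant-style build paths emitted by
# hadoop-config.sh would show up here after an mvn build.
classpath_missing() {
  printf '%s\n' "$1" | tr ':' '\n' | while IFS= read -r entry; do
    case "$entry" in ''|*'*'*) continue ;; esac  # skip empty/wildcard entries
    [ -e "$entry" ] || printf 'MISSING: %s\n' "$entry"
  done
}

# Example with a made-up classpath; in practice feed it the output of
# './hadoop-common/src/main/bin/hadoop classpath'.
classpath_missing "/etc:/no/such/build/classes"
```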

Any help will be appreciated!

Best,

Haoyuan

On Thu, Aug 11, 2011 at 7:32 PM, Praveen Sripati
<praveensripati@gmail.com> wrote:

> Haoyuan,
>
> RunJar is in hadoop-common-0.23.0-SNAPSHOT.jar. Do a 'hadoop classpath'
> and check whether that jar file is in one of the classpath locations.
>
> Similarly, running 'yarn classpath' will provide the classpath for running
> the yarn daemons (RM, NM and HS).
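[The jar check Praveen suggests can be scripted. A sketch, assuming a POSIX
shell; the helper name 'find_class_jar' is mine. Jar files are zip archives,
and zip stores entry names as plain text, so grepping each jar for the class
path is enough to locate it.]

```shell
# find_class_jar: print each jar on a colon-separated classpath that
# contains the given class entry. Zip archives store entry names
# uncompressed, so a plain grep on the jar bytes is sufficient.
find_class_jar() {
  cls="$1" cp="$2"
  printf '%s\n' "$cp" | tr ':' '\n' | while IFS= read -r entry; do
    case "$entry" in
      *.jar) grep -q "$cls" "$entry" 2>/dev/null && printf '%s\n' "$entry" ;;
    esac
  done
}

# e.g.: find_class_jar org/apache/hadoop/util/RunJar "$(bin/hadoop classpath)"
```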
>
> Thanks,
> Praveen
>
> On Fri, Aug 12, 2011 at 5:18 AM, Haoyuan Li <haoyuan.li@gmail.com> wrote:
>
> > Hi Mahadev,
> >
> > This works for me. Thank you.
> >
> > However, I ran into another issue when I tried to run an example.
> >
> > Here,
> > http://svn.apache.org/repos/asf/hadoop/common/branches/MR-279/mapreduce/INSTALL,
> > it says:
> >
> > $HADOOP_COMMON_HOME/bin/hadoop jar
> > $HADOOP_MAPRED_HOME/build/hadoop-mapred-examples-0.22.0-SNAPSHOT.jar
> > randomwriter -Dmapreduce.job.user.name=$USER
> > -Dmapreduce.clientfactory.class.name=org.apache.hadoop.mapred.YarnClientFactory
> > -Dmapreduce.randomwriter.bytespermap=10000 -Ddfs.blocksize=536870912
> > -Ddfs.block.size=536870912 -libjars
> > $HADOOP_YARN_INSTALL/hadoop-mapreduce-1.0-SNAPSHOT/modules/hadoop-mapreduce-client-jobclient-1.0-SNAPSHOT.jar
> > output
> >
> > However, there is no /bin folder in $HADOOP_COMMON_HOME; I found the
> > script at $HADOOP_COMMON_HOME/src/main/bin/hadoop instead. When I executed
> > the command:
> >
> > ./hadoop-common/src/main/bin/hadoop jar
> > /home/haoyuan/hadoop/trunk/mapreduce/build/hadoop-mapred-examples-0.23.0-SNAPSHOT.jar
> > randomwriter -Dmapreduce.job.user.name=haoyuan
> > -Dmapreduce.clientfactory.class.name=org.apache.hadoop.mapred.YarnClientFactory
> > -Dmapreduce.randomwriter.bytespermap=10000 -Ddfs.blocksize=536870912
> > -Ddfs.block.size=536870912 -libjars
> > /home/haoyuan/hadoop/trunk/hadoop-mapreduce-1.0-SNAPSHOT/modules/hadoop-mapreduce-client-jobclient-1.0-SNAPSHOT.jar
> > output
> >
> > I got another exception:
> > ===========================================
> > haoyuan@hya:~/hadoop/trunk$ ./hadoop-common/src/main/bin/hadoop jar
> > /home/haoyuan/hadoop/trunk/mapreduce/build/hadoop-mapred-examples-0.23.0-SNAPSHOT.jar
> > randomwriter -Dmapreduce.job.user.name=haoyuan
> > -Dmapreduce.clientfactory.class.name=org.apache.hadoop.mapred.YarnClientFactory
> > -Dmapreduce.randomwriter.bytespermap=10000 -Ddfs.blocksize=536870912
> > -Ddfs.block.size=536870912 -libjars
> > /home/haoyuan/hadoop/trunk/hadoop-mapreduce-1.0-SNAPSHOT/modules/hadoop-mapreduce-client-jobclient-1.0-SNAPSHOT.jar
> > output
> > Exception in thread "main" java.lang.NoClassDefFoundError:
> > org/apache/hadoop/util/RunJar
> > Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.RunJar
> >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> >         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > Could not find the main class: org.apache.hadoop.util.RunJar.  Program will
> > exit.
> > ===========================================
> >
> > Any clue?
> >
> > Thank you.
> >
> > Best,
> >
> > Haoyuan
> >
> >
> >
> >
> >
> >
> >
> > On Thu, Aug 11, 2011 at 3:47 PM, Mahadev Konar <mahadev@hortonworks.com>
> > wrote:
> >
> > > Haoyuan,
> > >  That's an issue with having the avro 1.3.2 jar in the classpath. Please
> > > check whether you have the 1.3.2 jar in the classpath and remove
> > > it. We use the avro 1.4 jar, which isn't compatible with 1.3.2.
> > >
> > > Hope that helps!
> > > mahadev
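[To spot both avro jars sitting on the same classpath, the entries can be
filtered by filename. A sketch for a POSIX shell; the helper name
'list_jars_matching' is my own.]

```shell
# list_jars_matching: print the jars on a colon-separated classpath whose
# basename starts with the given prefix -- useful for spotting
# avro-1.3.2.jar and avro-1.4.x.jar side by side.
list_jars_matching() {
  prefix="$1" cp="$2"
  printf '%s\n' "$cp" | tr ':' '\n' | grep "/${prefix}[^/]*\.jar\$"
}

# Example with a made-up classpath; in practice use "$(bin/yarn classpath)".
list_jars_matching avro- "/x/lib/avro-1.3.2.jar:/x/lib/avro-1.4.1.jar:/x/lib/junit-4.8.2.jar"
```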
> > >
> > > On Thu, Aug 11, 2011 at 3:36 PM, Haoyuan Li <haoyuan.li@gmail.com>
> > wrote:
> > > > I downloaded the latest version (r1156719) and then redid everything
> > > > from scratch. Now the nodemanager runs well. However, I got into the
> > > > same state as Matei when I tried to run the ResourceManager. Any help
> > > > will be appreciated.
> > > > Best,
> > > > Haoyuan
> > > > The following is the output.
> > > > ============ The output from .out log file =================
> > > > Exception in thread "main" java.lang.IllegalStateException: For this
> > > > operation, current State must be STARTED instead of INITED
> > > >         at org.apache.hadoop.yarn.service.AbstractService.ensureCurrentState(AbstractService.java:101)
> > > >         at org.apache.hadoop.yarn.service.AbstractService.stop(AbstractService.java:69)
> > > >         at org.apache.hadoop.yarn.server.resourcemanager.amlauncher.ApplicationMasterLauncher.stop(ApplicationMasterLauncher.java:90)
> > > >         at org.apache.hadoop.yarn.service.CompositeService.stop(CompositeService.java:89)
> > > >         at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.stop(ResourceManager.java:423)
> > > >         at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:495)
> > > > =====================================================
> > > > =========== the output from .log log file ===================
> > > > 2011-08-11 15:25:38,357 INFO
> > > > org.apache.hadoop.yarn.server.resourcemanager.ResourceManager:
> Resource
> > > > Manager is starting...
> > > > 2011-08-11 15:25:38,878 INFO
> > > org.apache.hadoop.yarn.event.AsyncDispatcher:
> > > > Registering class
> > > >
> > >
> >
> org.apache.hadoop.yarn.server.resourcemanager.scheduler.event.SchedulerEventType
> > > > for class
> > > >
> > >
> >
> org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$SchedulerEventDispatcher
> > > > 2011-08-11 15:25:38,879 INFO
> > > org.apache.hadoop.yarn.event.AsyncDispatcher:
> > > > Registering class
> > > > org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppEventType
> for
> > > class
> > > >
> > >
> >
> org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$ApplicationEventDispatcher
> > > > 2011-08-11 15:25:38,880 INFO
> > > org.apache.hadoop.yarn.event.AsyncDispatcher:
> > > > Registering class
> > > >
> > >
> >
> org.apache.hadoop.yarn.server.resourcemanager.rmapp.attempt.RMAppAttemptEventType
> > > > for class
> > > >
> > >
> >
> org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$ApplicationAttemptEventDispatcher
> > > > 2011-08-11 15:25:38,881 INFO
> > > org.apache.hadoop.yarn.event.AsyncDispatcher:
> > > > Registering class
> > > > org.apache.hadoop.yarn.server.resourcemanager.rmnode.RMNodeEventType
> > for
> > > > class
> > > >
> > >
> >
> org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$NodeEventDispatcher
> > > > 2011-08-11 15:25:38,904 INFO
> > > org.apache.hadoop.yarn.event.AsyncDispatcher:
> > > > Registering class
> > > >
> > >
> >
> org.apache.hadoop.yarn.server.resourcemanager.amlauncher.AMLauncherEventType
> > > > for class
> > > >
> > >
> >
> org.apache.hadoop.yarn.server.resourcemanager.amlauncher.ApplicationMasterLauncher
> > > > 2011-08-11 15:25:38,907 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > > Service:Dispatcher is inited.
> > > > 2011-08-11 15:25:38,907 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > >
> > >
> >
> Service:org.apache.hadoop.yarn.server.resourcemanager.rmcontainer.ContainerAllocationExpirer
> > > > is inited.
> > > > 2011-08-11 15:25:38,935 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > > Service:AMLivelinessMonitor is inited.
> > > > 2011-08-11 15:25:38,936 INFO org.apache.hadoop.util.HostsFileReader:
> > > > Refreshing hosts (include/exclude) list
> > > > 2011-08-11 15:25:38,936 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > >
> Service:org.apache.hadoop.yarn.server.resourcemanager.NodesListManager
> > is
> > > > inited.
> > > > 2011-08-11 15:25:38,936 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > >
> > >
> >
> Service:org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$SchedulerEventDispatcher
> > > > is inited.
> > > > 2011-08-11 15:25:38,936 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > > Service:NMLivelinessMonitor is inited.
> > > > 2011-08-11 15:25:38,940 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > >
> > >
> >
> Service:org.apache.hadoop.yarn.server.resourcemanager.ResourceTrackerService
> > > > is inited.
> > > > 2011-08-11 15:25:38,941 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > >
> > >
> >
> Service:org.apache.hadoop.yarn.server.resourcemanager.ApplicationMasterService
> > > > is inited.
> > > > 2011-08-11 15:25:38,941 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > > Service:org.apache.hadoop.yarn.server.resourcemanager.ClientRMService
> > is
> > > > inited.
> > > > 2011-08-11 15:25:38,941 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > > Service:org.apache.hadoop.yarn.server.resourcemanager.AdminService is
> > > > inited.
> > > > 2011-08-11 15:25:38,965 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > >
> > >
> >
> Service:org.apache.hadoop.yarn.server.resourcemanager.amlauncher.ApplicationMasterLauncher
> > > > is inited.
> > > > 2011-08-11 15:25:38,965 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > > Service:ResourceManager is inited.
> > > > 2011-08-11 15:25:39,057 INFO org.mortbay.log: Logging to
> > > > org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
> > > > org.mortbay.log.Slf4jLog
> > > > 2011-08-11 15:25:39,125 INFO org.apache.hadoop.http.HttpServer: Added
> > > global
> > > > filter 'safety'
> > > (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
> > > > 2011-08-11 15:25:39,128 INFO org.apache.hadoop.http.HttpServer: Added
> > > filter
> > > > static_user_filter
> > > >
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter)
> > > to
> > > > context WepAppsContext
> > > > 2011-08-11 15:25:39,128 INFO org.apache.hadoop.http.HttpServer: Added
> > > filter
> > > > static_user_filter
> > > >
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter)
> > > to
> > > > context static
> > > > 2011-08-11 15:25:39,128 INFO org.apache.hadoop.http.HttpServer: Added
> > > filter
> > > > static_user_filter
> > > >
> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter)
> > > to
> > > > context logs
> > > > 2011-08-11 15:25:39,131 INFO org.apache.hadoop.http.HttpServer: Added
> > > global
> > > > filter 'guice' (class=com.google.inject.servlet.GuiceFilter)
> > > > 2011-08-11 15:25:39,131 INFO org.apache.hadoop.http.HttpServer: Port
> > > > returned by webServer.getConnectors()[0].getLocalPort() before open()
> > is
> > > -1.
> > > > Opening the listener on 8088
> > > > 2011-08-11 15:25:39,136 INFO org.apache.hadoop.http.HttpServer:
> > > > listener.getLocalPort() returned 8088
> > > > webServer.getConnectors()[0].getLocalPort() returned 8088
> > > > 2011-08-11 15:25:39,136 INFO org.apache.hadoop.http.HttpServer: Jetty
> > > bound
> > > > to port 8088
> > > > 2011-08-11 15:25:39,136 INFO org.mortbay.log: jetty-6.1.26
> > > > 2011-08-11 15:25:39,160 INFO org.mortbay.log: Extract
> > > >
> > >
> >
> jar:file:/home/haoyuan/hadoop/trunk/hadoop-mapreduce-1.0-SNAPSHOT/modules/yarn-common-1.0-SNAPSHOT.jar!/webapps/yarn
> > > > to /tmp/Jetty_0_0_0_0_8088_yarn____yzuv81/webapp
> > > > 2011-08-11 15:25:39,238 INFO org.mortbay.log: NO JSP Support for /,
> did
> > > not
> > > > find org.apache.jasper.servlet.JspServlet
> > > > 2011-08-11 15:25:39,319 INFO org.mortbay.log: Started
> > > > SelectChannelConnector@0.0.0.0:8088
> > > > 2011-08-11 15:25:39,320 INFO org.apache.hadoop.yarn.webapp.WebApps:
> Web
> > > app
> > > > /yarn started at 8088
> > > > 2011-08-11 15:25:39,568 INFO org.apache.hadoop.yarn.webapp.WebApps:
> > > > Registered webapp guice modules
> > > > 2011-08-11 15:25:39,589 WARN
> > > org.apache.hadoop.metrics2.impl.MetricsConfig:
> > > > Cannot locate configuration: tried
> > > > hadoop-metrics2-resourcemanager.properties,hadoop-metrics2.properties
> > > > 2011-08-11 15:25:39,626 INFO
> > > > org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
> > > period
> > > > at 10 second(s).
> > > > 2011-08-11 15:25:39,626 INFO
> > > > org.apache.hadoop.metrics2.impl.MetricsSystemImpl: ResourceManager
> > > metrics
> > > > system started
> > > > 2011-08-11 15:25:39,627 INFO
> > > > org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Registered source
> > > > QueueMetrics,q0=default
> > > > 2011-08-11 15:25:39,627 INFO
> > > > org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Registered source
> > > > UgiMetrics
> > > > 2011-08-11 15:25:39,628 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > > Service:Dispatcher is started.
> > > > 2011-08-11 15:25:39,629 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > >
> > >
> >
> Service:org.apache.hadoop.yarn.server.resourcemanager.rmcontainer.ContainerAllocationExpirer
> > > > is started.
> > > > 2011-08-11 15:25:39,629 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > > Service:AMLivelinessMonitor is started.
> > > > 2011-08-11 15:25:39,629 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > >
> Service:org.apache.hadoop.yarn.server.resourcemanager.NodesListManager
> > is
> > > > started.
> > > > 2011-08-11 15:25:39,629 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > >
> > >
> >
> Service:org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$SchedulerEventDispatcher
> > > > is started.
> > > > 2011-08-11 15:25:39,629 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > > Service:NMLivelinessMonitor is started.
> > > > 2011-08-11 15:25:39,629 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > >
> > >
> >
> Service:org.apache.hadoop.yarn.server.resourcemanager.ResourceTrackerService
> > > > is started.
> > > > 2011-08-11 15:25:39,630 INFO org.apache.hadoop.yarn.ipc.YarnRPC:
> > Creating
> > > > YarnRPC for null
> > > > 2011-08-11 15:25:39,631 INFO
> org.apache.hadoop.yarn.ipc.HadoopYarnRPC:
> > > > Creating a HadoopYarnProtoRpc server for protocol interface
> > > > org.apache.hadoop.yarn.server.api.ResourceTracker with 10 handlers
> > > > 2011-08-11 15:25:39,631 INFO
> org.apache.hadoop.yarn.ipc.HadoopYarnRPC:
> > > > Configured SecurityInfo class name is
> > > > org.apache.hadoop.yarn.server.RMNMSecurityInfoClass
> > > > 2011-08-11 15:25:44,651 INFO org.apache.hadoop.ipc.Server: Starting
> > > Socket
> > > > Reader #1 for port 8025
> > > > 2011-08-11 15:25:44,663 INFO
> > > > org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Registered source
> > > > RpcActivityForPort8025
> > > > 2011-08-11 15:25:44,668 INFO
> > > > org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Registered source
> > > > RpcDetailedActivityForPort8025
> > > > 2011-08-11 15:25:44,670 ERROR
> > > > org.apache.hadoop.yarn.service.CompositeService: Error starting services
> > > > ResourceManager
> > > > java.lang.NoSuchMethodError: org.apache.avro.ipc.Server.start()V
> > > >         at org.apache.hadoop.yarn.server.resourcemanager.ResourceTrackerService.start(ResourceTrackerService.java:128)
> > > >         at org.apache.hadoop.yarn.service.CompositeService.start(CompositeService.java:68)
> > > >         at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.start(ResourceManager.java:392)
> > > >         at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:491)
> > > > 2011-08-11 15:25:44,670 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > > Service:NMLivelinessMonitor is stopped.
> > > > 2011-08-11 15:25:44,671 INFO
> > > > org.apache.hadoop.yarn.util.AbstractLivelinessMonitor:
> > > NMLivelinessMonitor
> > > > thread interrupted
> > > > 2011-08-11 15:25:44,671 ERROR
> > > > org.apache.hadoop.yarn.server.resourcemanager.ResourceManager:
> > Returning,
> > > > interrupted : java.lang.InterruptedException
> > > > 2011-08-11 15:25:44,671 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > >
> > >
> >
> Service:org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$SchedulerEventDispatcher
> > > > is stopped.
> > > > 2011-08-11 15:25:44,671 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > >
> Service:org.apache.hadoop.yarn.server.resourcemanager.NodesListManager
> > is
> > > > stopped.
> > > > 2011-08-11 15:25:44,671 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > > Service:AMLivelinessMonitor is stopped.
> > > > 2011-08-11 15:25:44,671 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > >
> > >
> >
> Service:org.apache.hadoop.yarn.server.resourcemanager.rmcontainer.ContainerAllocationExpirer
> > > > is stopped.
> > > > 2011-08-11 15:25:44,671 INFO
> > > > org.apache.hadoop.yarn.util.AbstractLivelinessMonitor:
> > > AMLivelinessMonitor
> > > > thread interrupted
> > > > 2011-08-11 15:25:44,671 INFO
> > > > org.apache.hadoop.yarn.util.AbstractLivelinessMonitor:
> > > >
> > >
> >
> org.apache.hadoop.yarn.server.resourcemanager.rmcontainer.ContainerAllocationExpirer
> > > > thread interrupted
> > > > 2011-08-11 15:25:44,671 INFO
> > > org.apache.hadoop.yarn.event.AsyncDispatcher:
> > > > AsyncDispatcher thread interrupted
> > > > java.lang.InterruptedException
> > > > at
> > > >
> > >
> >
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:1961)
> > > > at
> > > >
> > >
> >
> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:1996)
> > > > at
> > > >
> > >
> >
> java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:399)
> > > > at
> > > >
> > >
> >
> org.apache.hadoop.yarn.event.AsyncDispatcher$1.run(AsyncDispatcher.java:69)
> > > > at java.lang.Thread.run(Thread.java:662)
> > > > 2011-08-11 15:25:44,672 INFO
> > > org.apache.hadoop.yarn.service.AbstractService:
> > > > Service:Dispatcher is stopped.
> > > > 2011-08-11 15:25:44,672 ERROR
> > > > org.apache.hadoop.yarn.server.resourcemanager.ResourceManager: Error
> > > > starting RM
> > > > org.apache.hadoop.yarn.YarnException: Failed to Start ResourceManager
> > > > at
> > > >
> > >
> >
> org.apache.hadoop.yarn.service.CompositeService.start(CompositeService.java:80)
> > > > at
> > > >
> > >
> >
> org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.start(ResourceManager.java:392)
> > > > at
> > > >
> > >
> >
> org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:491)
> > > > Caused by: java.lang.NoSuchMethodError:
> > > org.apache.avro.ipc.Server.start()V
> > > > at
> > > >
> > >
> >
> org.apache.hadoop.yarn.server.resourcemanager.ResourceTrackerService.start(ResourceTrackerService.java:128)
> > > > at
> > > >
> > >
> >
> org.apache.hadoop.yarn.service.CompositeService.start(CompositeService.java:68)
> > > > ... 2 more
> > > > 2011-08-11 15:25:44,673 INFO org.mortbay.log: Stopped
> > > > SelectChannelConnector@0.0.0.0:8088
> > > > 2011-08-11 15:25:44,775 INFO
> > > > org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping
> > > ResourceManager
> > > > metrics system...
> > > > 2011-08-11 15:25:44,776 INFO
> > > > org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping metrics
> > > source
> > > > QueueMetrics,q0=default
> > > > 2011-08-11 15:25:44,776 INFO
> > > > org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping metrics
> > > source
> > > > UgiMetrics
> > > > 2011-08-11 15:25:44,776 INFO
> > > > org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping metrics
> > > source
> > > > RpcActivityForPort8025
> > > > 2011-08-11 15:25:44,776 INFO
> > > > org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping metrics
> > > source
> > > > RpcDetailedActivityForPort8025
> > > > 2011-08-11 15:25:44,777 INFO
> > > > org.apache.hadoop.metrics2.impl.MetricsSystemImpl: ResourceManager
> > > metrics
> > > > system stopped.
> > > > 2011-08-11 15:25:44,777 INFO
> > > > org.apache.hadoop.metrics2.impl.MetricsSystemImpl: ResourceManager
> > > metrics
> > > > system shutdown complete.
> > > > =====================================================
> > > > On Thu, Aug 11, 2011 at 2:07 PM, Mahadev Konar <
> > mahadev@hortonworks.com>
> > > > wrote:
> > > >>
> > > >> Matei,
> > > >>  Are you sure you are using the latest common from trunk? The
> > > >> ClientCache was recently added within the last few weeks.
> > > >>  It looks like it's using some old version of hadoop-common.
> > > >>
> > > >>  Also, the slf4j errors should be fixed in the latest MR-279 branch.
> > > >>
> > > >> thanks
> > > >> mahadev
> > > >>
> > > >> On Aug 11, 2011, at 1:52 PM, Matei Zaharia wrote:
> > > >>
> > > >> > I get a similar error on Mac OS X. I've built YARN and extracted
> the
> > > >> > tarball to a directory, but when I run bin/yarn-daemon.sh start
> > > nodemanager,
> > > >> > it prints the following to its log (apologies for the long trace):
> > > >> >
> > > >> > log4j:WARN No appenders could be found for logger
> > > >> > (org.apache.hadoop.metrics2.impl.MetricsSystemImpl).
> > > >> > log4j:WARN Please initialize the log4j system properly.
> > > >> > log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig
> > > >> > for more info.
> > > >> > Exception in thread "main" org.apache.hadoop.yarn.YarnException:
> > > Failed
> > > >> > to Start org.apache.hadoop.yarn.server.nodemanager.NodeManager
> > > >> >       at
> > > >> >
> > >
> >
> org.apache.hadoop.yarn.service.CompositeService.start(CompositeService.java:80)
> > > >> >       at
> > > >> >
> > >
> >
> org.apache.hadoop.yarn.server.nodemanager.NodeManager.start(NodeManager.java:146)
> > > >> >       at
> > > >> >
> > >
> >
> org.apache.hadoop.yarn.server.nodemanager.NodeManager.main(NodeManager.java:191)
> > > >> > Caused by: org.apache.avro.AvroRuntimeException:
> > > >> > org.apache.hadoop.yarn.YarnException:
> > > >> > java.lang.reflect.InvocationTargetException
> > > >> >       at
> > > >> >
> > >
> >
> org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl.start(NodeStatusUpdaterImpl.java:140)
> > > >> >       at
> > > >> >
> > >
> >
> org.apache.hadoop.yarn.service.CompositeService.start(CompositeService.java:68)
> > > >> >       ... 2 more
> > > >> > Caused by: org.apache.hadoop.yarn.YarnException:
> > > >> > java.lang.reflect.InvocationTargetException
> > > >> >       at
> > > >> >
> > >
> >
> org.apache.hadoop.yarn.factories.impl.pb.RpcClientFactoryPBImpl.getClient(RpcClientFactoryPBImpl.java:70)
> > > >> >       at
> > > >> >
> > >
> >
> org.apache.hadoop.yarn.ipc.HadoopYarnProtoRPC.getProxy(HadoopYarnProtoRPC.java:35)
> > > >> >       at
> > > >> >
> > >
> >
> org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl.getRMClient(NodeStatusUpdaterImpl.java:158)
> > > >> >       at
> > > >> >
> > >
> >
> org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl.registerWithRM(NodeStatusUpdaterImpl.java:163)
> > > >> >       at
> > > >> >
> > >
> >
> org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl.start(NodeStatusUpdaterImpl.java:136)
> > > >> >       ... 3 more
> > > >> > Caused by: java.lang.reflect.InvocationTargetException
> > > >> >       at
> > sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> > > >> > Method)
> > > >> >       at
> > > >> >
> > >
> >
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> > > >> >       at
> > > >> >
> > >
> >
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> > > >> >       at
> > > java.lang.reflect.Constructor.newInstance(Constructor.java:513)
> > > >> >       at
> > > >> >
> > >
> >
> org.apache.hadoop.yarn.factories.impl.pb.RpcClientFactoryPBImpl.getClient(RpcClientFactoryPBImpl.java:67)
> > > >> >       ... 7 more
> > > >> > Caused by: java.lang.NoClassDefFoundError:
> > > >> > org/apache/hadoop/ipc/ClientCache
> > > >> >       at
> > > >> >
> > >
> >
> org.apache.hadoop.yarn.ipc.ProtoOverHadoopRpcEngine.<clinit>(ProtoOverHadoopRpcEngine.java:63)
> > > >> >       at java.lang.Class.forName0(Native Method)
> > > >> >       at java.lang.Class.forName(Class.java:247)
> > > >> >       at
> > > >> >
> > >
> >
> org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1108)
> > > >> >       at
> > > >> >
> > org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1160)
> > > >> >       at org.apache.hadoop.ipc.RPC.getProtocolEngine(RPC.java:94)
> > > >> >       at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:422)
> > > >> >       at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:368)
> > > >> >       at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:333)
> > > >> >       at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:461)
> > > >> >       at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:442)
> > > >> >       at
> > > >> >
> > >
> >
> org.apache.hadoop.yarn.server.api.impl.pb.client.ResourceTrackerPBClientImpl.<init>(ResourceTrackerPBClientImpl.java:32)
> > > >> >       ... 12 more
> > > >> > Caused by: java.lang.ClassNotFoundException:
> > > >> > org.apache.hadoop.ipc.ClientCache
> > > >> >       at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> > > >> >       at java.security.AccessController.doPrivileged(Native
> Method)
> > > >> >       at
> java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> > > >> >       at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > > >> >       at
> > sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> > > >> >       at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > > >> >       ... 24 more
> > > >> >
> > > >> > In addition, if I try to start resourcemanager instead, it logs:
> > > >> >
> > > >> > SLF4J: Class path contains multiple SLF4J bindings.
> > > >> > SLF4J: Found binding in
> > > >> >
> > >
> >
> [jar:file:/Users/matei/workspace/MR-279/common/build/ivy/lib/Hadoop-Common/common/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > > >> > SLF4J: Found binding in
> > > >> >
> > >
> >
> [jar:file:/Users/matei/workspace/MR-279-deploy/hadoop-mapreduce-1.0-SNAPSHOT/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > > >> > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> > > >> > explanation.
> > > >> > log4j:WARN No appenders could be found for logger
> > > >> > (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
> > > >> > log4j:WARN Please initialize the log4j system properly.
> > > >> > log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig
> > > >> > for more info.
> > > >> > Exception in thread "main" java.lang.IllegalStateException: For
> this
> > > >> > operation, current State must be STARTED instead of INITED
> > > >> >        at
> > > >> >
> > >
> >
> org.apache.hadoop.yarn.service.AbstractService.ensureCurrentState(AbstractService.java:101)
> > > >> >        at
> > > >> >
> > >
> >
> org.apache.hadoop.yarn.service.AbstractService.stop(AbstractService.java:69)
> > > >> >        at
> > > >> >
> > >
> >
> org.apache.hadoop.yarn.server.resourcemanager.amlauncher.ApplicationMasterLauncher.stop(ApplicationMasterLauncher.java:90)
> > > >> >        at
> > > >> >
> > >
> >
> org.apache.hadoop.yarn.service.CompositeService.stop(CompositeService.java:89)
> > > >> >        at
> > > >> >
> > >
> >
> org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.stop(ResourceManager.java:423)
> > > >> >        at
> > > >> >
> > >
> >
> org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:493)
> > > >> >
> > > >> > I don't know where the multiple SLF4J versions are coming from
> > because
> > > I
> > > >> > checked out common, hdfs and mapreduce at the same time.
> > > >> >
> > > >> > Matei
> > > >> >
> > > >> >
> > > >> >
> > > >> >
> > > >> > On Aug 11, 2011, at 2:01 AM, Vinod KV wrote:
> > > >> >
> > > >> >>
> > > >> >>
> > > >> >> Report YARN_* and HADOOP_* variables that you've set before
> > starting
> > > >> >> the daemons.
> > > >> >>
> > > >> >> Also run 'bin/yarn classpath' and post the output of that command
> > > >> >> too. If the output doesn't contain
> > > >> >> yarn-server-resourcemanager-1.0-SNAPSHOT.jar, you are missing
> > > >> >> something.
> > > >> >>
> > > >> >> +Vinod
> > > >> >>
> > > >> >>
> > > >> >> On Thursday 11 August 2011 01:57 AM, Haoyuan Li wrote:
> > > >> >>> Hi,
> > > >> >>>
> > > >> >>> When I ran ResourceManager or NodeManager following the steps here
> > > >> >>> http://svn.apache.org/repos/asf/hadoop/common/branches/MR-279/mapreduce/INSTALL,
> > > >> >>> it threw a "main class can't be found" exception... I attached the
> > > >> >>> shell output here. Any help will be appreciated!
> > > >> >>>
> > > >> >>> Thanks,
> > > >> >>>
> > > >> >>> Haoyuan
> > > >> >>>
> > > >> >>> haoyuan@hya:~/hadoop/hadoop-mapreduce-1.0-SNAPSHOT$
> > > >> >>> ./bin/yarn-daemon.sh
> > > >> >>> start resourcemanager
> > > >> >>> starting resourcemanager, logging to
> > > >> >>>
> > > >> >>>
> > >
> >
> /home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../logs/yarn-haoyuan-resourcemanager-hya.out
> > > >> >>> /usr/lib/jvm/java-6-sun/bin/java -Dproc_resourcemanager
> -Xmx1000m
> > > >> >>>
> > > >> >>>
> > >
> >
> -Dhadoop.log.dir=/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../logs
> > > >> >>>
> > > >> >>>
> > >
> >
> -Dyarn.log.dir=/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../logs
> > > >> >>> -Dhadoop.log.file=yarn-haoyuan-resourcemanager-hya.log
> > > >> >>> -Dyarn.log.file=yarn-haoyuan-resourcemanager-hya.log
> > > -Dyarn.home.dir=
> > > >> >>> -Dyarn.id.str=haoyuan -Dhadoop.root.logger=INFO,DRFA
> > > >> >>> -Dyarn.root.logger=INFO,DRFA
> -Dyarn.policy.file=hadoop-policy.xml
> > > >> >>>
> > > >> >>>
> > >
> >
> -Dhadoop.log.dir=/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../logs
> > > >> >>>
> > > >> >>>
> > >
> >
> -Dyarn.log.dir=/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../logs
> > > >> >>> -Dhadoop.log.file=yarn-haoyuan-resourcemanager-hya.log
> > > >> >>> -Dyarn.log.file=yarn-haoyuan-resourcemanager-hya.log
> > > >> >>>
> > > >> >>>
> > >
> -Dyarn.home.dir=/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/..
> > > >> >>> -Dhadoop.root.logger=INFO,DRFA -Dyarn.root.logger=INFO,DRFA
> > > -classpath
> > > >> >>>
> > > >> >>>
> > >
> >
> /home/haoyuan/hadoop/conf:/home/haoyuan/hadoop/conf:/usr/lib/jvm/java-6-sun/lib/tools.jar:/home/haoyuan/hadoop/trunk/common/hadoop-*.jar:/home/haoyuan/hadoop/trunk/common/lib/*.jar:/home/haoyuan/hadoop/trunk/common/share/hadoop/common/*.jar:/home/haoyuan/hadoop/trunk/common/share/hadoop/common/lib/*.jar:/home/haoyuan/hadoop/trunk/common/share/hadoop/hdfs/*.jar:/home/haoyuan/hadoop/trunk/hdfs/hadoop-*.jar:/home/haoyuan/hadoop/trunk/hdfs/lib/*.jar:/home/haoyuan/hadoop/trunk/hdfs/build/classes:/home/haoyuan/hadoop/trunk/hdfs/hadoop-*.jar:/home/haoyuan/hadoop/trunk/hdfs/lib/*.jar:/home/haoyuan/hadoop/trunk/mapreduce/build/classes:/home/haoyuan/hadoop/trunk/mapreduce/build:/home/haoyuan/hadoop/trunk/mapreduce/build/test/classes:/home/haoyuan/hadoop/trunk/mapreduce/build/tools:/home/haoyuan/hadoop/trunk/mapreduce/lib/*.jar:/home/haoyuan/hadoop/trunk/mapreduce/*.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/aopalliance-1.0.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/asm-3.2.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/aspectjrt-1.6.5.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/avro-1.3.2.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/avro-1.4.1.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/clover-3.0.2.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/commons-beanutils-1.7.0.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/commons-beanutils-core-1.8.0.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/commons-cli-1.2.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/commons-codec-1.4.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/commons-collections-3.2.1.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/commons-configuration-1.6.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/commons-digester-1.8.jar:/home/haoyuan/hadoop/hadoop-mapre
duce-1.0-SNAPSHOT/bin/../lib/commons-httpclient-3.1.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/commons-lang-2.5.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/commons-logging-1.0.4.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/commons-logging-api-1.0.4.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/commons-math-2.1.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/commons-net-1.4.1.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/core-3.1.1.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/guava-r09.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/guice-2.0.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/guice-servlet-2.0.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/hadoop-annotations-0.23.0-SNAPSHOT.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/jackson-core-asl-1.4.2.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/jackson-mapper-asl-1.4.2.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/jdiff-1.0.9.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/jets3t-0.6.1.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/jetty-6.1.26.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/jetty-util-6.1.26.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/jsp-api-2.1.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/junit-4.8.2.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/kfs-0.3.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/log4j-1.2.15.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/netty-3.2.3.Final.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/oro-2.0.8.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/paranamer-2.2.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/
bin/../lib/paranamer-ant-2.2.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/paranamer-generator-2.2.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/protobuf-java-2.4.0a.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/qdox-1.10.1.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/servlet-api-2.5.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/slf4j-api-1.6.1.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/slf4j-log4j12-1.6.1.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../lib/xmlenc-0.52.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../modules/hadoop-mapreduce-client-app-1.0-SNAPSHOT.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../modules/hadoop-mapreduce-client-common-1.0-SNAPSHOT.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../modules/hadoop-mapreduce-client-core-1.0-SNAPSHOT.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../modules/hadoop-mapreduce-client-hs-1.0-SNAPSHOT.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../modules/hadoop-mapreduce-client-jobclient-1.0-SNAPSHOT.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../modules/hadoop-mapreduce-client-shuffle-1.0-SNAPSHOT.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../modules/yarn-api-1.0-SNAPSHOT.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../modules/yarn-common-1.0-SNAPSHOT.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../modules/yarn-server-common-1.0-SNAPSHOT.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../modules/yarn-server-nodemanager-1.0-SNAPSHOT.jar:/home/haoyuan/hadoop/hadoop-mapreduce-1.0-SNAPSHOT/bin/../modules/yarn-server-resourcemanager-1.0-SNAPSHOT.jar:/home/haoyuan/hadoop/conf/rm-config/log4j.properties
> > > >> >>> org.apache.hadoop.yarn.server.resourcemanager.ResourceManager
> > > >> >>> Exception in thread "main" java.lang.NoClassDefFoundError:
> > > >> >>> org/apache/hadoop/conf/Configuration
> > > >> >>> Caused by: java.lang.ClassNotFoundException:
> > > >> >>> org.apache.hadoop.conf.Configuration
> > > >> >>> at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> > > >> >>> at java.security.AccessController.doPrivileged(Native Method)
> > > >> >>> at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> > > >> >>> at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > > >> >>> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> > > >> >>> at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > > >> >>> Could not find the main class:
> > > >> >>> org.apache.hadoop.yarn.server.resourcemanager.ResourceManager. Program will exit.
