sqoop-user mailing list archives

From: Madhanmohan Savadamuthu <ermad...@gmail.com>
Subject: Re: Sqoop 1.99.2 + Cloudera Hadoop 200 error
Date: Tue, 09 Jul 2013 15:54:30 GMT
1) When I configure the Job Tracker port as 8021 in /etc/hadoop/conf/mapred-site.xml:
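
For reference, the property being set looks roughly like this in mapred-site.xml (hostname as it appears in the stack traces below):

  <property>
    <name>mapred.job.tracker</name>
    <value>machine1.domain.com:8021</value>
  </property>
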
Submission details
Job id: 7
Status: FAILURE_ON_SUBMIT
Creation date: 2013-07-09 21:02:32 IST
Last update date: 2013-07-09 21:02:32 IST
Exception: java.lang.ClassCastException: org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$GetFileInfoRequestProto cannot be cast to com.google.protobuf.Message
Stack trace: java.lang.ClassCastException: org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$GetFileInfoRequestProto cannot be cast to com.google.protobuf.Message
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.constructRpcRequest(ProtobufRpcEngine.java:148)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:193)
    at com.sun.proxy.$Proxy15.getFileInfo(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
    at com.sun.proxy.$Proxy15.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:628)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1545)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:805)
    at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1367)
    at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:109)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:952)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:946)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:946)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:566)
    at org.apache.sqoop.submission.mapreduce.MapreduceSubmissionEngine.submit(MapreduceSubmissionEngine.java:265)
    at org.apache.sqoop.framework.FrameworkManager.submit(FrameworkManager.java:467)
    at org.apache.sqoop.handler.SubmissionRequestHandler.submissionSubmit(SubmissionRequestHandler.java:112)
    at org.apache.sqoop.handler.SubmissionRequestHandler.handleActionEvent(SubmissionRequestHandler.java:98)
    at org.apache.sqoop.handler.SubmissionRequestHandler.handleEvent(SubmissionRequestHandler.java:68)
    at org.apache.sqoop.server.v1.SubmissionServlet.handlePostRequest(SubmissionServlet.java:44)
    at org.apache.sqoop.server.SqoopProtocolServlet.doPost(SqoopProtocolServlet.java:63)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:637)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:861)
    at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:606)
    at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
    at java.lang.Thread.run(Thread.java:722)


2) When I configure the Job Tracker port as 9001/50300 in /etc/hadoop/conf/mapred-site.xml:
Job id: 7
Status: FAILURE_ON_SUBMIT
Creation date: 2013-07-09 21:21:14 IST
Last update date: 2013-07-09 21:21:14 IST
Exception: java.net.ConnectException: Call From machine1.domain.com/111.11.1.1 to machine1.domain.com:9001 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Stack trace: java.net.ConnectException: Call From machine1.domain.com/111.11.1.1 to machine1.domain.com:9001 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:779)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:726)
    at org.apache.hadoop.ipc.Client.call(Client.java:1229)
    at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:225)
    at org.apache.hadoop.mapred.$Proxy11.getStagingAreaDir(Unknown Source)
    at org.apache.hadoop.mapred.JobClient.getStagingAreaDir(JobClient.java:1325)
    at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:102)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:952)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:946)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:946)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:566)
    at org.apache.sqoop.submission.mapreduce.MapreduceSubmissionEngine.submit(MapreduceSubmissionEngine.java:265)
    at org.apache.sqoop.framework.FrameworkManager.submit(FrameworkManager.java:467)
    at org.apache.sqoop.handler.SubmissionRequestHandler.submissionSubmit(SubmissionRequestHandler.java:112)
    at org.apache.sqoop.handler.SubmissionRequestHandler.handleActionEvent(SubmissionRequestHandler.java:98)
    at org.apache.sqoop.handler.SubmissionRequestHandler.handleEvent(SubmissionRequestHandler.java:68)
    at org.apache.sqoop.server.v1.SubmissionServlet.handlePostRequest(SubmissionServlet.java:44)
    at org.apache.sqoop.server.SqoopProtocolServlet.doPost(SqoopProtocolServlet.java:63)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:637)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:861)
    at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:606)
    at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
    at java.lang.Thread.run(Thread.java:722)
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:692)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:207)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:525)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:489)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:499)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:593)
    at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:241)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1278)
    at org.apache.hadoop.ipc.Client.call(Client.java:1196)
    ... 32 more
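
In case it helps, a quick way to check whether anything is actually listening on the configured Job Tracker port (standard Linux commands, run on the node itself):

  sudo netstat -tlnp | grep -E '8021|9001'
  telnet machine1.domain.com 9001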

Let me know if more information is required.

Regards,
Madhan


On Tue, Jul 9, 2013 at 8:46 PM, Jarek Jarcec Cecho <jarcec@apache.org> wrote:

> Hi Madhanmohan,
> would you mind sharing the entire exception that you are getting, including
> the stack trace? Please note that you might need to enable verbose mode on
> the client side in order to show the entire exception stack trace. You can
> do that with the following command:
>
>   set option --name verbose --value true
>
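> (The command above is entered inside the Sqoop2 interactive client shell,
> which on 1.99.2 is typically started with bin/sqoop.sh client.)
>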
> Jarcec
>
> On Tue, Jul 09, 2013 at 07:26:28PM +0530, Madhanmohan Savadamuthu wrote:
> > After updating the Job Tracker configuration, I am getting a
> > connection-refused error when I submit a job.
> >
> > 1) I created a connection.
> > 2) Created an import job.
> > 3) Tried to submit the job. I am getting a connection-refused error
> > message.
> >
> > Is it due to an invalid Job Tracker port number configuration?
> >
> > Regards,
> > Madhan
> > On Sat, Jul 6, 2013 at 12:34 AM, Mengwei Ding <mengwei.ding@cloudera.com> wrote:
> >
> > > Hi Madhan,
> > >
> > > It's really great to hear that the problem is solved. Enjoy Sqoop2. If
> > > you have further questions, I will be more than happy to answer them.
> > >
> > > Have a nice day!
> > >
> > > Best,
> > > Mengwei
> > >
> > >
> > > On Fri, Jul 5, 2013 at 11:42 AM, Madhanmohan Savadamuthu <
> > > ermadhan@gmail.com> wrote:
> > >
> > >> Mengwei,
> > >>
> > >> The issue is solved. The MapReduce configuration file had an invalid
> > >> value that was causing the problem.
> > >>
> > >> *File Name:* /etc/hadoop/conf/mapred-site.xml
> > >> *Parameter Name:* mapred.job.tracker
> > >> *Original Value:* neededForHive:999999
> > >> *Modified Value:* <machinename>:9001
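> > >>
> > >> As applied in mapred-site.xml, the corrected property would look roughly
> > >> like this (machine name left as a placeholder):
> > >>
> > >>   <property>
> > >>     <name>mapred.job.tracker</name>
> > >>     <value><machinename>:9001</value>
> > >>   </property>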
> > >>
> > >> After this change, things are working fine now. Note that I have
> > >> followed the suggestions provided by Mengwei in this thread.
> > >>
> > >> Regards,
> > >> Madhan
> > >>
> > >> On Thu, Jul 4, 2013 at 11:08 PM, Mengwei Ding <mengwei.ding@cloudera.com> wrote:
> > >>
> > >>> Ok, Madhan, why not.
> > >>>
> > >>> Could you kindly provide your availability and preferred communication
> > >>> tools? I will be more than happy to help you out with this.
> > >>>
> > >>> Best,
> > >>> Mengwei
> > >>>
> > >>>
> > >>> On Thu, Jul 4, 2013 at 1:15 AM, Madhanmohan Savadamuthu <
> > >>> ermadhan@gmail.com> wrote:
> > >>>
> > >>>> After making the changes in catalina.properties as well, the same
> > >>>> issue is occurring.
> > >>>>
> > >>>> Is there any possibility of an interactive discussion on this issue?
> > >>>>
> > >>>> Regards,
> > >>>> Madhan
> > >>>>
> > >>>>  On Wed, Jul 3, 2013 at 11:05 PM, Mengwei Ding <
> > >>>> mengwei.ding@cloudera.com> wrote:
> > >>>>
> > >>>>> Thank you for your prompt response, sir. Please don't worry, I can
> > >>>>> help you out with this until your problem is solved.
> > >>>>>
> > >>>>> Well, let's try out our new method of adding dependency jar files,
> > >>>>> and forget about the addtowar.sh script.
> > >>>>>
> > >>>>> Please follow these instructions:
> > >>>>>
> > >>>>>  "
> > >>>>> Installing Dependencies
> > >>>>>
> > >>>>> Hadoop libraries must be available on the node where you are planning
> > >>>>> to run the Sqoop server, with proper configuration for the major
> > >>>>> services - NameNode and either JobTracker or ResourceManager,
> > >>>>> depending on whether you are running Hadoop 1 or 2. There is no need
> > >>>>> to run any Hadoop service on the same node as the Sqoop server; just
> > >>>>> the libraries and configuration must be available.
> > >>>>>
> > >>>>> The path to the Hadoop libraries is stored in the file
> > >>>>> catalina.properties inside the directory server/conf. You need to
> > >>>>> change the property called common.loader to contain all directories
> > >>>>> with your Hadoop libraries. The default expected locations are
> > >>>>> /usr/lib/hadoop and /usr/lib/hadoop/lib/. Please check out the
> > >>>>> comments in the file for a further description of how to configure
> > >>>>> different locations.
> > >>>>>
> > >>>>> Lastly, you might need to install JDBC drivers that are not bundled
> > >>>>> with Sqoop because of incompatible licenses. You can add any arbitrary
> > >>>>> Java jar file to the Sqoop server by copying it into the lib/
> > >>>>> directory. You can create this directory if it does not exist already.
> > >>>>> "
> > >>>>>
> > >>>>> I can give you my configuration as an example. So in my
> > >>>>> catalina.properties file, I have the following line:
> > >>>>>
> > >>>>> *common.loader=${catalina.base}/lib,${catalina.base}/lib/*.jar,${catalina.home}/lib,${catalina.home}/lib/*.jar,${catalina.home}/../lib/*.jar,/usr/lib/hadoop/client-0.20/*.jar,/home/mengweid/Downloads/mysql-connector-java-5.1.25-bin.jar*
> > >>>>>
> > >>>>> The */usr/lib/hadoop/client-0.20/*.jar* entry is used to include all
> > >>>>> hadoop-related jars, and *mysql-connector-java-5.1.25-bin.jar* is
> > >>>>> used for the JDBC driver.
> > >>>>>
> > >>>>> Please try this, and let me know whether it works. Thank you.
> > >>>>>
> > >>>>> Best,
> > >>>>> Mengwei
> > >>>>>
> > >>>>>
> > >>>>> On Wed, Jul 3, 2013 at 9:18 AM, Madhanmohan Savadamuthu <
> > >>>>> ermadhan@gmail.com> wrote:
> > >>>>>
> > >>>>>> I did the deployment as suggested in the thread below. I am not able
> > >>>>>> to successfully use Sqoop2. I am attaching the service logs for your
> > >>>>>> reference.
> > >>>>>>
> > >>>>>> I made sure that the exact same set of JAR files is in the
> > >>>>>> appropriate location, and I also deleted the sqoop folder before
> > >>>>>> starting the Sqoop server.
> > >>>>>>
> > >>>>>> *Error Message:*
> > >>>>>> Exception has occurred during processing command
> > >>>>>> Exception: com.sun.jersey.api.client.UniformInterfaceException
> > >>>>>> Message: GET http://<ipaddress>:12013/sqoop/version returned a
> > >>>>>> response status of 404 Not Found
> > >>>>>>
> > >>>>>> Regards,
> > >>>>>> Madhan
> > >>>>>>
> > >>>>>>  On Wed, Jul 3, 2013 at 7:30 PM, Mengwei Ding <
> > >>>>>> mengwei.ding@cloudera.com> wrote:
> > >>>>>>
> > >>>>>>> Hi Madhanmohan,
> > >>>>>>>
> > >>>>>>> Thank you for providing all this detailed information. It helps a
> > >>>>>>> lot in diagnosing the problem.
> > >>>>>>>
> > >>>>>>> First, the addtowar.sh script is not good enough for every
> > >>>>>>> situation; we apologize for that. We have already figured out a new
> > >>>>>>> way to add dependency libraries, which will come out with the next
> > >>>>>>> version of Sqoop2.
> > >>>>>>>
> > >>>>>>> Currently, it seems that hadoop-core.jar has not been added. I can
> > >>>>>>> show you all the libraries in my webapps/sqoop/WEB-INF/lib folder;
> > >>>>>>> please check below:
> > >>>>>>> avro-1.7.4.jar
> > >>>>>>> commons-cli-1.2.jar
> > >>>>>>> commons-configuration-1.6.jar
> > >>>>>>> commons-dbcp-1.4.jar
> > >>>>>>> commons-lang-2.5.jar
> > >>>>>>> commons-logging-1.1.1.jar
> > >>>>>>> commons-pool-1.5.4.jar
> > >>>>>>> derby-10.8.2.2.jar
> > >>>>>>> guava-11.0.2.jar
> > >>>>>>> hadoop-auth-2.0.0-cdh4.3.0.jar
> > >>>>>>> hadoop-common-2.0.0-cdh4.3.0.jar
> > >>>>>>> *hadoop-core-2.0.0-mr1-cdh4.3.0.jar*
> > >>>>>>> hadoop-hdfs-2.0.0-cdh4.3.0.jar
> > >>>>>>> hadoop-mapreduce-client-app-2.0.0-cdh4.3.0.jar
> > >>>>>>> hadoop-mapreduce-client-common-2.0.0-cdh4.3.0.jar
> > >>>>>>> hadoop-mapreduce-client-core-2.0.0-cdh4.3.0.jar
> > >>>>>>> hadoop-mapreduce-client-jobclient-2.0.0-cdh4.3.0.jar
> > >>>>>>> hadoop-yarn-api-2.0.0-cdh4.3.0.jar
> > >>>>>>> hadoop-yarn-common-2.0.0-cdh4.3.0.jar
> > >>>>>>> jackson-core-asl-1.8.8.jar
> > >>>>>>> jackson-mapper-asl-1.8.8.jar
> > >>>>>>> json-simple-1.1.jar
> > >>>>>>> log4j-1.2.16.jar
> > >>>>>>> mysql-connector-java-5.1.25-bin.jar
> > >>>>>>> protobuf-java-2.4.0a.jar
> > >>>>>>> slf4j-api-1.6.1.jar
> > >>>>>>> slf4j-log4j12-1.6.1.jar
> > >>>>>>> sqoop-common-1.99.2.jar
> > >>>>>>> sqoop-connector-generic-jdbc-1.99.2.jar
> > >>>>>>> sqoop-core-1.99.2.jar
> > >>>>>>> sqoop-execution-mapreduce-1.99.2-hadoop200.jar
> > >>>>>>> sqoop-repository-derby-1.99.2.jar
> > >>>>>>> sqoop-spi-1.99.2.jar
> > >>>>>>> sqoop-submission-mapreduce-1.99.2-hadoop200.jar
> > >>>>>>>
> > >>>>>>> I have the same Hadoop and Sqoop2 installation directories as you,
> > >>>>>>> and I am running a pseudo-distributed cluster in a single Ubuntu
> > >>>>>>> virtual machine.
> > >>>>>>>
> > >>>>>>> So now you could try to add hadoop-core.jar manually and then see
> > >>>>>>> whether the Sqoop2 server runs. Please follow these steps:
> > >>>>>>>
> > >>>>>>> *./bin/addtowar.sh -jars /usr/lib/hadoop-0.20-mapreduce/hadoop-core-2.0.0-mr1-cdh4.3.0.jar*
> > >>>>>>>
> > >>>>>>> Please find hadoop-core.jar on your own machine; it should be in a
> > >>>>>>> similar place. If you still have problems, please let me know.
> > >>>>>>>
> > >>>>>>>
> > >>>>>>> The reason why it's better to remove the "sqoop" folder is to clear
> > >>>>>>> the cached old servlet: Tomcat cannot always extract the sqoop.war
> > >>>>>>> file immediately after you add a dependency library to it. By
> > >>>>>>> removing the sqoop folder, Tomcat is forced to re-extract sqoop.war,
> > >>>>>>> keeping the sqoop folder up to date. This way you can know whether
> > >>>>>>> you have correctly set up the dependency libraries. Does this
> > >>>>>>> explanation help?
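> > >>>>>>>
> > >>>>>>> As a concrete sketch (run from the Sqoop2 installation directory;
> > >>>>>>> exact paths may differ in your layout):
> > >>>>>>>
> > >>>>>>> bin/sqoop.sh server stop
> > >>>>>>> rm -rf server/webapps/sqoop
> > >>>>>>> bin/sqoop.sh server start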
> > >>>>>>>
> > >>>>>>> Best,
> > >>>>>>> Mengwei
> > >>>>>>>
> > >>>>>>>
> > >>>>>>> On Tue, Jul 2, 2013 at 9:19 PM, Madhanmohan Savadamuthu <ermadhan@gmail.com> wrote:
> > >>>>>>>
> > >>>>>>>> Hi Mengwei,
> > >>>>>>>>
> > >>>>>>>> Following are the details:
> > >>>>>>>>
> > >>>>>>>> Hadoop Version: Hadoop 2.0.0-cdh4.2.1
> > >>>>>>>> Linux Version: Linux version 2.6.32-358.2.1.el6.x86_64
> > >>>>>>>> (mockbuild@x86-023.build.eng.bos.redhat.com) (gcc version 4.4.7
> > >>>>>>>> 20120313 (Red Hat 4.4.7-3) (GCC)) #1 SMP Wed Feb 20 12:17:37 EST 2013
> > >>>>>>>> Hadoop Installation Location: /usr/lib/hadoop
> > >>>>>>>> Sqoop2 Installation Location: /usr/lib/sqoop2
> > >>>>>>>> Sqoop2 Dependency Configuration Command Used: ./bin/addtowar.sh -hadoop-auto
> > >>>>>>>> Files in the WEB-INF/lib folder:
> > >>>>>>>>
> > >>>>>>>> avro-1.7.3.jar
> > >>>>>>>> commons-cli-1.2.jar
> > >>>>>>>> commons-configuration-1.6.jar
> > >>>>>>>> commons-dbcp-1.4.jar
> > >>>>>>>> commons-lang-2.5.jar
> > >>>>>>>> commons-logging-1.1.1.jar
> > >>>>>>>> commons-pool-1.5.4.jar
> > >>>>>>>> derby-10.8.2.2.jar
> > >>>>>>>> guava-11.0.2.jar
> > >>>>>>>> hadoop-auth-2.0.0-cdh4.2.1.jar
> > >>>>>>>> hadoop-common-2.0.0-cdh4.2.1-tests.jar
> > >>>>>>>> hadoop-hdfs-2.0.0-cdh4.2.1.jar
> > >>>>>>>> hadoop-mapreduce-client-app-2.0.0-cdh4.2.1.jar
> > >>>>>>>> hadoop-mapreduce-client-common-2.0.0-cdh4.2.1.jar
> > >>>>>>>> hadoop-mapreduce-client-core-2.0.0-cdh4.2.1.jar
> > >>>>>>>> hadoop-mapreduce-client-jobclient-2.0.0-cdh4.2.1.jar
> > >>>>>>>> hadoop-yarn-api-2.0.0-cdh4.2.1.jar
> > >>>>>>>> hadoop-yarn-common-2.0.0-cdh4.2.1.jar
> > >>>>>>>> jackson-core-asl-1.8.8.jar
> > >>>>>>>> jackson-mapper-asl-1.8.8.jar
> > >>>>>>>> json-simple-1.1.jar
> > >>>>>>>> log4j-1.2.16.jar
> > >>>>>>>> mysql-connector-java-5.1.25-bin.jar
> > >>>>>>>> protobuf-java-2.4.0a.jar
> > >>>>>>>> slf4j-api-1.6.1.jar
> > >>>>>>>> slf4j-log4j12-1.6.1.jar
> > >>>>>>>> sqoop-common-1.99.2.jar
> > >>>>>>>> sqoop-connector-generic-jdbc-1.99.2.jar
> > >>>>>>>> sqoop-core-1.99.2.jar
> > >>>>>>>> sqoop-execution-mapreduce-1.99.2-hadoop200.jar
> > >>>>>>>> sqoop-repository-derby-1.99.2.jar
> > >>>>>>>> sqoop-spi-1.99.2.jar
> > >>>>>>>> sqoop-submission-mapreduce-1.99.2-hadoop200.jar
> > >>>>>>>>
> > >>>>>>>> Can you elaborate more on the deletion of the 'sqoop' folder?
> > >>>>>>>>
> > >>>>>>>> Regards,
> > >>>>>>>> Madhanmohan S
> > >>>>>>>>
> > >>>>>>>> On Tue, Jul 2, 2013 at 10:50 PM, Mengwei Ding <mengwei.ding@cloudera.com> wrote:
> > >>>>>>>>
> > >>>>>>>>> Hi Madhanmohan,
> > >>>>>>>>>
> > >>>>>>>>> Thank you for your interest in Sqoop2. It's really great to hear
> > >>>>>>>>> this, and thank you for providing details for your question. Let
> > >>>>>>>>> me help you out with this.
> > >>>>>>>>>
> > >>>>>>>>> The main reason for your situation is that the Sqoop servlet has
> > >>>>>>>>> not started successfully, so the client gets connection refused.
> > >>>>>>>>> I have gone through your attachments. The reason for the servlet
> > >>>>>>>>> failure is that your Hadoop dependency libraries have not been
> > >>>>>>>>> configured correctly. Could you kindly answer the following
> > >>>>>>>>> questions, so that I can help you further?
> > >>>>>>>>>
> > >>>>>>>>> 1. Your Hadoop version and installation location? Your operating
> > >>>>>>>>> system?
> > >>>>>>>>> 2. The details of how you configured the dependency libraries for
> > >>>>>>>>> Sqoop?
> > >>>>>>>>> 3. Could you kindly go to
> > >>>>>>>>> [sqoop_install_dir]/server/server/webapps/sqoop/WEB-INF/lib and
> > >>>>>>>>> list all the jar files?
> > >>>>>>>>> the jar files?
> > >>>>>>>>>
> > >>>>>>>>> PS: remember to delete the sqoop folder under
> > >>>>>>>>> server/server/webapps every time after you configure the
> > >>>>>>>>> dependency libraries.
> > >>>>>>>>>
> > >>>>>>>>> Best,
> > >>>>>>>>> Mengwei
> > >>>>>>>>>
> > >>>>>>>>>
> > >>>>>>>>> On Tue, Jul 2, 2013 at 10:05 AM, Madhanmohan Savadamuthu <ermadhan@gmail.com> wrote:
> > >>>>>>>>>
> > >>>>>>>>>> I have set up Sqoop 1.99.2 as mentioned in the sqoop.apache.org
> > >>>>>>>>>> instructions. When I try the show version --all command, the
> > >>>>>>>>>> following error comes up.
> > >>>>>>>>>>
> > >>>>>>>>>> Sqoop 1.99.2 revision 3e31b7d3eefb3696d4970704364dea05a9ea2a59
> > >>>>>>>>>> Compiled by homeuser on Mon Apr 15 20:50:13 PDT 2013
> > >>>>>>>>>> Exception has occurred during processing command
> > >>>>>>>>>> Exception: com.sun.jersey.api.client.ClientHandlerException
> > >>>>>>>>>> Message: java.net.ConnectException: Connection refused
> > >>>>>>>>>>
> > >>>>>>>>>> All log files are attached for reference.
> > >>>>>>>>>>
> > >>>>>>>>>>
> > >>>>>>>>>>
> > >>>>>>>>>> --
> > >>>>>>>>>> Thanks and Regards,
> > >>>>>>>>>> Madhanmohan S
> > >>>>>>>>>>
> > >>>>>>>>>
> > >>>>>>>>>
> > >>>>>>>>
> > >>>>>>>>
> > >>>>>>>> --
> > >>>>>>>> Thanks and Regards,
> > >>>>>>>> Madhanmohan S
> > >>>>>>>>
> > >>>>>>>
> > >>>>>>>
> > >>>>>>
> > >>>>>>
> > >>>>>> --
> > >>>>>> Thanks and Regards,
> > >>>>>> Madhanmohan S
> > >>>>>>
> > >>>>>
> > >>>>>
> > >>>>
> > >>>>
> > >>>> --
> > >>>> Thanks and Regards,
> > >>>> Madhanmohan S
> > >>>>
> > >>>
> > >>>
> > >>
> > >>
> > >> --
> > >> Thanks and Regards,
> > >> Madhanmohan S
> > >>
> > >
> > >
> >
> >
> > --
> > Thanks and Regards,
> > Madhanmohan S
>



-- 
Thanks and Regards,
Madhanmohan S
