sqoop-user mailing list archives

From Abraham Elmahrek <...@cloudera.com>
Subject Re: Import data to HDFS using Sqoop2
Date Wed, 04 Sep 2013 01:23:28 GMT
What database are you using? Are table names or schema names case sensitive
in your database? Sqoop quotes all names, so case sensitivity makes a
difference here.
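For example, if the backend is PostgreSQL (the org.postgresql frames in your
stack trace hint at it), unquoted identifiers are folded to lower case while
quoted identifiers must match the stored name exactly, so a quoted lookup can
fail where an unquoted one would work. A rough sketch with placeholder names:

    CREATE SCHEMA "INVOICE";             -- name is stored exactly as INVOICE
    SELECT * FROM invoice.ds_msg_log;    -- fails: schema "invoice" (lower-cased) does not exist
    SELECT * FROM "INVOICE".ds_msg_log;  -- works, assuming the table name is stored in lower case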

Also, could you provide the corresponding Sqoop 1.4.4 command that worked
for you?

-Abe


On Tue, Sep 3, 2013 at 6:06 PM, 陳彥廷(Yanting Chen) <mx.alexender@gmail.com> wrote:

> I am pretty sure that the schema "invoice" exists, because I successfully
> imported data to this schema using Sqoop 1.4.4.
>
> Also, I already removed "*" when creating the job.
>
> The following is the table in the invoice schema:
>     TABLE DS_MSG_LOG
>     (
>         MESSAGE_ID VARCHAR2(23) NOT NULL,
>         LOGIN_ID VARCHAR2(30),
>         ACPT_DTS TIMESTAMP(6),
>         DLVR_DTS TIMESTAMP(6),
>         SENDER_ID VARCHAR2(30),
>         SENDER_VAC_ID VARCHAR2(39),
>         RECEIVER_ID VARCHAR2(30),
>         RECEIVER_VAC_ID VARCHAR2(39),
>         STATUS VARCHAR2(1),
>         MESSAGE_TYPE VARCHAR2(8),
>         FLOW_TYPE VARCHAR2(5),
>         SERVICE_TYPE VARCHAR2(1),
>         SOURCE_FILE_NAME VARCHAR2(150),
>         ARCHIVE_FILE_NAME VARCHAR2(250),
>         ARCHIVE_CHAR_COUNT NUMBER,
>         DECRYPT_FILE_NAME VARCHAR2(250),
>         DECRYPT_CHAR_COUNT NUMBER,
>         RESP_FILE_NAME VARCHAR2(250),
>         RESP_CHAR_COUNT NUMBER,
>         RESP_FLAG VARCHAR2(1),
>         RTG_SEQ VARCHAR2(8),
>         RESENT_FLAG VARCHAR2(1) DEFAULT 'N',
>         TOTAL_INV_COUNT NUMBER,
>         CONSTRAINT PK_DS_MSG_LOG PRIMARY KEY (MESSAGE_ID)
>     )
>
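> For reference, the standard information_schema views can be queried (from
> psql, for example) to confirm exactly how the schema and table names are
> stored on the database side:
>
>     SELECT schema_name
>     FROM information_schema.schemata;
>
>     SELECT table_schema, table_name
>     FROM information_schema.tables
>     WHERE lower(table_name) = 'ds_msg_log';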
>
> On Wed, Sep 4, 2013 at 1:46 AM, Abraham Elmahrek <abe@cloudera.com> wrote:
>
>> Hey User,
>>
>> It looks like the schema "invoice" does not exist in your database. Could
>> you please provide your database schema? Also, "*" is unnecessary when
>> specifying "Table column names". If you leave it blank it will import all
>> columns by default.
>>
>> -Abe
>>
>>
>> On Tue, Sep 3, 2013 at 3:03 AM, 陳彥廷(Yanting Chen) <mx.alexender@gmail.com> wrote:
>>
>>> According to the official guide,
>>> http://sqoop.apache.org/docs/1.99.2/Sqoop5MinutesDemo.html , I
>>> successfully created a job.
>>>
>>> However, when I executed the command, submission start --jid 1, I got
>>> this error message: "Exception has occurred during processing command
>>> Server has returned exception: Exception: java.lang.Throwable Message:
>>> GENERIC_JDBC_CONNECTOR_0002:Unable to execute the SQL statement"
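>>>
>>> Running roughly the same lookup by hand (from psql, for instance) can help
>>> tell whether the problem is on the database side or in the job definition.
>>> The two forms below are matched differently, so both are worth trying; this
>>> is only an approximation of the SQL the connector actually generates:
>>>
>>>     SELECT * FROM "invoice"."ds_msg_log" LIMIT 1;  -- quoted: names matched exactly
>>>     SELECT * FROM invoice.ds_msg_log LIMIT 1;      -- unquoted: names folded to lower case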
>>>
>>> Here is the configuration of my job:
>>>
>>> Database configuration
>>>
>>>
>>> Schema name: invoice
>>> Table name: ds_msg_log
>>> Table SQL statement:
>>> Table column names: *
>>> Partition column name:
>>> Boundary query:
>>>
>>> Output configuration
>>>
>>>
>>> Storage type: HDFS
>>> Output format: TEXT_FILE
>>> Output directory: /user/root/ds_msg_log
>>>
>>> Throttling resources
>>>
>>> Extractors:
>>> Loaders:
>>>
>>> Since there is no information in the official guide about how to set the
>>> values above, does anyone know what is wrong with my job settings?
>>>
>>> This is the log:
>>>
>>> Stack trace:
>>> at org.apache.sqoop.connector.jdbc.GenericJdbcExecutor
>>> (GenericJdbcExecutor.java:59)
>>> at org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer
>>> (GenericJdbcImportInitializer.java:155)
>>> at org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer
>>> (GenericJdbcImportInitializer.java:48)
>>> at org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer
>>> (GenericJdbcImportInitializer.java:37)
>>> at org.apache.sqoop.framework.FrameworkManager
>>> (FrameworkManager.java:447)
>>> at org.apache.sqoop.handler.SubmissionRequestHandler
>>> (SubmissionRequestHandler.java:112)
>>> at org.apache.sqoop.handler.SubmissionRequestHandler
>>> (SubmissionRequestHandler.java:98)
>>> at org.apache.sqoop.handler.SubmissionRequestHandler
>>> (SubmissionRequestHandler.java:68)
>>> at org.apache.sqoop.server.v1.SubmissionServlet
>>> (SubmissionServlet.java:44)
>>> at org.apache.sqoop.server.SqoopProtocolServlet
>>> (SqoopProtocolServlet.java:63)
>>> at javax.servlet.http.HttpServlet (HttpServlet.java:637)
>>> at javax.servlet.http.HttpServlet (HttpServlet.java:717)
>>> at org.apache.catalina.core.ApplicationFilterChain
>>> (ApplicationFilterChain.java:290)
>>> at org.apache.catalina.core.ApplicationFilterChain
>>> (ApplicationFilterChain.java:206)
>>> at org.apache.catalina.core.StandardWrapperValve
>>> (StandardWrapperValve.java:233)
>>> at org.apache.catalina.core.StandardContextValve
>>> (StandardContextValve.java:191)
>>> at org.apache.catalina.core.StandardHostValve
>>> (StandardHostValve.java:127)
>>> at org.apache.catalina.valves.ErrorReportValve
>>> (ErrorReportValve.java:102)
>>> at org.apache.catalina.core.StandardEngineValve
>>> (StandardEngineValve.java:109)
>>> at org.apache.catalina.connector.CoyoteAdapter (CoyoteAdapter.java:293)
>>> at org.apache.coyote.http11.Http11Processor (Http11Processor.java:859)
>>> at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler
>>> (Http11Protocol.java:602)
>>> at org.apache.tomcat.util.net.JIoEndpoint$Worker (JIoEndpoint.java:489)
>>> at java.lang.Thread (Thread.java:724)
>>> Caused by: Exception: java.lang.Throwable Message: ERROR: schema
>>> "invoice" does not exist Position: 46
>>>
>>> Stack trace:
>>> at org.postgresql.core.v3.QueryExecutorImpl (QueryExecutorImpl.java:2102)
>>> at org.postgresql.core.v3.QueryExecutorImpl (QueryExecutorImpl.java:1835)
>>> at org.postgresql.core.v3.QueryExecutorImpl (QueryExecutorImpl.java:257)
>>> at org.postgresql.jdbc2.AbstractJdbc2Statement
>>> (AbstractJdbc2Statement.java:500)
>>> at org.postgresql.jdbc2.AbstractJdbc2Statement
>>> (AbstractJdbc2Statement.java:374)
>>> at org.postgresql.jdbc2.AbstractJdbc2Statement
>>> (AbstractJdbc2Statement.java:254)
>>> at org.apache.sqoop.connector.jdbc.GenericJdbcExecutor
>>> (GenericJdbcExecutor.java:56)
>>> at org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer
>>> (GenericJdbcImportInitializer.java:155)
>>> at org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer
>>> (GenericJdbcImportInitializer.java:48)
>>> at org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer
>>> (GenericJdbcImportInitializer.java:37)
>>> at org.apache.sqoop.framework.FrameworkManager
>>> (FrameworkManager.java:447)
>>> at org.apache.sqoop.handler.SubmissionRequestHandler
>>> (SubmissionRequestHandler.java:112)
>>> at org.apache.sqoop.handler.SubmissionRequestHandler
>>> (SubmissionRequestHandler.java:98)
>>> at org.apache.sqoop.handler.SubmissionRequestHandler
>>> (SubmissionRequestHandler.java:68)
>>> at org.apache.sqoop.server.v1.SubmissionServlet
>>> (SubmissionServlet.java:44)
>>> at org.apache.sqoop.server.SqoopProtocolServlet
>>> (SqoopProtocolServlet.java:63)
>>> at javax.servlet.http.HttpServlet (HttpServlet.java:637)
>>> at javax.servlet.http.HttpServlet (HttpServlet.java:717)
>>> at org.apache.catalina.core.ApplicationFilterChain
>>> (ApplicationFilterChain.java:290)
>>> at org.apache.catalina.core.ApplicationFilterChain
>>> (ApplicationFilterChain.java:206)
>>> at org.apache.catalina.core.StandardWrapperValve
>>> (StandardWrapperValve.java:233)
>>> at org.apache.catalina.core.StandardContextValve
>>> (StandardContextValve.java:191)
>>> at org.apache.catalina.core.StandardHostValve
>>> (StandardHostValve.java:127)
>>> at org.apache.catalina.valves.ErrorReportValve
>>> (ErrorReportValve.java:102)
>>> at org.apache.catalina.core.StandardEngineValve
>>> (StandardEngineValve.java:109)
>>> at org.apache.catalina.connector.CoyoteAdapter (CoyoteAdapter.java:293)
>>> at org.apache.coyote.http11.Http11Processor (Http11Processor.java:859)
>>> at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler
>>> (Http11Protocol.java:602)
>>> at org.apache.tomcat.util.net.JIoEndpoint$Worker (JIoEndpoint.java:489)
>>> at java.lang.Thread (Thread.java:724)
>>>
>>
>>
>
