sqoop-user mailing list archives

From Yanting Chen <mx.alexen...@gmail.com>
Subject Re: Import data to HDFS using Sqoop2
Date Tue, 10 Sep 2013 09:19:37 GMT
Hi Abraham

Please ignore my last post containing the Oracle table schema.

The following is the schema I used to create the table ds_msg_log in PostgreSQL:

CREATE TABLE "public"."ds_msg_log" (
"message_id" text NOT NULL,
"login_id" text,
"acpt_dts" timestamp(6),
"dlvr_dts" timestamp(6),
"sender_id" text,
"sender_vac_id" text,
"receiver_id" text,
"receiver_vac_id" text,
"status" text,
"message_type" text,
"flow_type" text,
"service_type" text,
"source_file_name" text,
"archive_file_name" text,
"archive_char_count" numeric(32),
"decrypt_file_name" text,
"decrypt_char_count" numeric(32),
"resp_file_name" text,
"resp_char_count" numeric(32),
"resp_flag" text,
"rtg_seq" text,
"resent_flag" text DEFAULT 'N'::character varying,
"total_inv_count" numeric(32),
CONSTRAINT "ds_msg_log_pkey" PRIMARY KEY ("message_id")
)
WITH (OIDS=FALSE)
;
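
For reference, the GENERIC_JDBC_CONNECTOR_0011 error quoted further down ("The type is
not supported - 12") refers to java.sql.Types code 12, which is VARCHAR. With this schema
the primary key message_id is a text column, so the generic JDBC connector's partitioner
sees exactly that type when it is used as the partition column. Below is a minimal JDBC
sketch (the URL, user, and password are placeholders, matching the ones shown in the
thread) to confirm which type code the driver reports for each column:

    import java.sql.Connection;
    import java.sql.DatabaseMetaData;
    import java.sql.DriverManager;
    import java.sql.ResultSet;

    public class ColumnTypeCheck {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details.
            String url = "jdbc:postgresql://MY_IP:MY_PORT/invoice";
            try (Connection conn = DriverManager.getConnection(url, "hadoop", "PASSWORD")) {
                DatabaseMetaData meta = conn.getMetaData();
                // List every column of public.ds_msg_log with the java.sql.Types code
                // the driver reports; message_id should come back as 12 (VARCHAR).
                try (ResultSet rs = meta.getColumns(null, "public", "ds_msg_log", null)) {
                    while (rs.next()) {
                        System.out.println(rs.getString("COLUMN_NAME")
                                + " -> " + rs.getInt("DATA_TYPE"));
                    }
                }
            }
        }
    }

The numeric(32) columns should report code 2 (NUMERIC) and the timestamps code 93
(TIMESTAMP); only message_id and the other text columns map to code 12.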


On Tue, Sep 10, 2013 at 12:22 AM, Abraham Elmahrek <abe@cloudera.com> wrote:

> Yanting,
>
> This can't be the schema for PostgreSQL. There are datatypes here that
> PostgreSQL doesn't support. Could you provide the schema from
> PostgreSQL?
>
> -Abe
>
>
> On Mon, Sep 9, 2013 at 12:18 AM, Yanting Chen <mx.alexender@gmail.com> wrote:
>
>> Hi Mengwei
>>
>> After seeing my table schema, do you have any further ideas about this
>> problem?
>>
>>
>>
>>
>> On Fri, Sep 6, 2013 at 9:02 AM, Yanting Chen <mx.alexender@gmail.com> wrote:
>>
>>> TABLE DS_MSG_LOG
>>>     (
>>>         MESSAGE_ID VARCHAR2(23) NOT NULL,
>>>         LOGIN_ID VARCHAR2(30),
>>>         ACPT_DTS TIMESTAMP(6),
>>>         DLVR_DTS TIMESTAMP(6),
>>>         SENDER_ID VARCHAR2(30),
>>>         SENDER_VAC_ID VARCHAR2(39),
>>>         RECEIVER_ID VARCHAR2(30),
>>>         RECEIVER_VAC_ID VARCHAR2(39),
>>>         STATUS VARCHAR2(1),
>>>         MESSAGE_TYPE VARCHAR2(8),
>>>         FLOW_TYPE VARCHAR2(5),
>>>         SERVICE_TYPE VARCHAR2(1),
>>>         SOURCE_FILE_NAME VARCHAR2(150),
>>>         ARCHIVE_FILE_NAME VARCHAR2(250),
>>>         ARCHIVE_CHAR_COUNT NUMBER,
>>>         DECRYPT_FILE_NAME VARCHAR2(250),
>>>         DECRYPT_CHAR_COUNT NUMBER,
>>>         RESP_FILE_NAME VARCHAR2(250),
>>>         RESP_CHAR_COUNT NUMBER,
>>>         RESP_FLAG VARCHAR2(1),
>>>         RTG_SEQ VARCHAR2(8),
>>>         RESENT_FLAG VARCHAR2(1) DEFAULT 'N',
>>>         TOTAL_INV_COUNT NUMBER,
>>>         CONSTRAINT PK_DS_MSG_LOG PRIMARY KEY (MESSAGE_ID)
>>>     )
>>>
>>>
>>> On Fri, Sep 6, 2013 at 1:55 AM, Abraham Elmahrek <abe@cloudera.com> wrote:
>>>
>>>> Could you provide your schema from PostgreSQL? Mengwei is likely right.
>>>>
>>>>
>>>> On Wed, Sep 4, 2013 at 7:36 PM, Yanting Chen <mx.alexender@gmail.com> wrote:
>>>>
>>>>> Actually, the schema comes from Oracle. However, I modified it to fit
>>>>> into a PostgreSQL database, so the database I am using now is
>>>>> PostgreSQL.
>>>>>
>>>>>
>>>>> On Thu, Sep 5, 2013 at 10:33 AM, Abraham Elmahrek <abe@cloudera.com> wrote:
>>>>>
>>>>>> Yanting,
>>>>>>
>>>>>> Also, it seems like the schema you've provided is for an Oracle
>>>>>> database, i.e. VARCHAR2 and NUMBER are datatypes specific to Oracle.
>>>>>> Could you please use an Oracle connection string and driver? i.e.
>>>>>> oracle.jdbc.driver.OracleDriver and jdbc:oracle:thin:@host:port:SID.
>>>>>>
>>>>>> -abe
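
If the source really were Oracle, the connection string and driver suggested above
translate into roughly the following JDBC usage; the host, port, SID, user, and password
below are placeholders, not values from this thread:

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class OracleConnectionCheck {
        public static void main(String[] args) throws Exception {
            // Driver class mentioned above; explicit loading helps on older JDBC setups.
            Class.forName("oracle.jdbc.driver.OracleDriver");
            // Thin-style URL of the form jdbc:oracle:thin:@host:port:SID (placeholders).
            String url = "jdbc:oracle:thin:@dbhost:1521:ORCL";
            try (Connection conn = DriverManager.getConnection(url, "USER", "PASSWORD")) {
                System.out.println("Connected to "
                        + conn.getMetaData().getDatabaseProductName());
            }
        }
    }

The same two strings would go into the "JDBC Driver Class" and "JDBC Connection String"
fields when creating the Sqoop 2 connection.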
>>>>>>
>>>>>>
>>>>>> On Wed, Sep 4, 2013 at 7:32 PM, Mengwei Ding <mengwei.ding@gmail.com> wrote:
>>>>>>
>>>>>>> Hmm... would you mind showing us your latest job configuration by
>>>>>>> typing "show job --jid 3"? I just want to make sure that you provide
>>>>>>> the partition column correctly.
>>>>>>>
>>>>>>> Also, I notice that the primary key for this table is of type
>>>>>>> "VARCHAR(23)"; this might be the problem.
>>>>>>>
>>>>>>> Mengwei
>>>>>>>
>>>>>>>
>>>>>>> On Wed, Sep 4, 2013 at 10:23 PM, Yanting Chen <
>>>>>>> mx.alexender@gmail.com> wrote:
>>>>>>>
>>>>>>>> Hi Mengwei
>>>>>>>>
>>>>>>>> I tried to set the primary key as the partition column, but I still
>>>>>>>> get the same error!
>>>>>>>>
>>>>>>>>
>>>>>>>> On Thu, Sep 5, 2013 at 10:17 AM, Mengwei Ding <
>>>>>>>> mengwei.ding@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> Hi Yanting,
>>>>>>>>>
>>>>>>>>> It seems like you did not specify the 'partition column' for the
>>>>>>>>> job. Generally, the primary key of the table is a good choice for
>>>>>>>>> the 'partition column'.
>>>>>>>>>
>>>>>>>>> You could use 'update job -jid 3' to update the job configuration.
>>>>>>>>>
>>>>>>>>> Mengwei
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Wed, Sep 4, 2013 at 9:37 PM, Yanting Chen <
>>>>>>>>> mx.alexender@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> Abraham,
>>>>>>>>>>
>>>>>>>>>> Thanks for your answer. I reviewed my database.
>>>>>>>>>> I think the database name is invoice and the schema name is
>>>>>>>>>> public, just like the picture below.
>>>>>>>>>> http://imgur.com/ns0iNLi
>>>>>>>>>>
>>>>>>>>>> So, I changed the schema name to the new value, "public".
>>>>>>>>>> Then, I ran this job and got a different error.
>>>>>>>>>>
>>>>>>>>>> Status: FAILURE_ON_SUBMIT
>>>>>>>>>> Creation date: 2013-09-05 09:30:44 CST
>>>>>>>>>> Last update date: 2013-09-05 09:30:44 CST
>>>>>>>>>> Exception: org.apache.sqoop.common.SqoopException:
>>>>>>>>>> GENERIC_JDBC_CONNECTOR_0011:The type is not supported - 12
>>>>>>>>>> Stack trace: org.apache.sqoop.common.SqoopException:
>>>>>>>>>> GENERIC_JDBC_CONNECTOR_0011:The type is not supported - 12
>>>>>>>>>> at
>>>>>>>>>> org.apache.sqoop.connector.jdbc.GenericJdbcImportPartitioner.getPartitions(GenericJdbcImportPartitioner.java:87)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.sqoop.connector.jdbc.GenericJdbcImportPartitioner.getPartitions(GenericJdbcImportPartitioner.java:32)
>>>>>>>>>> at
>>>>>>>>>> org.apache.sqoop.job.mr.SqoopInputFormat.getSplits(SqoopInputFormat.java:71)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:452)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:469)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:366)
>>>>>>>>>> at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1269)
>>>>>>>>>>  at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1266)
>>>>>>>>>> at java.security.AccessController.doPrivileged(Native Method)
>>>>>>>>>>  at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>>>>>>> at
>>>>>>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
>>>>>>>>>>  at org.apache.hadoop.mapreduce.Job.submit(Job.java:1266)
>>>>>>>>>> at
>>>>>>>>>> org.apache.sqoop.submission.mapreduce.MapreduceSubmissionEngine.submit(MapreduceSubmissionEngine.java:265)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.sqoop.framework.FrameworkManager.submit(FrameworkManager.java:480)
>>>>>>>>>> at
>>>>>>>>>> org.apache.sqoop.handler.SubmissionRequestHandler.submissionSubmit(SubmissionRequestHandler.java:112)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.sqoop.handler.SubmissionRequestHandler.handleActionEvent(SubmissionRequestHandler.java:98)
>>>>>>>>>> at
>>>>>>>>>> org.apache.sqoop.handler.SubmissionRequestHandler.handleEvent(SubmissionRequestHandler.java:68)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.sqoop.server.v1.SubmissionServlet.handlePostRequest(SubmissionServlet.java:44)
>>>>>>>>>> at
>>>>>>>>>> org.apache.sqoop.server.SqoopProtocolServlet.doPost(SqoopProtocolServlet.java:63)
>>>>>>>>>>  at javax.servlet.http.HttpServlet.service(HttpServlet.java:637)
>>>>>>>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
>>>>>>>>>> at
>>>>>>>>>> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:859)
>>>>>>>>>> at
>>>>>>>>>> org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:602)
>>>>>>>>>>  at
>>>>>>>>>> org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
>>>>>>>>>> at java.lang.Thread.run(Thread.java:724)
>>>>>>>>>>
>>>>>>>>>> Also, I tried to remove the schema name, but I got the same error
>>>>>>>>>> as above.
>>>>>>>>>>
>>>>>>>>>> On Thu, Sep 5, 2013 at 6:11 AM, Abraham Elmahrek <
>>>>>>>>>> abe@cloudera.com> wrote:
>>>>>>>>>>
>>>>>>>>>> Yanting,
>>>>>>>>>>>
>>>>>>>>>>> I'm sorry, I'm a bit confused. The database you are using here
>>>>>>>>>>> is called "invoice". Do you also have a schema called "invoice"?
>>>>>>>>>>> Try removing the "Schema name" from your job and see if that
>>>>>>>>>>> works for you.
>>>>>>>>>>>
>>>>>>>>>>> http://www.postgresql.org/docs/9.1/static/ddl-schemas.html
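
One way to settle the database-versus-schema question is to ask PostgreSQL directly over
the same JDBC connection the job uses; current_database() and current_schema() are
built-in PostgreSQL functions, and the connection details below are the placeholders from
the thread:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class SchemaCheck {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details.
            String url = "jdbc:postgresql://MY_IP:MY_PORT/invoice";
            try (Connection conn = DriverManager.getConnection(url, "hadoop", "PASSWORD");
                 Statement st = conn.createStatement();
                 ResultSet rs = st.executeQuery(
                         "SELECT current_database(), current_schema()")) {
                rs.next();
                // current_database() should print invoice; current_schema() is where an
                // unqualified table name such as ds_msg_log is looked up.
                System.out.println("database = " + rs.getString(1)
                        + ", schema = " + rs.getString(2));
            }
        }
    }

If this prints public for the schema, then "Schema name" should either be left blank or
set to public rather than invoice.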
>>>>>>>>>>>
>>>>>>>>>>> -Abe
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Tue, Sep 3, 2013 at 6:34 PM, Yanting Chen <
>>>>>>>>>>> mx.alexender@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> However, with Sqoop 1.4.4, all schema and table names are
>>>>>>>>>>>> lower-case, and the import is successful.
>>>>>>>>>>>>
>>>>>>>>>>>> As a result, I am not sure if case sensitivity will be a
>>>>>>>>>>>> problem in Sqoop 2.
>>>>>>>>>>>>
>>>>>>>>>>>> *This is the connection I created:*
>>>>>>>>>>>> 1 connection(s) to show:
>>>>>>>>>>>> Connection with id 1 and name ds_msg_log (Created 9/2/13 10:54
>>>>>>>>>>>> AM, Updated 9/3/13 6:35 PM)
>>>>>>>>>>>> Using Connector id 1
>>>>>>>>>>>>   Connection configuration
>>>>>>>>>>>>     JDBC Driver Class: org.postgresql.Driver
>>>>>>>>>>>>     JDBC Connection String:
>>>>>>>>>>>> jdbc:postgresql://MY_IP:MY_PORT/invoice
>>>>>>>>>>>>     Username: hadoop
>>>>>>>>>>>>     Password:
>>>>>>>>>>>>     JDBC Connection Properties:
>>>>>>>>>>>>   Security related configuration options
>>>>>>>>>>>>     Max connections: 123
>>>>>>>>>>>>
>>>>>>>>>>>> *This is the job I created:*
>>>>>>>>>>>> Job with id 3 and name ds_msg_log (Created 9/2/13 11:37 AM,
>>>>>>>>>>>> Updated 9/3/13 5:57 PM)
>>>>>>>>>>>> Using Connection id 1 and Connector id 1
>>>>>>>>>>>>   Database configuration
>>>>>>>>>>>>     Schema name: invoice
>>>>>>>>>>>>     Table name: ds_msg_log
>>>>>>>>>>>>     Table SQL statement:
>>>>>>>>>>>>     Table column names:
>>>>>>>>>>>>     Partition column name:
>>>>>>>>>>>>     Boundary query:
>>>>>>>>>>>>   Output configuration
>>>>>>>>>>>>     Storage type: HDFS
>>>>>>>>>>>>     Output format: TEXT_FILE
>>>>>>>>>>>>     Output directory: /user/root/ds_msg_log_sqoop2
>>>>>>>>>>>>   Throttling resources
>>>>>>>>>>>>     Extractors:
>>>>>>>>>>>>     Loaders:
>>>>>>>>>>>>
>>>>>>>>>>>> *The command I use to start this job is:*
>>>>>>>>>>>> submission start --jid 3
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Wed, Sep 4, 2013 at 9:24 AM, Abraham Elmahrek <
>>>>>>>>>>>> abe@cloudera.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> Sorry, I missed that you are using PostgreSQL. I think case
>>>>>>>>>>>>> sensitivity will be a problem with that database.
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Tue, Sep 3, 2013 at 6:23 PM, Abraham Elmahrek <
>>>>>>>>>>>>> abe@cloudera.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> What database are you using? Are table names or schema names
>>>>>>>>>>>>>> case sensitive in your database? Sqoop quotes all names, so
>>>>>>>>>>>>>> case sensitivity makes a difference here.
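
The case-sensitivity point is easy to see in PostgreSQL itself: unquoted identifiers are
folded to lower case, while quoted identifiers (which is how Sqoop emits them) keep their
exact case. A small sketch, again with placeholder connection details, assuming the table
was created in lower case as ds_msg_log:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class QuotedIdentifierDemo {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details.
            String url = "jdbc:postgresql://MY_IP:MY_PORT/invoice";
            try (Connection conn = DriverManager.getConnection(url, "hadoop", "PASSWORD");
                 Statement st = conn.createStatement()) {
                // Unquoted: DS_MSG_LOG is folded to ds_msg_log, so the table is found.
                st.executeQuery("SELECT count(*) FROM DS_MSG_LOG").close();
                // Quoted, the way Sqoop generates it: the case must match exactly,
                // so this fails when the table was actually created in lower case.
                try {
                    st.executeQuery("SELECT count(*) FROM \"DS_MSG_LOG\"").close();
                } catch (SQLException e) {
                    System.out.println("Quoted upper-case name failed: " + e.getMessage());
                }
            }
        }
    }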
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Also, could you provide the corresponding Sqoop 1.4.4 command
>>>>>>>>>>>>>> that worked for you?
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> -Abe
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Tue, Sep 3, 2013 at 6:06 PM, 陳彥廷(Yanting Chen) <
>>>>>>>>>>>>>> mx.alexender@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> I am pretty sure that the schema "invoice" exists because I
>>>>>>>>>>>>>>> successfully imported data to this schema using Sqoop 1.4.4.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Also, I already removed "*" when creating the job.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> The following is the table in the invoice schema:
>>>>>>>>>>>>>>>     TABLE DS_MSG_LOG
>>>>>>>>>>>>>>>     (
>>>>>>>>>>>>>>>         MESSAGE_ID VARCHAR2(23) NOT NULL,
>>>>>>>>>>>>>>>         LOGIN_ID VARCHAR2(30),
>>>>>>>>>>>>>>>         ACPT_DTS TIMESTAMP(6),
>>>>>>>>>>>>>>>         DLVR_DTS TIMESTAMP(6),
>>>>>>>>>>>>>>>         SENDER_ID VARCHAR2(30),
>>>>>>>>>>>>>>>         SENDER_VAC_ID VARCHAR2(39),
>>>>>>>>>>>>>>>         RECEIVER_ID VARCHAR2(30),
>>>>>>>>>>>>>>>         RECEIVER_VAC_ID VARCHAR2(39),
>>>>>>>>>>>>>>>         STATUS VARCHAR2(1),
>>>>>>>>>>>>>>>         MESSAGE_TYPE VARCHAR2(8),
>>>>>>>>>>>>>>>         FLOW_TYPE VARCHAR2(5),
>>>>>>>>>>>>>>>         SERVICE_TYPE VARCHAR2(1),
>>>>>>>>>>>>>>>         SOURCE_FILE_NAME VARCHAR2(150),
>>>>>>>>>>>>>>>         ARCHIVE_FILE_NAME VARCHAR2(250),
>>>>>>>>>>>>>>>         ARCHIVE_CHAR_COUNT NUMBER,
>>>>>>>>>>>>>>>         DECRYPT_FILE_NAME VARCHAR2(250),
>>>>>>>>>>>>>>>         DECRYPT_CHAR_COUNT NUMBER,
>>>>>>>>>>>>>>>         RESP_FILE_NAME VARCHAR2(250),
>>>>>>>>>>>>>>>         RESP_CHAR_COUNT NUMBER,
>>>>>>>>>>>>>>>         RESP_FLAG VARCHAR2(1),
>>>>>>>>>>>>>>>         RTG_SEQ VARCHAR2(8),
>>>>>>>>>>>>>>>         RESENT_FLAG VARCHAR2(1) DEFAULT 'N',
>>>>>>>>>>>>>>>         TOTAL_INV_COUNT NUMBER,
>>>>>>>>>>>>>>>         CONSTRAINT PK_DS_MSG_LOG PRIMARY KEY (MESSAGE_ID)
>>>>>>>>>>>>>>>     )
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Wed, Sep 4, 2013 at 1:46 AM, Abraham Elmahrek <
>>>>>>>>>>>>>>> abe@cloudera.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Hey User,
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> It looks like the schema "invoice" does not exist in your
>>>>>>>>>>>>>>>> database. Could you please provide your database schema?
>>>>>>>>>>>>>>>> Also, "*" is unnecessary when specifying "Table column
>>>>>>>>>>>>>>>> names". If you leave it blank it will import all columns by
>>>>>>>>>>>>>>>> default.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> -Abe
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Tue, Sep 3, 2013 at 3:03 AM, 陳彥廷(Yanting Chen) <
>>>>>>>>>>>>>>>> mx.alexender@gmail.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> According to the official guide,
>>>>>>>>>>>>>>>>> http://sqoop.apache.org/docs/1.99.2/Sqoop5MinutesDemo.html,
>>>>>>>>>>>>>>>>> I successfully created a job.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> However, when I executed the command "submission start
>>>>>>>>>>>>>>>>> --jid 1", I got this error message: "Exception has occurred
>>>>>>>>>>>>>>>>> during processing command Server has returned exception:
>>>>>>>>>>>>>>>>> Exception: java.lang.Throwable Message:
>>>>>>>>>>>>>>>>> GENERIC_JDBC_CONNECTOR_0002:Unable to execute the SQL
>>>>>>>>>>>>>>>>> statement"
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> This is the information of my job.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Database configuration
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Schema name: invoice
>>>>>>>>>>>>>>>>> Table name: ds_msg_log
>>>>>>>>>>>>>>>>> Table SQL statement:
>>>>>>>>>>>>>>>>> Table column names: *
>>>>>>>>>>>>>>>>> Partition column name:
>>>>>>>>>>>>>>>>> Boundary query:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Output configuration
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Storage type: HDFS
>>>>>>>>>>>>>>>>> Output format: TEXT_FILE
>>>>>>>>>>>>>>>>> Output directory: /user/root/ds_msg_log
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Throttling resources
>>>>>>>>>>>>>>>>> Extractors:
>>>>>>>>>>>>>>>>> Loaders:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Since there is no information in the official guide about
>>>>>>>>>>>>>>>>> how to set the values above, does anyone know what is wrong
>>>>>>>>>>>>>>>>> in my job setting?
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> This is the log: Stack trace: at
>>>>>>>>>>>>>>>>> org.apache.sqoop.connector.jdbc.GenericJdbcExecutor
>>>>>>>>>>>>>>>>> (GenericJdbcExecutor.java:59)
>>>>>>>>>>>>>>>>> at
>>>>>>>>>>>>>>>>> org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer
>>>>>>>>>>>>>>>>> (GenericJdbcImportInitializer.java:155)
>>>>>>>>>>>>>>>>> at
>>>>>>>>>>>>>>>>> org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer
>>>>>>>>>>>>>>>>> (GenericJdbcImportInitializer.java:48)
>>>>>>>>>>>>>>>>> at
>>>>>>>>>>>>>>>>> org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer
>>>>>>>>>>>>>>>>> (GenericJdbcImportInitializer.java:37)
>>>>>>>>>>>>>>>>> at org.apache.sqoop.framework.FrameworkManager
>>>>>>>>>>>>>>>>> (FrameworkManager.java:447)
>>>>>>>>>>>>>>>>> at org.apache.sqoop.handler.SubmissionRequestHandler
>>>>>>>>>>>>>>>>> (SubmissionRequestHandler.java:112)
>>>>>>>>>>>>>>>>> at org.apache.sqoop.handler.SubmissionRequestHandler
>>>>>>>>>>>>>>>>> (SubmissionRequestHandler.java:98)
>>>>>>>>>>>>>>>>> at org.apache.sqoop.handler.SubmissionRequestHandler
>>>>>>>>>>>>>>>>> (SubmissionRequestHandler.java:68)
>>>>>>>>>>>>>>>>> at org.apache.sqoop.server.v1.SubmissionServlet
>>>>>>>>>>>>>>>>> (SubmissionServlet.java:44)
>>>>>>>>>>>>>>>>> at org.apache.sqoop.server.SqoopProtocolServlet
>>>>>>>>>>>>>>>>> (SqoopProtocolServlet.java:63)
>>>>>>>>>>>>>>>>> at javax.servlet.http.HttpServlet (HttpServlet.java:637)
>>>>>>>>>>>>>>>>> at javax.servlet.http.HttpServlet (HttpServlet.java:717)
>>>>>>>>>>>>>>>>> at org.apache.catalina.core.ApplicationFilterChain
>>>>>>>>>>>>>>>>> (ApplicationFilterChain.java:290)
>>>>>>>>>>>>>>>>> at org.apache.catalina.core.ApplicationFilterChain
>>>>>>>>>>>>>>>>> (ApplicationFilterChain.java:206)
>>>>>>>>>>>>>>>>> at org.apache.catalina.core.StandardWrapperValve
>>>>>>>>>>>>>>>>> (StandardWrapperValve.java:233)
>>>>>>>>>>>>>>>>> at org.apache.catalina.core.StandardContextValve
>>>>>>>>>>>>>>>>> (StandardContextValve.java:191)
>>>>>>>>>>>>>>>>> at org.apache.catalina.core.StandardHostValve
>>>>>>>>>>>>>>>>> (StandardHostValve.java:127)
>>>>>>>>>>>>>>>>> at org.apache.catalina.valves.ErrorReportValve
>>>>>>>>>>>>>>>>> (ErrorReportValve.java:102)
>>>>>>>>>>>>>>>>> at org.apache.catalina.core.StandardEngineValve
>>>>>>>>>>>>>>>>> (StandardEngineValve.java:109)
>>>>>>>>>>>>>>>>> at org.apache.catalina.connector.CoyoteAdapter
>>>>>>>>>>>>>>>>> (CoyoteAdapter.java:293)
>>>>>>>>>>>>>>>>> at org.apache.coyote.http11.Http11Processor
>>>>>>>>>>>>>>>>> (Http11Processor.java:859)
>>>>>>>>>>>>>>>>> at
>>>>>>>>>>>>>>>>> org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler
>>>>>>>>>>>>>>>>> (Http11Protocol.java:602)
>>>>>>>>>>>>>>>>> at org.apache.tomcat.util.net.JIoEndpoint$Worker
>>>>>>>>>>>>>>>>> (JIoEndpoint.java:489)
>>>>>>>>>>>>>>>>> at java.lang.Thread (Thread.java:724)
>>>>>>>>>>>>>>>>> Caused by: Exception: java.lang.Throwable Message: ERROR:
>>>>>>>>>>>>>>>>> schema "invoice" does not exist Position: 46 Stack trace: at
>>>>>>>>>>>>>>>>> org.postgresql.core.v3.QueryExecutorImpl
>>>>>>>>>>>>>>>>> (QueryExecutorImpl.java:2102)
>>>>>>>>>>>>>>>>> at org.postgresql.core.v3.QueryExecutorImpl
>>>>>>>>>>>>>>>>> (QueryExecutorImpl.java:1835)
>>>>>>>>>>>>>>>>> at org.postgresql.core.v3.QueryExecutorImpl
>>>>>>>>>>>>>>>>> (QueryExecutorImpl.java:257)
>>>>>>>>>>>>>>>>> at org.postgresql.jdbc2.AbstractJdbc2Statement
>>>>>>>>>>>>>>>>> (AbstractJdbc2Statement.java:500)
>>>>>>>>>>>>>>>>> at org.postgresql.jdbc2.AbstractJdbc2Statement
>>>>>>>>>>>>>>>>> (AbstractJdbc2Statement.java:374)
>>>>>>>>>>>>>>>>> at org.postgresql.jdbc2.AbstractJdbc2Statement
>>>>>>>>>>>>>>>>> (AbstractJdbc2Statement.java:254)
>>>>>>>>>>>>>>>>> at org.apache.sqoop.connector.jdbc.GenericJdbcExecutor
>>>>>>>>>>>>>>>>> (GenericJdbcExecutor.java:56)
>>>>>>>>>>>>>>>>> at
>>>>>>>>>>>>>>>>> org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer
>>>>>>>>>>>>>>>>> (GenericJdbcImportInitializer.java:155)
>>>>>>>>>>>>>>>>> at
>>>>>>>>>>>>>>>>> org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer
>>>>>>>>>>>>>>>>> (GenericJdbcImportInitializer.java:48)
>>>>>>>>>>>>>>>>> at
>>>>>>>>>>>>>>>>> org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer
>>>>>>>>>>>>>>>>> (GenericJdbcImportInitializer.java:37)
>>>>>>>>>>>>>>>>> at org.apache.sqoop.framework.FrameworkManager
>>>>>>>>>>>>>>>>> (FrameworkManager.java:447)
>>>>>>>>>>>>>>>>> at org.apache.sqoop.handler.SubmissionRequestHandler
>>>>>>>>>>>>>>>>> (SubmissionRequestHandler.java:112)
>>>>>>>>>>>>>>>>> at org.apache.sqoop.handler.SubmissionRequestHandler
>>>>>>>>>>>>>>>>> (SubmissionRequestHandler.java:98)
>>>>>>>>>>>>>>>>> at org.apache.sqoop.handler.SubmissionRequestHandler
>>>>>>>>>>>>>>>>> (SubmissionRequestHandler.java:68)
>>>>>>>>>>>>>>>>> at org.apache.sqoop.server.v1.SubmissionServlet
>>>>>>>>>>>>>>>>> (SubmissionServlet.java:44)
>>>>>>>>>>>>>>>>> at org.apache.sqoop.server.SqoopProtocolServlet
>>>>>>>>>>>>>>>>> (SqoopProtocolServlet.java:63)
>>>>>>>>>>>>>>>>> at javax.servlet.http.HttpServlet (HttpServlet.java:637)
>>>>>>>>>>>>>>>>> at javax.servlet.http.HttpServlet (HttpServlet.java:717)
>>>>>>>>>>>>>>>>> at org.apache.catalina.core.ApplicationFilterChain
>>>>>>>>>>>>>>>>> (ApplicationFilterChain.java:290)
>>>>>>>>>>>>>>>>> at org.apache.catalina.core.ApplicationFilterChain
>>>>>>>>>>>>>>>>> (ApplicationFilterChain.java:206)
>>>>>>>>>>>>>>>>> at org.apache.catalina.core.StandardWrapperValve
>>>>>>>>>>>>>>>>> (StandardWrapperValve.java:233)
>>>>>>>>>>>>>>>>> at org.apache.catalina.core.StandardContextValve
>>>>>>>>>>>>>>>>> (StandardContextValve.java:191)
>>>>>>>>>>>>>>>>> at org.apache.catalina.core.StandardHostValve
>>>>>>>>>>>>>>>>> (StandardHostValve.java:127)
>>>>>>>>>>>>>>>>> at org.apache.catalina.valves.ErrorReportValve
>>>>>>>>>>>>>>>>> (ErrorReportValve.java:102)
>>>>>>>>>>>>>>>>> at org.apache.catalina.core.StandardEngineValve
>>>>>>>>>>>>>>>>> (StandardEngineValve.java:109)
>>>>>>>>>>>>>>>>> at org.apache.catalina.connector.CoyoteAdapter
>>>>>>>>>>>>>>>>> (CoyoteAdapter.java:293)
>>>>>>>>>>>>>>>>> at org.apache.coyote.http11.Http11Processor
>>>>>>>>>>>>>>>>> (Http11Processor.java:859)
>>>>>>>>>>>>>>>>> at
>>>>>>>>>>>>>>>>> org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler
>>>>>>>>>>>>>>>>> (Http11Protocol.java:602)
>>>>>>>>>>>>>>>>> at org.apache.tomcat.util.net.JIoEndpoint$Worker
>>>>>>>>>>>>>>>>> (JIoEndpoint.java:489)
>>>>>>>>>>>>>>>>> at java.lang.Thread (Thread.java:724)
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
