sqoop-user mailing list archives

From Venkat <venkat...@gmail.com>
Subject Re: Exporting hive table data into oracle give date format error
Date Thu, 21 Mar 2013 18:30:22 GMT
Hi Ajit/Jarcec

I think the whole null string handling deserves a special section in
the documentation (witness the discussions on Netezza null handling for
direct loads that we had).

We may have to come up with specific recommendations on consistently
handling NULL in all cases (string and non-string).   From the database
perspective, different databases differ in what they support for null
strings (and it differs by version as well).

And the Hive handling is another potential issue, as you have
explained.

Venkat


On Thu, Mar 21, 2013 at 8:33 AM, Jarek Jarcec Cecho <jarcec@apache.org> wrote:

> Hi Ajit,
> let me try to explain what I think is happening in your use case. There
> are multiple moving pieces, so let me first summarize a couple of
> behaviour characteristics of the components:
>
> 1) Sqoop by default will use the string "null" (lower case) to encode NULL
> values from the database. This can be changed via the
> --(input-)null-(non-)string arguments.
>
> 2) Hive by default uses \N for encoding NULL value.
>
> 3) When parsing an input file, Hive will substitute NULL for any value it
> fails to read, rather than throwing an exception and killing your query.
>
> Now let's specifically focus on your work flow. To make the explanation a
> bit simpler, let's consider table "create table example(i int, t
> varchar(50));" with one single row where each column is NULL.
>
> a) Sqooping in this table without custom --null-(non-)string arguments will
> lead to an HDFS file with exactly one line (one input row) where both
> columns are encoded as 'null' (the default substitution string for NULL
> values). Result:
>
>   null,null
>
> b) Executing a simple "select * from example" in Hive will lead to the
> following output row:
>
>   NULL null
>
> Let me explain what is happening here a bit more. Hive will read the input
> file and split it into columns. The first column is of type "integer" and
> contains the value "null". As the string constant "null" is not a valid
> number for an integer column, this value is converted into NULL. The second
> column is of type string; the constant "null" is a fully valid string and
> thus this string is returned - there is no conversion to NULL!
>
> c) Exporting table "example" will work correctly as the file on HDFS still
> contains expected "null,null".
>
> d) Now let's explore what will happen during the creation of a second
> table with the query "CREATE TABLE example2 AS SELECT * FROM example". As
> part of the query, Hive will read all input rows and parse their values as
> described in b). The output will be serialized into the table example2. The
> first column was parsed as NULL, so it will be written out as \N (the
> default NULL substitution string for Hive). The second column was however
> parsed as a valid string value and thus it will be serialized "as is",
> resulting in a file with one single line "\N,null". Please notice that this
> select statement has changed the on-disk data!
>
> e) Exporting table "example2" obviously can't lead to a consistent state,
> as the input file has been changed.
>
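The a) through e) round trip can be sketched in a few lines of Python. This is a simplification of the behaviour described above; the helper names are illustrative, not actual Sqoop or Hive code:

```python
# Minimal sketch of steps a) through e): how "null,null" on HDFS turns
# into "\N,null" after a Hive CTAS statement.
SQOOP_NULL = "null"   # Sqoop's default NULL substitution string
HIVE_NULL = "\\N"     # Hive's default NULL serialization

# a) Sqoop writes one row where both columns were NULL in the database.
hdfs_line = ",".join([SQOOP_NULL, SQOOP_NULL])

def hive_parse(line):
    """b) An unparsable int becomes NULL (None); the string column
    keeps the literal text 'null' - no conversion to NULL."""
    i_raw, t_raw = line.split(",")
    try:
        i = int(i_raw)
    except ValueError:
        i = None            # "null" is not a valid number
    return i, t_raw         # "null" is a fully valid string

def hive_serialize(row):
    """d) CTAS re-serializes parsed values, writing \\N for NULL."""
    return ",".join(HIVE_NULL if v is None else str(v) for v in row)

row = hive_parse(hdfs_line)
ctas_line = hive_serialize(row)
print(row)       # (None, 'null')
print(ctas_line) # \N,null - the CTAS has changed the on-disk data
```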
> Please do not hesitate to contact me if you still have any open questions!
>
> Jarcec
>
> On Thu, Mar 21, 2013 at 06:34:35AM +0000, Ajit Kumar Shreevastava wrote:
> > Hi Jarek,
> >
> >
> >
> > Thanks a lot. It's working fine in both of the cases in the chain mail.
> >
> >
> >
> > I also want to know the reason for Sqoop's behavior when data is
> imported from and exported to Oracle without the arguments --null-string
> '\\N' and --null-non-string '\\N' on the import job.
> >
> >
> >
> > When I import the data without the arguments --null-string '\\N' and
> --null-non-string '\\N' into a Hive table, I am able to export it back to
> the Oracle table without any error or data mismatch. But when I copy the
> same table structure and data to another Hive table, I am not able to do
> so. Is there any reason for that? Does Sqoop store a data definition or
> data formatting for the table? Could you explain the internal behavior of
> Sqoop, to clarify my understanding of importing and exporting a table
> between a relational database and Hive? I have explained the scenario in my
> chained mail for your reference and also highlighted the facts below.
> >
> >
> >
> > I am looking for your valuable comments on the below highlighted
> scenario.
> >
> >
> >
> > Thanking You,
> >
> > Regards,
> >
> > Ajit
> >
> >
> >
> >
> >
> > -----Original Message-----
> > From: Jarek Jarcec Cecho [mailto:jarcec@apache.org]
> > Sent: Thursday, March 21, 2013 6:19 AM
> > To: user@sqoop.apache.org
> > Subject: Re: Exporting hive table data into oracle give date format error
> >
> >
> >
> > Hi Ajit,
> >
> > thank you for your nice summary. You seem to be missing the Sqoop
> arguments --null-string '\\N', --null-non-string '\\N' on the import job
> and --input-null-string '\\N', --input-null-non-string '\\N' on the export.
> Would you mind adding them and rerunning your work flow?
> >
> >
> >
> > Jarcec
> >
> >
> >
> > On Wed, Mar 20, 2013 at 07:30:29AM +0000, Ajit Kumar Shreevastava wrote:
> >
> > > Hi Jarcec,
> >
> > >
> >
> > >
> >
> > >
> >
> > > Thank you for your valuable input; your suggestion seems to be
> >
> > > valid. But I have some doubts about the Sqoop behavior:
> >
> > >
> >
> > > 1.       If null causes the confusion, then why are values similar to
> the ones below inserted into the Oracle table with null treated as a
> string?
> >
> > >
> >
> > >      hive> select * from bttn_bkp_testing
> >
> > >
> >
> > >               > where bttn_id=39126;
> >
> > >
> >
> > >
> >
> > >
> >
> > > 39126.0 32436.0 3276.0  3.0     28.0    1.0     1.0     1.0     1.0
>   #FFFFFF #0000FF #0000FF #FFFFFF 0.0     0.0     NULL    NULL    1.0
> 1.0     1.0     NULL   null     20.0    2010-05-04 14:31:17.0   dbmigration
>     2013-01-18 09:11:18.37  DP_CQ4540       2010-11-29 15:45:03.976
> ei009724        1.0     null    NULL    null   NULL     0.0     61253.0
> 61124.0 61124.0 61253.0
> >
> > >
> >
> > > 39126.0 50805.0 3276.0  3.0     28.0    1.0     1.0     1.0     1.0
>   #FFFFFF #0000FF #0000FF #FFFFFF 0.0     0.0     NULL    NULL    1.0
> 1.0     1.0     NULL   null     20.0    2010-05-23 23:18:54.604 ei103215
>      2013-01-18 09:11:18.37  DP_CQ4540       2010-11-29 15:45:03.976
> ei009724        1.0     null    NULL    null   NULL     0.0     61253.0
> 61124.0 61124.0 61253.0
> >
> > >
> >
> > > 39126.0 63196.0 3276.0  3.0     28.0    1.0     1.0     1.0     1.0
>   #FFFFFF #0000FF #0000FF #FFFFFF 0.0     0.0     NULL    NULL    1.0
> 1.0     1.0     NULL   null     20.0    2010-11-04 18:25:23.956 ei103215
>      2013-01-18 09:11:18.37  DP_CQ4540       2010-11-29 15:45:03.976
> ei009724        1.0     null    NULL    null   NULL     0.0     61253.0
> 61124.0 61124.0 61253.0
> >
> > >
> >
> > >
> >
> > >
> >
> > > These values are inserted into the oracle table BTTN_BKP_TEST as
> follows:
> >
> > >
> >
> > > SQL> Select * from BTTN_BKP_TEST where bttn_id=39126;
> >
> > >
> >
> > >
> >
> > >
> >
> > > 39126    32436    3276       3              28           1
>  1              1              1              #FFFFFF               #0000FF
>               #0000FF                #FFFFFF               0              0
>                                              1              1
>  1                              null         20           05/04/2010
> 2:31:17.000000 PM          dbmigration       01/18/2013 9:11:18.370000 AM
>  DP_CQ4540        11/29/2010 3:45:03.976000 PM                ei009724
>          1              null                         null
>       0              61253    61124    61124    61253
> >
> > >
> >
> > > 39126    50805    3276       3              28           1
>  1              1              1              #FFFFFF               #0000FF
>               #0000FF                #FFFFFF               0              0
>                                              1              1
>  1                              null         20           05/23/2010
> 11:18:54.604000 PM        ei103215              01/18/2013 9:11:18.370000
> AM  DP_CQ4540        11/29/2010 3:45:03.976000 PM                ei009724
>            1              null                         null
>         0              61253    61124    61124    61253
> >
> > >
> >
> > > 39126    63196    3276       3              28           1
>  1              1              1              #FFFFFF               #0000FF
>               #0000FF                #FFFFFF               0              0
>                                              1              1
>  1                              null         20           11/04/2010
> 6:25:23.956000 PM          ei103215              01/18/2013 9:11:18.370000
> AM  DP_CQ4540        11/29/2010 3:45:03.976000 PM                ei009724
>            1              null                         null
>         0              61253    61124    61124    61253
> >
> > >
> >
> > >
> >
> > >
> >
> > > But it raised an exception for the below values:
> >
> > >
> >
> > > hive> select * from bttn_bkp_testing
> >
> > >
> >
> > >         > where bttn_id= 194628.0;
> >
> > >
> >
> > >
> >
> > >
> >
> > > 194628.0        577019.0        8910.0  19.0    1.0     1.0     1.0
>   0.0     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0
>  NULL    NULL    NULL   1.0      NULL    null    20.0    2012-04-19
> 23:25:48.78  ei009724        2013-01-18 09:11:30.245 DP_CQ4540       null
>  null    0.0     BLUEBERRY MUFFIN        7836.0 null     NULL    0.0
> 61259.0 61230.0 61230.0 61259.0
> >
> > >
> >
> > > 194628.0        706360.0        8910.0  19.0    1.0     1.0     1.0
>   0.0     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0
>  NULL    NULL    NULL   1.0      NULL    null    20.0    2012-05-21
> 01:01:53.629 ei103215        2013-01-18 09:11:30.245 DP_CQ4540       null
>  null    0.0     BLUEBERRY MUFFIN        7836.0 null     NULL    0.0
> 61259.0 61230.0 61230.0 61259.0
> >
> > >
> >
> > > 194628.0        1620395.0       8910.0  19.0    1.0     1.0     1.0
>   0.0     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0
>  NULL    NULL    NULL   1.0      NULL    null    20.0    2012-08-10
> 04:34:00.203 ei103215        2013-01-18 09:11:30.245 DP_CQ4540       null
>  null    0.0     BLUEBERRY MUFFIN        7836.0 null     NULL    0.0
> 61259.0 61230.0 61230.0 61259.0
> >
> > >
> >
> > > 194628.0        1694103.0       8910.0  19.0    1.0     1.0     1.0
>   0.0     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0
>  NULL    NULL    NULL   1.0      NULL    null    20.0    2012-11-08
> 01:09:15.136 ei103215        2013-01-18 09:11:30.245 DP_CQ4540       null
>  null    0.0     BLUEBERRY MUFFIN        7836.0 null     NULL    0.0
> 61259.0 61230.0 61230.0 61259.0
> >
> > >
> >
> > > 194628.0        1831767.0       8910.0  19.0    1.0     1.0     1.0
>   0.0     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0
>  NULL    NULL    NULL   1.0      NULL    null    20.0    2012-12-19
> 23:44:44.241 e0025129        2013-01-18 09:11:30.245 DP_CQ4540       null
>  null    0.0     BLUEBERRY MUFFIN        7836.0 null     NULL    0.0
> 61259.0 61230.0 61230.0 61259.0
> >
> > >
> >
> > >
> >
> > >
> >
> > > 2.       For your information, I have mentioned two interesting facts
> here for you regarding Sqoop behavior. First, I imported the Bttn table
> from Oracle into the Hive table bttn_bkp_test_new using the following command
> >
> > >
> >
> > > [hadoop@NHCLT-PC44-2 ~]$ sqoop import --connect
> >
> > > jdbc:oracle:thin:@10.99.42.11:1521/clouddb --username HDFSUSER
> >
> > > --table BTTN --verbose -P --hive-table bttn_bkp_test_new
> >
> > > --create-hive-table --hive-import --hive-drop-import-delims
> >
> > > --hive-home /home/hadoop/user/hive/warehouse
> >
> > >
> >
> > > The above command imports all the rows into the Hive table
> bttn_bkp_test_new, and Sqoop wrote some values as null and some as NULL.
> >
> > >
> >
> > >
> >
> > >
> >
> > > Now I have created a new table Bttn_bkp_test in the Oracle database and
> >
> > > tried to export the above-created Hive table bttn_bkp_test_new into the
> >
> > > Oracle table Bttn_bkp_test:
> >
> > >
> >
> > > [hadoop@NHCLT-PC44-2 sqoop-oper]$ sqoop export --connect
> >
> > > jdbc:oracle:thin:@10.99.42.11:1521/clouddb --username HDFSUSER
> >
> > > --table BTTN_BKP_TEST --export-dir
> >
> > > /home/hadoop/user/hive/warehouse/bttn_bkp_test_new -P -m 1
> >
> > > --input-fields-terminated-by '\0001' -verbose
> >
> > >
> >
> > >
> >
> > >
> >
> > > Now all data is inserted properly into BTTN_BKP_TEST, and the NULL
> values in the Hive table are inserted as NULL values, not as the "null"
> string (all data is identical to the old Bttn table data in Oracle).
> >
> > >
> >
> > >
> >
> > >
> >
> > > Now I have created a new table in Hive with the following commands:
> >
> > > hive> create table bttn_bkp_testing like bttn_bkp_test_new;
> >
> > > hive> insert OVERWRITE table bttn_bkp_testing
> >
> > >     > select * from bttn_bkp_test_new;
> >
> > >
> >
> > >
> >
> > >
> >
> > > Now I am putting two scenarios to you:
> >
> > >
> >
> > >
> >
> > >
> >
> > > a.       Now I have truncated the bttn_bkp_test table in Oracle and
> tried to repopulate it from the new Hive table bttn_bkp_testing, which was
> just created from bttn_bkp_test_new, with the following command:
> >
> > > [hadoop@NHCLT-PC44-2 sqoop-oper]$ sqoop export --connect
> >
> > > jdbc:oracle:thin:@10.99.42.11:1521/clouddb --username HDFSUSER
> >
> > > --table BTTN_BKP_TEST --export-dir
> >
> > > /home/hadoop/user/hive/warehouse/bttn_bkp_testing -P -m 1
> >
> > > --input-fields-terminated-by '\0001' -verbose --update-key
> >
> > > BTTN_ID,DATA_INST_ID,SCR_ID --update-mode allowinsert
> >
> > >
> >
> > >
> >
> > >
> >
> > > And I got the below error:
> >
> > >
> >
> > > 13/03/20 12:13:39 DEBUG mapreduce.ExportInputFormat:
> Paths:/home/hadoop/user/hive/warehouse/bttn_bkp_testing/000000_0:0+67108864,/home/hadoop/user/hive/warehouse/bttn_bkp_testing/000000_0:67108864+67108864,/home/hadoop/user/hive/warehouse/bttn_bkp_testing/000000_0:134217728+65312499
> Locations:NHCLT-PC44-2.hclt.corp.hcl.in:;
> >
> > >
> >
> > > 13/03/20 12:13:39 INFO mapred.JobClient: Running job:
> >
> > > job_201303191912_0005
> >
> > >
> >
> > > 13/03/20 12:13:40 INFO mapred.JobClient:  map 0% reduce 0%
> >
> > >
> >
> > > 13/03/20 12:13:52 INFO mapred.JobClient: Task Id :
> >
> > > attempt_201303191912_0005_m_000000_0, Status : FAILED
> >
> > >
> >
> > > java.io.IOException: Can't export data, please check task tracker logs
> >
> > >
> >
> > >         at
> >
> > > org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:
> >
> > > 112)
> >
> > >
> >
> > >         at
> >
> > > org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:
> >
> > > 39)
> >
> > >
> >
> > >         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> >
> > >
> >
> > >         at
> >
> > > org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.j
> >
> > > ava:64)
> >
> > >
> >
> > >         at
> >
> > > org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> >
> > >
> >
> > >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> >
> > >
> >
> > >         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> >
> > >
> >
> > >         at java.security.AccessController.doPrivileged(Native Method)
> >
> > >
> >
> > >         at javax.security.auth.Subject.doAs(Subject.java:396)
> >
> > >
> >
> > >         at
> >
> > > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformat
> >
> > > ion.java:1121)
> >
> > >
> >
> > >         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> >
> > >
> >
> > > Caused by: java.lang.NumberFormatException
> >
> > >
> >
> > >         at java.math.BigDecimal.<init>(BigDecimal.java:459)
> >
> > >
> >
> > >         at java.math.BigDecimal.<init>(BigDecimal.java:728)
> >
> > >
> >
> > >         at BTTN_BKP_TEST.__loadFromFields(BTTN_BKP_TEST.java:1314)
> >
> > >
> >
> > >         at BTTN_BKP_TEST.parse(BTTN_BKP_TEST.java:1191)
> >
> > >
> >
> > >         at
> >
> > > org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:
> >
> > > 83)
> >
> > >
> >
> > >         ... 10 more
> >
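The NumberFormatException above comes from the generated record class trying to build a BigDecimal from the literal token "null" left in a numeric field. A rough Python analogue of that failure (the helper is hypothetical, not Sqoop's generated code):

```python
from decimal import Decimal, InvalidOperation

# Rough analogue of the failing call in BTTN_BKP_TEST.__loadFromFields:
# the export mapper builds a numeric value from each field, and the
# literal token "null" is not a number.
def to_decimal(field):
    try:
        return Decimal(field)
    except InvalidOperation:
        raise ValueError(f"not a number: {field!r}")

print(to_decimal("61253.0"))   # a real numeric field parses fine
try:
    to_decimal("null")         # the token left behind by the Hive CTAS
except ValueError as e:
    print("export would fail:", e)
```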
> > >
> >
> > >
> >
> > >
> >
> > > I am attaching the mapper log here (syslog_for_first_export).
> >
> > >
> >
> > > In this mapper log I can see that the input file value is null.
> >
> > >
> >
> > >
> >
> > >
> >
> > > Any idea why it behaves like this?
> >
> > >
> >
> > >
> >
> > >
> >
> > > b.      The 2nd scenario for you:
> >
> > >
> >
> > > Now I have truncated the bttn_bkp_test table in Oracle and tried to
> >
> > > repopulate it from the new Hive table bttn_bkp_testing, which was
> >
> > > just created from bttn_bkp_test_new, with the following command:
> >
> > >
> >
> > > [hadoop@NHCLT-PC44-2 sqoop-oper]$ sqoop export --connect
> jdbc:oracle:thin:@10.99.42.11:1521/clouddb --username HDFSUSER  --table
> BTTN_BKP_TEST --export-dir
>  /home/hadoop/user/hive/warehouse/bttn_bkp_testing -P -m 1
>  --input-fields-terminated-by '\0001' -verbose --update-key
> BTTN_ID,DATA_INST_ID,SCR_ID --update-mode allowinsert --input-null-string
> '\\N' --input-null-non-string '\\N'
> >
> > >
> >
> > >
> >
> > >
> >
> > > And I got the below error:
> >
> > >
> >
> > > 13/03/20 12:41:58 DEBUG mapreduce.ExportInputFormat:
> Paths:/home/hadoop/user/hive/warehouse/bttn_bkp_testing/000000_0:0+67108864,/home/hadoop/user/hive/warehouse/bttn_bkp_testing/000000_0:67108864+67108864,/home/hadoop/user/hive/warehouse/bttn_bkp_testing/000000_0:134217728+65312499
> Locations:NHCLT-PC44-2.hclt.corp.hcl.in:;
> >
> > >
> >
> > > 13/03/20 12:41:58 INFO mapred.JobClient: Running job:
> >
> > > job_201303191912_0007
> >
> > >
> >
> > > 13/03/20 12:41:59 INFO mapred.JobClient:  map 0% reduce 0%
> >
> > >
> >
> > > 13/03/20 12:42:15 INFO mapred.JobClient:  map 6% reduce 0%
> >
> > >
> >
> > > 13/03/20 12:42:18 INFO mapred.JobClient:  map 11% reduce 0%
> >
> > >
> >
> > > 13/03/20 12:42:21 INFO mapred.JobClient:  map 17% reduce 0%
> >
> > >
> >
> > > 13/03/20 12:42:24 INFO mapred.JobClient:  map 22% reduce 0%
> >
> > >
> >
> > > 13/03/20 12:42:27 INFO mapred.JobClient:  map 27% reduce 0%
> >
> > >
> >
> > > 13/03/20 12:42:30 INFO mapred.JobClient:  map 33% reduce 0%
> >
> > >
> >
> > > 13/03/20 12:42:33 INFO mapred.JobClient:  map 35% reduce 0%
> >
> > >
> >
> > > 13/03/20 12:42:36 INFO mapred.JobClient:  map 39% reduce 0%
> >
> > >
> >
> > > 13/03/20 12:42:39 INFO mapred.JobClient:  map 44% reduce 0%
> >
> > >
> >
> > > 13/03/20 12:42:42 INFO mapred.JobClient:  map 46% reduce 0%
> >
> > >
> >
> > > 13/03/20 12:42:45 INFO mapred.JobClient:  map 51% reduce 0%
> >
> > >
> >
> > > 13/03/20 12:42:48 INFO mapred.JobClient:  map 56% reduce 0%
> >
> > >
> >
> > > 13/03/20 12:42:51 INFO mapred.JobClient:  map 62% reduce 0%
> >
> > >
> >
> > > 13/03/20 12:42:54 INFO mapred.JobClient:  map 65% reduce 0%
> >
> > >
> >
> > > 13/03/20 12:42:59 INFO mapred.JobClient: Task Id :
> >
> > > attempt_201303191912_0007_m_000000_0, Status : FAILED
> >
> > >
> >
> > > java.io.IOException: Can't export data, please check task tracker logs
> >
> > >
> >
> > >         at
> >
> > > org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:
> >
> > > 112)
> >
> > >
> >
> > >         at
> >
> > > org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:
> >
> > > 39)
> >
> > >
> >
> > >         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> >
> > >
> >
> > >         at
> >
> > > org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.j
> >
> > > ava:64)
> >
> > >
> >
> > >         at
> >
> > > org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> >
> > >
> >
> > >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> >
> > >
> >
> > >         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> >
> > >
> >
> > >         at java.security.AccessController.doPrivileged(Native Method)
> >
> > >
> >
> > >         at javax.security.auth.Subject.doAs(Subject.java:396)
> >
> > >
> >
> > >         at
> >
> > > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformat
> >
> > > ion.java:1121)
> >
> > >
> >
> > >         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> >
> > >
> >
> > > Caused by: java.lang.IllegalArgumentException: Timestamp format must
> >
> > > be yyyy-mm-dd hh:mm:ss[.fffffffff]
> >
> > >
> >
> > >         at java.sql.Timestamp.valueOf(Timestamp.java:185)
> >
> > >
> >
> > >         at BTTN_BKP_TEST.__loadFromFields(BTTN_BKP_TEST.java:1374)
> >
> > >
> >
> > >         at BTTN_BKP_TEST.parse(BTTN_BKP_TEST.java:1191)
> >
> > >
> >
> > >         at
> >
> > > org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:
> >
> > > 83)
> >
> > >
> >
> > >         ... 10 more
> >
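The Timestamp failure can be reproduced outside Sqoop. Below is a rough Python analogue of java.sql.Timestamp.valueOf's behaviour on the literal token "null" in the DEL_TS column; the function is a sketch under that assumption, not the actual Sqoop code path:

```python
from datetime import datetime

# Rough analogue of java.sql.Timestamp.valueOf, which accepts only
# "yyyy-mm-dd hh:mm:ss[.fffffffff]". Sample values are from the thread.
def parse_timestamp(value):
    for fmt in ("%Y-%m-%d %H:%M:%S.%f", "%Y-%m-%d %H:%M:%S"):
        try:
            return datetime.strptime(value, fmt)
        except ValueError:
            continue
    raise ValueError("Timestamp format must be "
                     f"yyyy-mm-dd hh:mm:ss[.fffffffff], got {value!r}")

print(parse_timestamp("2013-01-18 09:11:18.37"))  # a valid timestamp field
try:
    parse_timestamp("null")   # the literal token sitting in DEL_TS
except ValueError as e:
    print("export fails:", e)
```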
> > >
> >
> > >
> >
> > >
> >
> > > I am attaching the mapper log here (syslog_for_2nd_export).
> >
> > >
> >
> > > In this mapper log I can see the failing input file value. On input file:
> /home/hadoop/user/hive/warehouse/bttn_bkp_testing/000000.
> >
> > >
> >
> > >
> >
> > >
> >
> > > Here I can see that the NULLs in the Hive table bttn_bkp_testing were
> inserted as the "null" string in the Oracle table BTTN_BKP_TEST.
> >
> > >
> >
> > >
> >
> > >
> >
> > > hive> select * from bttn_bkp_testing
> >
> > >
> >
> > >               > where bttn_id=39126;
> >
> > >
> >
> > > 39126.0 32436.0 3276.0  3.0     28.0    1.0     1.0     1.0     1.0
>   #FFFFFF #0000FF #0000FF #FFFFFF 0.0     0.0     NULL    NULL    1.0
> 1.0     1.0     NULL   null     20.0    2010-05-04 14:31:17.0   dbmigration
>     2013-01-18 09:11:18.37  DP_CQ4540       2010-11-29 15:45:03.976
> ei009724        1.0     null    NULL    null   NULL     0.0     61253.0
> 61124.0 61124.0 61253.0
> >
> > >
> >
> > > 39126.0 50805.0 3276.0  3.0     28.0    1.0     1.0     1.0     1.0
>   #FFFFFF #0000FF #0000FF #FFFFFF 0.0     0.0     NULL    NULL    1.0
> 1.0     1.0     NULL   null     20.0    2010-05-23 23:18:54.604 ei103215
>      2013-01-18 09:11:18.37  DP_CQ4540       2010-11-29 15:45:03.976
> ei009724        1.0     null    NULL    null   NULL     0.0     61253.0
> 61124.0 61124.0 61253.0
> >
> > >
> >
> > > 39126.0 63196.0 3276.0  3.0     28.0    1.0     1.0     1.0     1.0
>   #FFFFFF #0000FF #0000FF #FFFFFF 0.0     0.0     NULL    NULL    1.0
> 1.0     1.0     NULL   null     20.0    2010-11-04 18:25:23.956 ei103215
>      2013-01-18 09:11:18.37  DP_CQ4540       2010-11-29 15:45:03.976
> ei009724        1.0     null    NULL    null   NULL     0.0     61253.0
> 61124.0 61124.0 61253.0
> >
> > >
> >
> > >
> >
> > >
> >
> > > These values are inserted into the oracle table BTTN_BKP_TEST as
> follows:
> >
> > >
> >
> > > SQL> Select * from BTTN_BKP_TEST where bttn_id=39126;
> >
> > >
> >
> > >
> >
> > >
> >
> > > 39126            32436    3276       3              28           1
>          1              1              1              #FFFFFF
> #0000FF        #0000FF               #FFFFFF               0              0
>                                              1              1
>  1                              null         20        05/04/2010
> 2:31:17.000000 PM   dbmigration       01/18/2013 9:11:18.370000 AM
>  DP_CQ4540        11/29/2010 3:45:03.976000 PM  ei009724              1
>          null                         null                         0
>        61253    61124    61124        61253
> >
> > >
> >
> > > 39126            50805    3276       3              28           1
>          1              1              1              #FFFFFF
> #0000FF        #0000FF               #FFFFFF               0              0
>                                              1              1
>  1                              null         20        05/23/2010
> 11:18:54.604000 PM                ei103215              01/18/2013
> 9:11:18.370000 AM  DP_CQ4540        11/29/2010 3:45:03.976000 PM   ei009724
>              1              null                         null
>           0              61253        61124    61124    61253
> >
> > >
> >
> > > 39126            63196    3276       3              28           1
>          1              1              1              #FFFFFF
> #0000FF        #0000FF               #FFFFFF               0              0
>                                              1              1
>  1                              null         20        11/04/2010
> 6:25:23.956000 PM   ei103215              01/18/2013 9:11:18.370000 AM
>  DP_CQ4540        11/29/2010 3:45:03.976000 PM  ei009724              1
>          null                         null                         0
>        61253    61124    61124        61253
> >
> > >
> >
> > >
> >
> > >
> >
> > >
> >
> > >
> >
> > > Looking for your valuable suggestion for the above facts.
> >
> > >
> >
> > > Is this a bug in SQOOP?
> >
> > >
> >
> > >
> >
> > >
> >
> > > Regards,
> >
> > >
> >
> > > Ajit
> >
> > >
> >
> > >
> >
> > >
> >
> > >
> >
> > >
> >
> > > -----Original Message-----
> >
> > > From: Jarek Jarcec Cecho [mailto:jarcec@apache.org]
> >
> > > Sent: Wednesday, March 20, 2013 2:56 AM
> >
> > > To: user@sqoop.apache.org
> >
> > > Subject: Re: Exporting hive table data into oracle give date format
> >
> > > error
> >
> > >
> >
> > >
> >
> > >
> >
> > > Hi Ajit,
> >
> > >
> >
> > > thank you for sharing the additional data. I've noticed in your data
> that some of the columns are using \N to denote the NULL value, while
> some other columns are using the string constant "null" (which does not
> denote NULL in Hive). This also seems to be the case for the column
> DEL_TS. My guess is that Sqoop is trying to decode the "null" string as a
> timestamp and failing with the "Timestamp format must be..." exception. I
> would recommend unifying the null representation tokens and running the
> Sqoop export with the appropriate one.
> >
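One way to spot the inconsistency Jarcec describes is to scan a row for every NULL-representation token it contains. A small illustrative Python check follows; the helper and the sample row (taken from the data dumps in this thread) are for demonstration only:

```python
# If one row mixes "\N" with the literal strings "null"/"NULL", the
# import and export null-substitution settings disagree.
def null_tokens(line, delimiter="\t"):
    """Return the set of NULL-representation tokens found in one row."""
    return {f for f in line.split(delimiter) if f in ("null", "NULL", "\\N")}

row = "39126.0\tNULL\tnull\t\\N\t20.0\tdbmigration"
tokens = null_tokens(row)
print(sorted(tokens))   # more than one token means inconsistent encoding
```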
> > >
> >
> > >
> >
> > >
> >
> > > Jarcec
> >
> > >
> >
> > >
> >
> > >
> >
> > > On Tue, Mar 19, 2013 at 08:13:01AM +0000, Ajit Kumar Shreevastava
> wrote:
> >
> > >
> >
> > > > Hi Jercec,
> >
> > >
> >
> > > >
> >
> > >
> >
> > > >
> >
> > >
> >
> > > >
> >
> > >
> >
> > > > Thank you for your valuable suggestions.
> >
> > >
> >
> > > >
> >
> > >
> >
> > > >
> >
> > >
> >
> > > >
> >
> > >
> >
> > > > I have applied the below suggestion and redone the whole process
> with Sqoop 1.4.3 (sqoop-1.4.3.bin__hadoop-1.0.0.tar.gz), but I have faced
> the same error again. Please advise.
> >
> > >
> >
> > > >
> >
> > >
> >
> > > >
> >
> > >
> >
> > > >
> >
> > >
> >
> > > > Here I have created the table in Hive as suggested by you.
> >
> > >
> >
> > > >
> >
> > >
> >
> > > >
> >
> > >
> >
> > > >
> >
> > >
> >
> > > > hive> create table bttn_bkp_testing like bttn_bkp_test;
> >
> > >
> >
> > > >
> >
> > >
> >
> > > > hive> insert OVERWRITE table bttn_bkp_testing
> >
> > >
> >
> > > >
> >
> > >
> >
> > > >         > select * from bttn_bkp_test;
> >
> > >
> >
> > > >
> >
> > >
> >
> > > >
> >
> > >
> >
> > > >
> >
> > >
> >
> > > > I am also attaching the error file generated by task tracker for
> your analysis.
> >
> > >
> >
> > > >
> >
> > >
> >
> > > > It fails for bttn_id = 194628
> >
> > >
> >
> > > >
> >
> > >
> >
> > > >
> >
> > >
> >
> > > >
> >
> > >
> >
> > > > I have queried both tables and the records are as follows:
> >
> > >
> >
> > > >
> >
> > >
> >
> > > >
> >
> > >
> >
> > > >
> >
> > >
> >
> > > > hive> select * from bttn_bkp_testing
> >
> > >
> >
> > > >
> >
> > >
> >
> > > >     > where bttn_id=194628;
> >
> > >
> >
> > > >
> >
> > >
> >
> > > >
> >
> > >
> >
> > > >
> >
> > >
> >
> > > > 194628.0        577019.0        8910.0  19.0    1.0     1.0     1.0
>     0.0     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0
>  NULL    NULL    NULL   1.0      NULL    null    20.0    2012-04-19
> 23:25:48.78  ei009724        2013-01-18 09:11:30.245 DP_CQ4540       null
>  null    0.0     BLUEBERRY MUFFIN        7836.0 null     NULL    0.0
> 61259.0 61230.0 61230.0 61259.0
> >
> > >
> >
> > > >
> >
> > >
> >
> > > > 194628.0        706360.0        8910.0  19.0    1.0     1.0     1.0
>     0.0     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0
>  NULL    NULL    NULL   1.0      NULL    null    20.0    2012-05-21
> 01:01:53.629 ei103215        2013-01-18 09:11:30.245 DP_CQ4540       null
>  null    0.0     BLUEBERRY MUFFIN        7836.0 null     NULL    0.0
> 61259.0 61230.0 61230.0 61259.0
> >
> > >
> >
> > > >
> >
> > >
> >
> > > > 194628.0        1620395.0       8910.0  19.0    1.0     1.0     1.0
>     0.0     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0
>  NULL    NULL    NULL   1.0      NULL    null    20.0    2012-08-10
> 04:34:00.203 ei103215        2013-01-18 09:11:30.245 DP_CQ4540       null
>  null    0.0     BLUEBERRY MUFFIN        7836.0 null     NULL    0.0
> 61259.0 61230.0 61230.0 61259.0
> >
> > >
> >
> > > >
> >
> > >
> >
> > > > 194628.0        1694103.0       8910.0  19.0    1.0     1.0     1.0
>     0.0     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0
>  NULL    NULL    NULL   1.0      NULL    null    20.0    2012-11-08
> 01:09:15.136 ei103215        2013-01-18 09:11:30.245 DP_CQ4540       null
>  null    0.0     BLUEBERRY MUFFIN        7836.0 null     NULL    0.0
> 61259.0 61230.0 61230.0 61259.0
> >
> > >
> >
> > > >
> >
> > >
> >
> > > > 194628.0        1831767.0       8910.0  19.0    1.0     1.0     1.0
>     0.0     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0
>  NULL    NULL    NULL   1.0      NULL    null    20.0    2012-12-19
> 23:44:44.241 e0025129        2013-01-18 09:11:30.245 DP_CQ4540       null
>  null    0.0     BLUEBERRY MUFFIN        7836.0 null     NULL    0.0
> 61259.0 61230.0 61230.0 61259.0
> >
> > >
> >
> > > >
> >
> > >
> >
> > > >
> >
> > >
> >
> > > >
> >
> > >
> >
> > > > And
> >
> > >
> >
> > > >
> >
> > >
> >
> > > > hive> select * from bttn_bkp_test_new
> >
> > >
> >
> > > >
> >
> > >
> >
> > > >     > where bttn_id=194628;
> >
> > >
> >
> > > >
> >
> > >
> >
> > > >
> >
> > >
> >
> > > >
> >
> > >
> >
> > > > 194628.0        577019.0        8910.0  19.0    1.0     1.0     1.0
>     0.0     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0
>  NULL    NULL    NULL   1.0      NULL    null    20.0    2012-04-19
> 23:25:48.78  ei009724        2013-01-18 09:11:30.245 DP_CQ4540       null
>  null    0.0     BLUEBERRY MUFFIN        7836.0 null     NULL    0.0
> 61259.0 61230.0 61230.0 61259.0
> >
> > >
> >
> > > >
> >
> > >
> >
> > > > 194628.0        706360.0        8910.0  19.0    1.0     1.0     1.0
>     0.0     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0
>  NULL    NULL    NULL   1.0      NULL    null    20.0    2012-05-21
> 01:01:53.629 ei103215        2013-01-18 09:11:30.245 DP_CQ4540       null
>  null    0.0     BLUEBERRY MUFFIN        7836.0 null     NULL    0.0
> 61259.0 61230.0 61230.0 61259.0
> >
> > >
> >
> > > >
> >
> > >
> >
> > > > 194628.0        1620395.0       8910.0  19.0    1.0     1.0     1.0
>     0.0     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0
>  NULL    NULL    NULL   1.0      NULL    null    20.0    2012-08-10
> 04:34:00.203 ei103215        2013-01-18 09:11:30.245 DP_CQ4540       null
>  null    0.0     BLUEBERRY MUFFIN        7836.0 null     NULL    0.0
> 61259.0 61230.0 61230.0 61259.0
> 194628.0        1694103.0       8910.0  19.0    1.0     1.0     1.0     0.0     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0  NULL    NULL    NULL    1.0     NULL    null    20.0    2012-11-08 01:09:15.136 ei103215        2013-01-18 09:11:30.245 DP_CQ4540       null    null    0.0     BLUEBERRY MUFFIN        7836.0  null    NULL    0.0     61259.0 61230.0 61230.0 61259.0
> 194628.0        1831767.0       8910.0  19.0    1.0     1.0     1.0     0.0     0.0     #FFFFFF #FF0000 #FF0000 #FFFFFF 0.0     0.0     1646.0  NULL    NULL    NULL    1.0     NULL    null    20.0    2012-12-19 23:44:44.241 e0025129        2013-01-18 09:11:30.245 DP_CQ4540       null    null    0.0     BLUEBERRY MUFFIN        7836.0  null    NULL    0.0     61259.0 61230.0 61230.0 61259.0
>
> Regards,
> Ajit Kumar Shreevastava
>
> -----Original Message-----
> From: Jarek Jarcec Cecho [mailto:jarcec@apache.org]
> Sent: Sunday, March 17, 2013 4:29 AM
> To: user@sqoop.apache.org
> Subject: Re: Exporting hive table data into oracle give date format error
>
> [-CC hive@user.apache.org]
>
> Hi Ajit,
> would you mind upgrading to Sqoop 1.4.3? We've improved the logging for this particular exception, so it should significantly help in triangulating your issue.
>
> Jarcec
>
> On Wed, Mar 13, 2013 at 01:43:11PM +0000, Ajit Kumar Shreevastava wrote:
> > Hi All,
> >
> > Can you please let me know how I can bypass this error? I am currently using Apache Sqoop version 1.4.2.
> >
> > [hadoop@NHCLT-PC44-2 sqoop-oper]$ sqoop export --connect jdbc:oracle:thin:@10.99.42.11:1521/clouddb --username HDFSUSER --table BTTN_BKP_TEST --export-dir /home/hadoop/user/hive/warehouse/bttn_bkp -P -m 1 --input-fields-terminated-by '\0001' --verbose --input-null-string '\\N' --input-null-non-string '\\N'
> >
> > Please set $HBASE_HOME to the root of your HBase installation.
> > 13/03/13 18:20:42 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> > Enter password:
> > 13/03/13 18:20:47 DEBUG sqoop.ConnFactory: Loaded manager factory: com.cloudera.sqoop.manager.DefaultManagerFactory
> > 13/03/13 18:20:47 DEBUG sqoop.ConnFactory: Trying ManagerFactory: com.cloudera.sqoop.manager.DefaultManagerFactory
> > 13/03/13 18:20:47 DEBUG manager.DefaultManagerFactory: Trying with scheme: jdbc:oracle:thin:@10.99.42.11
> > 13/03/13 18:20:47 DEBUG manager.OracleManager$ConnCache: Instantiated new connection cache.
> > 13/03/13 18:20:47 INFO manager.SqlManager: Using default fetchSize of 1000
> > 13/03/13 18:20:47 DEBUG sqoop.ConnFactory: Instantiated ConnManager org.apache.sqoop.manager.OracleManager@74b23210
> > 13/03/13 18:20:47 INFO tool.CodeGenTool: Beginning code generation
> > 13/03/13 18:20:47 DEBUG manager.OracleManager: Using column names query: SELECT t.* FROM BTTN_BKP_TEST t WHERE 1=0
> > 13/03/13 18:20:47 DEBUG manager.OracleManager: Creating a new connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb, using username: HDFSUSER
> > 13/03/13 18:20:47 DEBUG manager.OracleManager: No connection paramenters specified. Using regular API for making connection.
> > 13/03/13 18:20:47 INFO manager.OracleManager: Time zone has been set to GMT
> > 13/03/13 18:20:47 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
> > 13/03/13 18:20:47 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM BTTN_BKP_TEST t WHERE 1=0
> > 13/03/13 18:20:47 DEBUG manager.OracleManager$ConnCache: Caching released connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter: selected columns:
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   BTTN_ID
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   DATA_INST_ID
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   SCR_ID
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   BTTN_NU
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   CAT
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   WDTH
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   HGHT
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   KEY_SCAN
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   KEY_SHFT
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_PRSD
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_PRSD
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   BLM_FL
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   LCLZ_FL
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   MENU_ITEM_NU
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   BTTN_ASGN_LVL_ID
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   ON_ATVT
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   ON_CLIK
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   ENBL_FL
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   BLM_SET_ID
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   BTTN_ASGN_LVL_NAME
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   MKT_ID
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   CRTE_TS
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   CRTE_USER_ID
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   UPDT_TS
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   UPDT_USER_ID
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   DEL_TS
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   DEL_USER_ID
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   DLTD_FL
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   MENU_ITEM_NA
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   PRD_CD
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   BLM_SET_NA
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   SOUND_FILE_ID
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   IS_DYNMC_BTTN
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_ID
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_PRSD_ID
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_ID
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_PRSD_ID
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter: Writing source file: /tmp/sqoop-hadoop/compile/69b6a9d2ebb99cebced808e559528531/BTTN_BKP_TEST.java
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter: Table name: BTTN_BKP_TEST
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter: Columns: BTTN_ID:2, DATA_INST_ID:2, SCR_ID:2, BTTN_NU:2, CAT:2, WDTH:2, HGHT:2, KEY_SCAN:2, KEY_SHFT:2, FRGND_CPTN_COLR:12, FRGND_CPTN_COLR_PRSD:12, BKGD_CPTN_COLR:12, BKGD_CPTN_COLR_PRSD:12, BLM_FL:2, LCLZ_FL:2, MENU_ITEM_NU:2, BTTN_ASGN_LVL_ID:2, ON_ATVT:2, ON_CLIK:2, ENBL_FL:2, BLM_SET_ID:2, BTTN_ASGN_LVL_NAME:12, MKT_ID:2, CRTE_TS:93, CRTE_USER_ID:12, UPDT_TS:93, UPDT_USER_ID:12, DEL_TS:93, DEL_USER_ID:12, DLTD_FL:2, MENU_ITEM_NA:12, PRD_CD:2, BLM_SET_NA:12, SOUND_FILE_ID:2, IS_DYNMC_BTTN:2, FRGND_CPTN_COLR_ID:2, FRGND_CPTN_COLR_PRSD_ID:2, BKGD_CPTN_COLR_ID:2, BKGD_CPTN_COLR_PRSD_ID:2,
> > 13/03/13 18:20:47 DEBUG orm.ClassWriter: sourceFilename is BTTN_BKP_TEST.java
> > 13/03/13 18:20:47 DEBUG orm.CompilationManager: Found existing /tmp/sqoop-hadoop/compile/69b6a9d2ebb99cebced808e559528531/
> > 13/03/13 18:20:47 INFO orm.CompilationManager: HADOOP_HOME is /home/hadoop/hadoop-1.0.3/libexec/..
> > 13/03/13 18:20:47 DEBUG orm.CompilationManager: Adding source file: /tmp/sqoop-hadoop/compile/69b6a9d2ebb99cebced808e559528531/BTTN_BKP_TEST.java
> > 13/03/13 18:20:47 DEBUG orm.CompilationManager: Invoking javac with args:
> > 13/03/13 18:20:47 DEBUG orm.CompilationManager:   -sourcepath
> > 13/03/13 18:20:47 DEBUG orm.CompilationManager: /tmp/sqoop-hadoop/compile/69b6a9d2ebb99cebced808e559528531/
> > 13/03/13 18:20:47 DEBUG orm.CompilationManager:   -d
> > 13/03/13 18:20:47 DEBUG orm.CompilationManager: /tmp/sqoop-hadoop/compile/69b6a9d2ebb99cebced808e559528531/
> > 13/03/13 18:20:47 DEBUG orm.CompilationManager:   -classpath
> > 13/03/13 18:20:47 DEBUG orm.CompilationManager:
> /home/hadoop/hadoop-1.0.3/libexec/../conf:/usr/java/jdk1.6.0_32/lib/tools.jar:/home/hadoop/hadoop-1.0.3/libexec/..:/home/hadoop/hadoop-1.0.3/libexec/../hadoop-core-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/asm-3.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/aspectjrt-1.6.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/aspectjtools-1.6.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-beanutils-1.7.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-beanutils-core-1.8.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-cli-1.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-codec-1.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-collections-3.2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-configuration-1.6.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-daemon-1.0.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-digester-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-el-1.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-httpclient-3.0.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-io-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-lang-2.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-logging-1.1.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-logging-api-1.0.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-math-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-net-1.4.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/core-3.1.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hsqldb-1.8.0.10.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jackson-core-asl-1.8.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jasper-compiler-5.5.12.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jasper-run
time-5.5.12.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jdeb-0.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-core-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-json-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-server-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jets3t-0.6.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jetty-6.1.26.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jetty-util-6.1.26.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsch-0.1.42.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/junit-4.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/kfs-0.2.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/log4j-1.2.15.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/mockito-all-1.8.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/oro-2.0.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/servlet-api-2.5-20081211.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/slf4j-api-1.4.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/slf4j-log4j12-1.4.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/xmlenc-0.52.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsp-2.1/jsp-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/home/hadoop/sqoop/conf::/home/hadoop/sqoop/lib/ant-contrib-1.0b3.jar:/home/hadoop/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/home/hadoop/sqoop/lib/avro-1.5.3.jar:/home/hadoop/sqoop/lib/avro-ipc-1.5.3.jar:/home/hadoop/sqoop/lib/avro-mapred-1.5.3.jar:/home/hadoop/sqoop/lib/commons-io-1.4.jar:/home/hadoop/sqoop/lib/hsqldb-1.8.0.10.jar:/home/hadoop/sqoop/lib/jackson-core-asl-1.7.3.jar:/home/hadoop/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/home/hadoop/sqoop/lib/jopt-simple-3.2.jar:/home/hadoop/sqoop/lib/ojdbc6.jar:/home/hadoop/sqoop/lib/paranamer-2.3.jar:/home/hadoop/sqoop/lib/snappy-java-1.0.3.2.jar:/home/hadoop/sqoop/sqoop-1.4.2.jar:/home/hadoop/sqoop/sqoop-test-1.4.2.jar::/home/hadoop/hadoop-1.0.3/hadoop-core-1.0.3.jar:/home/hadoop/sqoop/sqoop-1.4.2.jar
> > Note: /tmp/sqoop-hadoop/compile/69b6a9d2ebb99cebced808e559528531/BTTN_BKP_TEST.java uses or overrides a deprecated API.
> > Note: Recompile with -Xlint:deprecation for details.
> > 13/03/13 18:20:48 DEBUG orm.CompilationManager: Could not rename /tmp/sqoop-hadoop/compile/69b6a9d2ebb99cebced808e559528531/BTTN_BKP_TEST.java to /home/hadoop/sqoop-oper/./BTTN_BKP_TEST.java
> > org.apache.commons.io.FileExistsException: Destination '/home/hadoop/sqoop-oper/./BTTN_BKP_TEST.java' already exists
> >         at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
> >         at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:227)
> >         at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:83)
> >         at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:64)
> >         at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:97)
> >         at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
> >         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> >         at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
> >         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
> >         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
> >         at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
> >         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
> > 13/03/13 18:20:48 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/69b6a9d2ebb99cebced808e559528531/BTTN_BKP_TEST.jar
> > 13/03/13 18:20:48 DEBUG orm.CompilationManager: Scanning for .class files in directory: /tmp/sqoop-hadoop/compile/69b6a9d2ebb99cebced808e559528531
> > 13/03/13 18:20:48 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-hadoop/compile/69b6a9d2ebb99cebced808e559528531/BTTN_BKP_TEST.class -> BTTN_BKP_TEST.class
> > 13/03/13 18:20:48 DEBUG orm.CompilationManager: Finished writing jar file /tmp/sqoop-hadoop/compile/69b6a9d2ebb99cebced808e559528531/BTTN_BKP_TEST.jar
> > 13/03/13 18:20:48 INFO mapreduce.ExportJobBase: Beginning export of BTTN_BKP_TEST
> > 13/03/13 18:20:48 DEBUG mapreduce.JobBase: Using InputFormat: class org.apache.sqoop.mapreduce.ExportInputFormat
> > 13/03/13 18:20:49 DEBUG manager.OracleManager$ConnCache: Got cached connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER
> > 13/03/13 18:20:49 INFO manager.OracleManager: Time zone has been set to GMT
> > 13/03/13 18:20:49 DEBUG manager.OracleManager$ConnCache: Caching released connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER
> > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/sqoop-1.4.2.jar
> > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ojdbc6.jar
> > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/sqoop-1.4.2.jar
> > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/sqoop-1.4.2.jar
> > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/hsqldb-1.8.0.10.jar
> > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/avro-ipc-1.5.3.jar
> > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/jopt-simple-3.2.jar
> > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ojdbc6.jar
> > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/jackson-core-asl-1.7.3.jar
> > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ant-contrib-1.0b3.jar
> > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/snappy-java-1.0.3.2.jar
> > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/paranamer-2.3.jar
> > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/avro-1.5.3.jar
> > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/commons-io-1.4.jar
> > 13/03/13 18:20:49 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/avro-mapred-1.5.3.jar
> > 13/03/13 18:20:49 INFO input.FileInputFormat: Total input paths to process : 1
> > 13/03/13 18:20:49 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=1
> > 13/03/13 18:20:49 DEBUG mapreduce.ExportInputFormat: Total input bytes=172704981
> > 13/03/13 18:20:49 DEBUG mapreduce.ExportInputFormat: maxSplitSize=172704981
> > 13/03/13 18:20:49 INFO input.FileInputFormat: Total input paths to process : 1
> > 13/03/13 18:20:49 DEBUG mapreduce.ExportInputFormat: Generated splits:
> > 13/03/13 18:20:49 DEBUG mapreduce.ExportInputFormat: Paths:/home/hadoop/user/hive/warehouse/bttn_bkp/000000_0:0+67108864,/home/hadoop/user/hive/warehouse/bttn_bkp/000000_0:67108864+67108864,/home/hadoop/user/hive/warehouse/bttn_bkp/000000_0:134217728+38487253 Locations:NHCLT-PC44-2:;
> > 13/03/13 18:20:49 INFO mapred.JobClient: Running job: job_201303121648_0018
> > 13/03/13 18:20:50 INFO mapred.JobClient:  map 0% reduce 0%
> > 13/03/13 18:21:06 INFO mapred.JobClient:  map 8% reduce 0%
> > 13/03/13 18:21:09 INFO mapred.JobClient:  map 13% reduce 0%
> > 13/03/13 18:21:12 INFO mapred.JobClient:  map 17% reduce 0%
> > 13/03/13 18:21:15 INFO mapred.JobClient:  map 21% reduce 0%
> > 13/03/13 18:21:18 INFO mapred.JobClient:  map 26% reduce 0%
> > 13/03/13 18:21:21 INFO mapred.JobClient:  map 30% reduce 0%
> > 13/03/13 18:21:24 INFO mapred.JobClient:  map 35% reduce 0%
> > 13/03/13 18:21:27 INFO mapred.JobClient:  map 40% reduce 0%
> > 13/03/13 18:21:30 INFO mapred.JobClient:  map 45% reduce 0%
> > 13/03/13 18:21:33 INFO mapred.JobClient:  map 50% reduce 0%
> > 13/03/13 18:21:36 INFO mapred.JobClient:  map 53% reduce 0%
> > 13/03/13 18:21:39 INFO mapred.JobClient:  map 58% reduce 0%
> > 13/03/13 18:21:42 INFO mapred.JobClient:  map 62% reduce 0%
> > 13/03/13 18:21:45 INFO mapred.JobClient:  map 65% reduce 0%
> > 13/03/13 18:21:47 INFO mapred.JobClient: Task Id : attempt_201303121648_0018_m_000000_0, Status : FAILED
> > java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]
> >         at java.sql.Timestamp.valueOf(Timestamp.java:185)
> >         at BTTN_BKP_TEST.__loadFromFields(BTTN_BKP_TEST.java:1331)
> >         at BTTN_BKP_TEST.parse(BTTN_BKP_TEST.java:1148)
> >         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:77)
> >         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
> >         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> >         at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
> >         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> >         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at javax.security.auth.Subject.doAs(Subject.java:396)
> >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> >         at org.apache.hadoop.mapred.Child.main(Child.java:249)
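The task fails because a timestamp column's text is neither the declared null token `\N` nor a value in the `yyyy-mm-dd hh:mm:ss[.fffffffff]` format that `java.sql.Timestamp.valueOf` requires — most plausibly one of the literal `null` strings visible in the Hive rows earlier in the thread. One illustrative workaround (an assumption for this sketch, not a recommendation from the thread) is to normalise stray `null` tokens to the declared marker before export; the Ctrl-A delimiter mirrors `--input-fields-terminated-by '\0001'`:

```python
# Illustrative pre-processing: rewrite literal "null" tokens to the \N marker
# declared via --input-null-(non-)string, so the export job treats them as NULL.
# Assumes Ctrl-A (\x01) delimited Hive text files, per the sqoop command above.

DELIM = "\x01"

def normalise_line(line):
    fields = line.rstrip("\n").split(DELIM)
    return DELIM.join("\\N" if f == "null" else f for f in fields)

raw = DELIM.join(["194628.0", "null", "2012-04-19 23:25:48.78"])
print(normalise_line(raw).split(DELIM))
# ['194628.0', '\\N', '2012-04-19 23:25:48.78']
```

Whether to map `null` to NULL or to some real string is a data-semantics decision, which is why a consistent null convention across Sqoop and Hive (as discussed above) matters more than any one-off fix.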



-- 
Regards

Venkat
