sqoop-dev mailing list archives

From "ryan radtke (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SQOOP-2858) Sqoop export with Avro data using (--update-key <key> and --update-mode allowinsert) fails
Date Tue, 02 May 2017 16:08:06 GMT

    [ https://issues.apache.org/jira/browse/SQOOP-2858?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15993164#comment-15993164 ]

ryan radtke commented on SQOOP-2858:
------------------------------------

This bug is specifically related to Avro. I'm having a similar issue with Sqoop 1.4.6: an ORC table via HCatalog, exporting to Oracle, using --update-mode allowinsert and --update-key record_id. Everything works fine until I try to add the updates; then I get NullPointerException hell. This seems to be a sparsely documented bug for Sqoop. Anyone have any tips?

> Sqoop export with Avro data using (--update-key <key> and --update-mode allowinsert) fails
> ------------------------------------------------------------------------------------------
>
>                 Key: SQOOP-2858
>                 URL: https://issues.apache.org/jira/browse/SQOOP-2858
>             Project: Sqoop
>          Issue Type: Bug
>            Reporter: Markus Kemper
>            Assignee: Jarek Jarcec Cecho
>             Fix For: 1.4.7
>
>         Attachments: SQOOP-2858.patch, SQOOP-2858_TestCase.txt
>
>
> Summary:
> 1. sqoop export (--export-dir <avro_data>) fails with error [1]
> 2. sqoop export (--export-dir <avro_data> --update-key --update-mode allowinsert) fails with error [2]
> 3. sqoop export (--hcatalog-database --hcatalog-table <avro_table> --update-key --update-mode allowinsert) fails with error [3]
> See attachment for full test cases.
> [1]
> sqoop export --connect $MYCONN --username $MYUSER --password $MYPSWD --table T2_EXPORT --export-dir /user/root/t1_avro --num-mappers 1
> 16/02/24 13:29:51 INFO mapreduce.Job: Task Id : attempt_1456318803987_0015_m_000000_0, Status : FAILED
> Error: java.lang.ClassCastException: java.lang.Integer cannot be cast to java.math.BigDecimal
> 	at T2_EXPORT.setField(T2_EXPORT.java:288)
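Error [1] is a type mismatch between what the Avro data holds (an integer) and what the generated record class expects (java.math.BigDecimal). A minimal sketch of the pattern, in illustrative Python rather than Sqoop's generated Java; the function names here are hypothetical:

```python
from decimal import Decimal

# The generated setField() in [1] blindly casts the incoming value to
# java.math.BigDecimal; when the Avro file carries a plain integer, the
# cast fails. Sketch of that strict behavior (hypothetical names):
def set_field_strict(value):
    if not isinstance(value, Decimal):
        # Analogous to Java's ClassCastException
        raise TypeError(f"{type(value).__name__} cannot be cast to Decimal")
    return value

# --map-column-java C1_INT=Integer (as used in [2]) sidesteps the cast by
# declaring the type the data actually holds; an explicit conversion has
# a similar effect:
def set_field_converting(value):
    return Decimal(value)
```

Here set_field_strict(1) fails the way setField does in the trace above, while set_field_converting accepts either representation.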
> [2]
> sqoop export --connect $MYCONN --username $MYUSER --password $MYPSWD --table T2_EXPORT --export-dir /user/root/t1_avro --num-mappers 1 --map-column-java C1_INT=Integer --update-key C1_INT --update-mode allowinsert
> 16/02/24 13:58:29 INFO mapreduce.Job: Task Id : attempt_1456318803987_0022_m_000000_0, Status : FAILED
> Error: java.io.IOException: Can't export data, please check failed map task logs
> 	at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
> 	at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
> 	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
> 	at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
> 	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
> 	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> 	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:415)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1707)
> 	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
> Caused by: java.lang.RuntimeException: Can't parse input data: 'Objavro.schema�{"type":"record"'
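The 'Objavro.schema' fragment in that parse error is the binary header of an Avro object container file: per the Avro spec, such files begin with the 4-byte magic Obj\x01 followed by file metadata that includes the avro.schema key. In [2] the job fell back to the line-oriented TextExportMapper, which treats the file as text and so sees the header as its first record. A small sketch with simplified header bytes (not a real Avro file):

```python
# Simplified first bytes of an Avro object container file: the magic
# b"Obj\x01", then metadata containing the "avro.schema" key and the
# schema JSON. Real files length-prefix these fields; \xc3 stands in
# for a length byte that is not valid UTF-8.
header = b"Obj\x01" + b"\x02\x16avro.schema" + b"\xc3" + b'{"type":"record"}'

# Decoding the binary header as text (effectively what a text-based
# export mapper does) yields the garbled string from the error above:
# the control bytes are invisible, and the bad length byte becomes the
# U+FFFD replacement character.
text = header.decode("utf-8", errors="replace")
print(text)
```

The result starts with "Obj" fused to "avro.schema" and the schema JSON, matching the 'Objavro.schema&#65533;{"type":"record"' fragment in the log.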
> [3]
> sqoop export --connect $MYCONN --username $MYUSER --password $MYPSWD --table T2_EXPORT --hcatalog-database db1 --hcatalog-table t1_avro --num-mappers 1 --update-key C1_INT --update-mode allowinsert
> 16/02/24 13:35:06 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.NullPointerException
> java.lang.NullPointerException
> 	at org.apache.sqoop.mapreduce.ExportJobBase.getFileType(ExportJobBase.java:127)
> 	at org.apache.sqoop.mapreduce.ExportJobBase.isSequenceFiles(ExportJobBase.java:118)
> 	at org.apache.sqoop.mapreduce.ExportJobBase.inputIsSequenceFiles(ExportJobBase.java:492)
> 	at org.apache.sqoop.mapreduce.JdbcUpdateExportJob.getMapperClass(JdbcUpdateExportJob.java:69)
> 	at org.apache.sqoop.mapreduce.ExportJobBase.configureMapper(ExportJobBase.java:268)
> 	at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:426)
> 	at org.apache.sqoop.manager.OracleManager.upsertTable(OracleManager.java:467)
> 	at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:74)
> 	at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:100)
> 	at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
> 	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> 	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
> 	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
> 	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
> 	at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
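Per the trace in [3], the NullPointerException comes from ExportJobBase.getFileType inspecting the export directory, which an HCatalog export never sets; the allowinsert (upsert) path through JdbcUpdateExportJob hits that unguarded lookup. A minimal sketch of the pattern in Python, with hypothetical function names (illustrative only, not Sqoop's actual code or its actual fix):

```python
# Sketch of the failure in [3]. The upsert path decides which mapper to
# use by inspecting the export directory, but an export driven by
# --hcatalog-table sets no export directory at all.
def get_file_type(export_dir):
    # Dereferencing the missing path -- the Python analogue of the
    # NullPointerException at ExportJobBase.getFileType.
    return export_dir.rsplit(".", 1)[-1]

def get_file_type_guarded(export_dir):
    # Guarded sketch: treat a missing export dir as HCatalog-backed
    # input instead of dereferencing it.
    if export_dir is None:
        return "HCATALOG"
    return export_dir.rsplit(".", 1)[-1]
```

get_file_type(None) blows up exactly where the trace shows; the guarded variant short-circuits the HCatalog case before inspecting any path.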



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
