sqoop-dev mailing list archives

From "Ankush Gupta (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SQOOP-2192) SQOOP IMPORT/EXPORT for the ORC file HIVE TABLE Failing
Date Fri, 19 Jan 2018 06:49:00 GMT

    [ https://issues.apache.org/jira/browse/SQOOP-2192?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16331825#comment-16331825 ]

Ankush Gupta commented on SQOOP-2192:
-------------------------------------

Hi Venkat,

Hope you are doing fine.

Do we have an update on this issue?

I am trying to export a bucketed table from Hive to SQL Server and am still getting the same error:

{color:#d04437}org.apache.hive.hcatalog.common.HCatException : 2016 : Error operation not
supported : Store into a partition with bucket definition from Pig/Mapreduce is not supported{color}

Is there any workaround to transfer bucketed data from Hive to SQL Server using Sqoop?
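One workaround that is sometimes suggested (this is not an official fix, and the table names, database name, and JDBC URL below are placeholders) is to stage the data through a non-bucketed, non-transactional copy, since the HCatalog writer only rejects the bucketed table itself:

```shell
# Hypothetical staging workaround -- all names and the JDBC URL are placeholders.

# 1. Materialize a plain (non-bucketed, non-transactional) ORC copy in Hive:
hive -e "CREATE TABLE hive_test_trans.bookinfo_staging
         STORED AS ORC
         AS SELECT * FROM hive_test_trans.bookinfo;"

# 2. Export the staging copy; HCatalog accepts a plain ORC table:
sqoop export \
  --connect 'jdbc:sqlserver://HOST:1433;databaseName=DB' \
  --username USER --password PASSWORD \
  --table bookinfo \
  --hcatalog-database hive_test_trans \
  --hcatalog-table bookinfo_staging

# 3. Drop the staging table once the export succeeds:
hive -e "DROP TABLE hive_test_trans.bookinfo_staging;"
```

The import direction works the same way in reverse: sqoop-import into the plain staging table, then run an INSERT INTO ... SELECT in Hive to move the rows into the bucketed transactional table.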

 

> SQOOP IMPORT/EXPORT for the ORC file HIVE TABLE Failing
> -------------------------------------------------------
>
>                 Key: SQOOP-2192
>                 URL: https://issues.apache.org/jira/browse/SQOOP-2192
>             Project: Sqoop
>          Issue Type: Bug
>          Components: hive-integration
>    Affects Versions: 1.4.5
>         Environment: Hadoop 2.6.0
> Hive 1.0.0
> Sqoop 1.4.5
>            Reporter: Sunil Kumar
>            Assignee: Venkat Ranganathan
>            Priority: Major
>
> We are trying to export an RDBMS table to a Hive table so that we can run Hive delete and
update queries on the exported table. For Hive to support delete and update queries, the following
is required:
> 1. The table must be declared with the transactional property
> 2. The table must be in ORC format
> 3. The table must be bucketed
> To do that I have created the Hive table using HCat:
> create table bookinfo(md5 STRING , isbn STRING , bookid STRING , booktitle STRING , author
STRING , yearofpub STRING , publisher STRING , imageurls STRING , imageurlm STRING , imageurll
STRING , price DOUBLE , totalrating DOUBLE , totalusers BIGINT , maxrating INT , minrating
INT , avgrating DOUBLE , rawscore DOUBLE , norm_score DOUBLE) clustered by (md5) into 10 buckets
stored as orc TBLPROPERTIES('transactional'='true');
> then running sqoop import:
> sqoop import --verbose --connect 'RDBMS_JDBC_URL' --driver JDBC_DRIVER --table bookinfo
--null-string '\\N' --null-non-string '\\N' --username USER --password PASSWORD --hcatalog-database
hive_test_trans --hcatalog-table bookinfo --hcatalog-storage-stanza "stored as orc" -m 1
> The following exception is thrown:
> 15/03/09 16:28:59 ERROR tool.ImportTool: Encountered IOException running import job:
org.apache.hive.hcatalog.common.HCatException : 2016 : Error operation not supported : Store
into a partition with bucket definition from Pig/Mapreduce is not supported
>         at org.apache.hive.hcatalog.mapreduce.HCatOutputFormat.setOutput(HCatOutputFormat.java:109)
>         at org.apache.hive.hcatalog.mapreduce.HCatOutputFormat.setOutput(HCatOutputFormat.java:70)
>         at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureHCat(SqoopHCatUtilities.java:339)
>         at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureImportOutputFormat(SqoopHCatUtilities.java:753)
>         at org.apache.sqoop.mapreduce.ImportJobBase.configureOutputFormat(ImportJobBase.java:98)
>         at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:240)
>         at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:665)
>         at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
>         at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:601)
>         at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>         at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
>         at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
> Please let me know if any further details are required.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
