sqoop-user mailing list archives

From Sowjanya Kakarala <sowja...@agrible.com>
Subject Importing data to bucketed table in Hive
Date Wed, 28 Mar 2018 17:48:41 GMT
Hi Guys,

I am trying to import data from my database into Hive using Sqoop.

My goal is to get the data into Hive and later, if necessary, update records there. I learnt that ACID transactions require bucketed tables.

So I created a bucketed table and tried to import the data with Sqoop import, which throws the following error:

ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hive.hcatalog.common.HCatException
: 2016 : Error operation not supported : Store into a partition with bucket definition from
Pig/Mapreduce is not supported 
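
For reference, the import was invoked through HCatalog, roughly like the sketch below (the JDBC URL, database, and table names are placeholders, not my real ones):

```shell
# Sketch of an HCatalog-based Sqoop import into a bucketed Hive table.
# HCatalog writes go through Pig/MapReduce storers, which reject tables
# that have a CLUSTERED BY (bucket) definition -- hence HCatException 2016.
sqoop import \
  --connect jdbc:postgresql://dbhost:5432/mydb \
  --username myuser --password-file /user/me/.pw \
  --table source_table \
  --hcatalog-database default \
  --hcatalog-table bucketed_table \
  -m 4
```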

If it is not supported, is there a workaround?

I also created a partitioned (non-bucketed) table and imported the data into it. Can I then insert this data from the partitioned table into the bucketed table? If yes, can someone provide a good example?

I have tried an `insert into` statement for each partition, but it takes very long, because each year contains 241,972,735 records.
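
What I tried looks roughly like this (table and column names are placeholders; I am assuming a transactional bucketed target and dynamic partitioning enabled in Hive):

```sql
-- Assumed session settings for dynamic-partition inserts into an ACID table
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;
SET hive.enforce.bucketing = true;

-- Copy one year at a time from the plain partitioned staging table
-- into the bucketed transactional table; Hive distributes rows into
-- buckets automatically based on the target's CLUSTERED BY definition.
INSERT INTO TABLE bucketed_table PARTITION (yr)
SELECT id, col1, col2, yr
FROM   staged_partitioned_table
WHERE  yr = 2017;
```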

Also, this might not be good practice for a production environment.

Appreciate your help.

Thanks
Sowjanya