Thanks Venkat. Would importing the table as an HCatalog table instead of a Hive table automatically put it in Hive?

~Pratik

On Fri, Sep 12, 2014 at 10:22 AM, Venkat Ranganathan <vranganathan@hortonworks.com> wrote:
Generally, you should be able to use any storage format that Hive supports with an HCatalog import or export. Some formats may not work if they don't support the Hive SerDe methods that HCatalog uses (Parquet, for example), but you can import directly to Parquet with --as-parquetfile.

Instead of --hive-import and --hive-table, just use --hcatalog-table <hivetablename>.
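For example, an HCatalog-based import might look like the sketch below. The connection string, credentials, and table names are placeholders, not values from this thread, and I'm assuming a Sqoop 1.4.4+ build where the --hcatalog-* options are available:

```shell
# Hypothetical example: import via HCatalog instead of --hive-import.
# HCatalog writes through the Hive table's own SerDe/storage format,
# so no separate --as-* flag is needed here.
sqoop import \
  --connect jdbc:mysql://dbserver/mydb \
  --username dbuser -P \
  --query 'SELECT * FROM mytable WHERE $CONDITIONS' \
  --num-mappers 1 \
  --hcatalog-database default \
  --hcatalog-table mytable \
  --create-hcatalog-table
```

Since HCatalog tables are Hive metastore tables, the imported data shows up as a regular Hive table.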

Venkat

On Fri, Sep 12, 2014 at 10:12 AM, pratik khadloya <tispratik@gmail.com> wrote:
Do we need HCAT_HOME if I am only importing to Hive? I don't think I have HCatalog installed.

~Pratik

On Thu, Sep 11, 2014 at 7:16 PM, Xu, Qian A <qian.a.xu@intel.com> wrote:

Yes. Simply replace `--as-avrodatafile` with `--as-parquetfile`.


Please make sure the environment variables HIVE_HOME and HCAT_HOME are set correctly.
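A sketch of what that setup looks like; the install paths below are placeholders for a typical layout and will differ on your cluster:

```shell
# Hypothetical paths: point these at your actual Hive/HCatalog install.
export HIVE_HOME=/usr/lib/hive
export HCAT_HOME=/usr/lib/hive-hcatalog

# Same import as before, with --as-avrodatafile swapped for --as-parquetfile.
sqoop import \
  --connect jdbc:mysql://dbserver/mydb \
  --username dbuser -P \
  --table mytable \
  --num-mappers 1 \
  --hive-import --hive-table mytable \
  --as-parquetfile
```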


--

Qian Xu (Stanley)


From: pratik khadloya [mailto:tispratik@gmail.com]
Sent: Friday, September 12, 2014 10:12 AM
To: user@sqoop.apache.org
Subject: Re: Hive import is not compatible with importing into AVRO format


Oh ok, thanks for the information, Xu. Can it be invoked using --as-parquetfile with --hive-import?


Regards,

Pratik


On Thu, Sep 11, 2014 at 6:17 PM, Xu, Qian A <qian.a.xu@intel.com> wrote:

Unfortunately, Avro format is not supported for a Hive import. You can file a JIRA for that. Note that the trunk version of Sqoop1 supports Hive import as Parquet.


--

Qian Xu (Stanley)


From: lizhanqiang@inspur.com [mailto:lizhanqiang@inspur.com]
Sent: Friday, September 12, 2014 8:56 AM
To: user@sqoop.apache.org
Subject: Re: Hive import is not compatible with importing into AVRO format


Hey there:

Does Hive support the Avro file format? As far as I know, it only supports RCFile, TextFile, and SequenceFile. Hope this is helpful to you.


Date: 2014-09-12 08:26

Subject: Hive import is not compatible with importing into AVRO format

I am trying to import data from a free-form MySQL query into Hive. I need the files to be Avro data files, but when I pass the --as-avrodatafile option, I get a compatibility error. Is there a way I can tell Sqoop to use the Avro file format?


$ bin/sqoop import -jt <jobtracker> --connect jdbc:mysql://<mydbserver>/<mydb> --username <dbuser> --password <dbpwd> --target-dir /user/pkhadloya/sqoop/mytable --query "<my query> WHERE \$CONDITIONS" --num-mappers 1 --hive-import --hive-table mytable --create-hive-table --as-avrodatafile


~Pratik




