sqoop-user mailing list archives

From pratik khadloya <tispra...@gmail.com>
Subject Re: Hive import is not compatible with importing into AVRO format
Date Fri, 12 Sep 2014 02:11:38 GMT
Oh, OK. Thanks for the information, Xu. Can it be invoked using
--as-parquetfile with --hive-import?

Regards,
Pratik
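
For reference, a sketch of what the Parquet variant of the command might look like (assuming a Sqoop1 trunk build, where Hive import as Parquet is supported; all angle-bracket values are placeholders, as in the original command):

```shell
# Sketch only: the same import, but with --as-parquetfile in place of
# --as-avrodatafile. Assumes a Sqoop1 trunk build; <...> are placeholders.
bin/sqoop import -jt <jobtracker> \
  --connect jdbc:mysql://<mydbserver>/<mydb> \
  --username <dbuser> --password <dbpwd> \
  --target-dir /user/pkhadloya/sqoop/mytable \
  --query "<my query> WHERE \$CONDITIONS" \
  --num-mappers 1 \
  --hive-import --hive-table mytable --create-hive-table \
  --as-parquetfile
```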

On Thu, Sep 11, 2014 at 6:17 PM, Xu, Qian A <qian.a.xu@intel.com> wrote:

>  Unfortunately, Avro format is not supported for a Hive import. You can
> file a JIRA for that. Note that the trunk version of Sqoop1 supports Hive
> import as Parquet.
>
>
>
> --
>
> Qian Xu (Stanley)
>
>
>
> *From:* lizhanqiang@inspur.com [mailto:lizhanqiang@inspur.com]
> *Sent:* Friday, September 12, 2014 8:56 AM
> *To:* user@sqoop.apache.org
> *Subject:* Re: Hive import is not compatible with importing into AVRO
> format
>
>
>
>
>
> Hey there:
>
>  Does Hive support the Avro file format? As far as I know, it only supports
> RCFile, TextFile, and SequenceFile. Hope this is helpful to you.
>
>
>
> *From:* pratik khadloya <tispratik@gmail.com>
>
> *Date:* 2014-09-12 08:26
>
> *To:* user@sqoop.apache.org
>
> *Subject:* Hive import is not compatible with importing into AVRO format
>
> I am trying to import data from a free-form MySQL query into Hive. I need
> the files to be Avro data files, but when I pass the --as-avrodatafile
> option, I get a compatibility error. Is there a way I can tell Sqoop to use
> the Avro file format?
>
>
>
> $ bin/sqoop import -jt <jobtracker> --connect jdbc:mysql://<mydbserver>/<mydb>
> --username <dbuser> --password <dbpwd> --target-dir /user/pkhadloya/sqoop/mytable
> --query "<my query> WHERE \$CONDITIONS" --num-mappers 1 --hive-import
> --hive-table mytable --create-hive-table --as-avrodatafile
>
>
>
>
>
> ~Pratik
>
>
