Hello KK,

When importing DATE/TIMESTAMP data with --as-parquetfile or --as-avrodatafile, I believe you will need to map the column to either a Long (the default) or a String (e.g. --map-column-java my_timestamp=String).
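
For example, a minimal sketch (the connection string, credentials, table, and column names below are placeholders, not from your environment):

    sqoop import \
        --connect "jdbc:sqlserver://dbhost:1433;database=mydb" \
        --username myuser --password mypass \
        --table my_table \
        --map-column-java my_timestamp=String \
        --as-parquetfile \
        --target-dir /user/hive/warehouse/my_table

This lands the DATETIME values as Strings in the Parquet files; if you need them as TIMESTAMP on the Hive side you can cast them after the import.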


Markus Kemper
Customer Operations Engineer
 www.cloudera.com


On Wed, Dec 21, 2016 at 11:54 PM, kishore kumar <kishore.alaj@gmail.com> wrote:
Thanks for the quick response Szabolcs/Markus,

Our use case is to import data from SQL Server into a Hive table in Parquet format. The Hive table has TIMESTAMP columns.

In our Sqoop import job we are able to load DATETIME data into TIMESTAMP columns using the --map-column-hive option with the text format.

But with --as-parquetfile we get "ERROR tool.ImportTool: Imported Failed: Cannot convert unsupported type: timestamp".
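
A simplified sketch of our import command (connection string, credentials, and names are placeholders):

    sqoop import \
        --connect "jdbc:sqlserver://dbhost:1433;database=mydb" \
        --username myuser --password mypass \
        --table my_table \
        --map-column-hive my_timestamp=TIMESTAMP \
        --hive-import --hive-table my_table

This works with the default text format; adding --as-parquetfile to the same command is what produces the error above.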

For this reason we gave HCatalog a try, but we hit "Error: java.lang.RuntimeException: Should never be used".
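
The HCatalog attempt was along these lines (again a sketch with placeholder names, not our exact command):

    sqoop import \
        --connect "jdbc:sqlserver://dbhost:1433;database=mydb" \
        --username myuser --password mypass \
        --table my_table \
        --hcatalog-database default \
        --hcatalog-table my_table \
        --hcatalog-storage-stanza 'stored as parquetfile'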

Please suggest any other alternative that might work.

Regards,
KK.

On Wed, Dec 21, 2016 at 9:19 PM, Markus Kemper <markus@cloudera.com> wrote:
Hello Kishore,

This issue should not be RDBMS-vendor specific; please see SQOOP-3046 as a possible match for your issue. My understanding is that there might be some Kite SDK issues, as Szabi pointed out, but there are likely some Hive issues as well.


Markus Kemper
Customer Operations Engineer
 www.cloudera.com


On Wed, Dec 21, 2016 at 10:17 AM, Szabolcs Vasas <vasas@cloudera.com> wrote:
Hi Kishore,

Sqoop currently does not support importing in Parquet format through HCatalog, most probably because Parquet import is implemented using the Kite SDK. Do you really need HCatalog for your use case?

Regards,
Szabolcs

On Wed, Dec 21, 2016 at 1:41 PM, kishore kumar <kishore.alaj@gmail.com> wrote:
Hi Experts,

Using HCatalog, can we write data from SQL Server to a Hive table as a Parquet file?

Thanks,
KK.



--
Szabolcs Vasas
Software Engineer