spark-user mailing list archives

From ayan guha <guha.a...@gmail.com>
Subject Re: Hive to Oracle using Spark - Type(Date) conversion issue
Date Mon, 19 Mar 2018 03:04:55 GMT
Hi

The issue is not with Spark in this case; it is with Oracle. If you do not
know which columns need a date-related conversion rule applied, then you
have a problem.

You should try one of the following:

a) Define a config file where you specify the table name, date column name,
and date format at the source, so that you can apply the appropriate
conversion dynamically
b) Write the data into the Oracle DB with a String data type, but add a
view that translates the date
c) Define the Hive tables with a Date data type so that you can apply the
appropriate conversion
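For option (a), the table-to-format lookup can be sketched in plain Python. The table names, column names, and formats below are hypothetical; in Spark, the same lookup would supply the format string to `to_date` on the matching column before the JDBC write to Oracle.

```python
from datetime import datetime

# Hypothetical config: table name -> (date column, source date format).
# In practice this could live in a properties file or a control table.
DATE_CONFIG = {
    "sales_hive": ("txn_date", "%Y-%m-%d"),
    "orders_hive": ("order_dt", "%d-%b-%Y"),
}

def convert_date(table, value):
    """Parse a varchar date from `table` using its configured format."""
    column, fmt = DATE_CONFIG[table]
    return datetime.strptime(value, fmt)

# At run time, look up the table that the job resolved and convert.
# In Spark this lookup would drive something like:
#   df.withColumn(column, to_date(col(column), java_format))
print(convert_date("sales_hive", "2018-03-19"))
```

Because the lookup is keyed by table name, the conversion stays dynamic: the job only needs the resolved table name, and the config decides which column and format apply.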



On Mon, Mar 19, 2018 at 1:36 PM, Deepak Sharma <deepakmca05@gmail.com>
wrote:

> The other approach would be to write to a temp table and then merge the
> data. But this may be an expensive solution.
>
> Thanks
> Deepak
>
> On Mon, Mar 19, 2018, 08:04 Gurusamy Thirupathy <thiruguru@gmail.com>
> wrote:
>
>> Hi,
>>
>> I am trying to read data from Hive as a DataFrame and then write the
>> DF into an Oracle database. In this case, the date field/column in Hive
>> has type Varchar(20), but the corresponding column type in Oracle is
>> Date. While reading from Hive, the Hive table names are dynamically
>> decided (read from another table) based on some job condition (e.g.
>> Job1). There are multiple tables like this, so the column and table
>> names are decided only at run time. So I can't do the type conversion
>> explicitly when reading from Hive.
>>
>> So is there any utility/API available in Spark to handle this
>> conversion?
>>
>>
>> Thanks,
>> Guru
>>
>
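The temp-table-then-merge approach quoted above amounts to staging the varchar data and letting Oracle convert during the MERGE. A minimal sketch of generating such a statement follows; the staging/target table names, key, and the Oracle format string are all illustrative assumptions, not anything from the thread.

```python
def build_merge_sql(staging, target, key, date_col, oracle_fmt):
    """Build an Oracle MERGE that converts a varchar date while merging
    rows from a staging table into the target table."""
    return (
        f"MERGE INTO {target} t "
        f"USING {staging} s ON (t.{key} = s.{key}) "
        f"WHEN MATCHED THEN UPDATE SET "
        f"t.{date_col} = TO_DATE(s.{date_col}, '{oracle_fmt}') "
        f"WHEN NOT MATCHED THEN INSERT ({key}, {date_col}) "
        f"VALUES (s.{key}, TO_DATE(s.{date_col}, '{oracle_fmt}'))"
    )

# Spark writes the DataFrame (date still as string) to stg_sales via JDBC,
# then this statement is executed on the Oracle side.
sql = build_merge_sql("stg_sales", "sales", "id", "txn_date", "YYYY-MM-DD")
print(sql)
```

The cost Deepak mentions comes from the extra write plus the server-side merge, but it keeps all date parsing inside Oracle, where TO_DATE with an explicit format model is well defined.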


-- 
Best Regards,
Ayan Guha
