sqoop-dev mailing list archives

From "yuxiaoyong (JIRA)" <j...@apache.org>
Subject [jira] [Comment Edited] (SQOOP-423) Sqoop import of timestamps to Avro from Postgres - Timezone Issue
Date Thu, 07 Mar 2019 08:06:00 GMT

    [ https://issues.apache.org/jira/browse/SQOOP-423?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16786487#comment-16786487 ]

yuxiaoyong edited comment on SQOOP-423 at 3/7/19 8:05 AM:
----------------------------------------------------------

We have encountered the same issue: the TIMESTAMP values are 8 hours ahead, and our output
format is plain text files.

Instead of using
 * --table TABLE_NAME,
 * -Dmapred.child.java.opts="-Duser.timezone=Asia/Shanghai"
 * -Dmapreduce.map.java.opts='-Duser.timezone=Asia/Shanghai'

we use --query " select * from tablename where \$CONDITIONS " to avoid the automatic
transformation by Sqoop, and the TIMESTAMP values stay the same as in the MySQL DB.
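For reference, a complete free-form-query invocation along these lines might look like the sketch below; the JDBC URL, credentials, split column, and target directory are hypothetical placeholders, not values from this report.

```shell
# Free-form query import: the literal $CONDITIONS token is required so Sqoop
# can inject its split predicates. Single quotes keep the shell from expanding it.
# --connect, --username, --password-file, --split-by, and --target-dir values
# below are hypothetical placeholders.
sqoop import \
  --connect jdbc:mysql://db.example.com/sales \
  --username etl \
  --password-file /user/etl/.pw \
  --query 'SELECT * FROM tablename WHERE $CONDITIONS' \
  --split-by id \
  --target-dir /user/etl/tablename \
  --as-textfile
```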

We don't know why this happens, because java.sql.Timestamp has nothing to do with the time
zone. We hope this bug gets fixed.
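On that last point: java.sql.Timestamp itself stores only epoch milliseconds, which are time-zone-free; it is the rendering to a wall-clock string (as happens in a text-file export) that applies the JVM's default time zone. A minimal sketch, using nothing beyond the JDK:

```java
import java.sql.Timestamp;
import java.util.TimeZone;

public class TimestampToStringDemo {
    public static void main(String[] args) {
        // 1199145600000 ms = 2008-01-01 00:00:00 UTC
        Timestamp ts = new Timestamp(1199145600000L);

        // Same Timestamp object, different default zones: only the
        // string rendering changes, never the stored epoch value.
        TimeZone.setDefault(TimeZone.getTimeZone("UTC"));
        System.out.println(ts); // 2008-01-01 00:00:00.0

        TimeZone.setDefault(TimeZone.getTimeZone("Asia/Shanghai"));
        System.out.println(ts); // 2008-01-01 08:00:00.0
    }
}
```

So a mapper running with -Duser.timezone=Asia/Shanghai writes wall-clock strings shifted by +8 hours relative to one running in UTC, even though the underlying millisecond values are identical.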


was (Author: aeoles):
We have encountered the same issue: the TIMESTAMP values are 8 hours ahead, and our output
format is plain text files.

Instead of using --table TABLE_NAME, we use --query " " to avoid the automatic transformation
by Sqoop, and the TIMESTAMP values stay the same as in the MySQL DB.

We don't know why this happens, because java.sql.Timestamp has nothing to do with the time
zone. We hope this bug gets fixed.

> Sqoop import of timestamps to Avro from Postgres - Timezone Issue
> -----------------------------------------------------------------
>
>                 Key: SQOOP-423
>                 URL: https://issues.apache.org/jira/browse/SQOOP-423
>             Project: Sqoop
>          Issue Type: Bug
>    Affects Versions: 1.3.0
>            Reporter: Lynn Goh
>            Priority: Major
>
> I am running sqoop-1.3.0-cdh3u2 on a Mac. When I sqoop import from a Postgres table
with columns of type 'timestamp without time zone', the values are converted to longs in the
time zone of my local operating system, even after I have started Hadoop with TZ=GMT or
passed in HADOOP_OPTS="-Duser.timezone=GMT". My ultimate goal is to import them as long
representations in the GMT time zone rather than my operating system's time zone.
> Postgres example:
> {code}
> acamp_id |     start_time      |      end_time       
> ----------+---------------------+---------------------
>        1 | 2008-01-01 00:00:00 | 2011-12-16 00:00:00
> {code}
> After import, you can see the values are 8 hours ahead, even with TZ=GMT and user.timezone
set properly (this is the json representation of the parsed imported avro file):
> {code}
> {"acamp_id": 1, "end_time": 1324022400000, "start_time": 1199174400000}
> {code}
> date utility invocation:
> {code}
> lynngoh@unknown:~$ date -u -r 1199174400
> Tue Jan  1 08:00:00 UTC 2008
> {code}
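The 8-hour offset shown above is consistent with the wall-clock string being parsed in a UTC-8 zone (e.g. a Mac set to US Pacific time) rather than GMT. A sketch reproducing the arithmetic with plain JDK classes (the zone names are illustrative assumptions about the reporter's machine):

```java
import java.text.SimpleDateFormat;
import java.util.TimeZone;

public class SqoopTimezoneRepro {
    // Parse a wall-clock timestamp string as if it were in the given zone.
    static long parseMillis(String ts, String zone) throws Exception {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        fmt.setTimeZone(TimeZone.getTimeZone(zone));
        return fmt.parse(ts).getTime();
    }

    public static void main(String[] args) throws Exception {
        String startTime = "2008-01-01 00:00:00"; // value shown by Postgres

        long asUtc = parseMillis(startTime, "UTC");
        long asPst = parseMillis(startTime, "America/Los_Angeles"); // UTC-8 in January

        System.out.println(asUtc); // 1199145600000
        System.out.println(asPst); // 1199174400000 -- the value in the Avro output
    }
}
```

The second value matches the reporter's start_time exactly, i.e. the string was interpreted in the OS zone at parse time rather than in GMT.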



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
