sqoop-user mailing list archives

From Mark Roddy <markro...@gmail.com>
Subject Error Doing Direct Import from MySql
Date Thu, 03 Apr 2014 16:07:33 GMT
I am trying to do a local (without a cluster) import from a database to
S3. The import works as expected until I add the --direct option. From the
logs below, it appears that mysqldump rejects the SQL generated for it.
Any help would be appreciated.

-Mark


Logs:

14/04/03 15:37:53 INFO mapreduce.MySQLDumpMapper: Beginning mysqldump fast path import
14/04/03 15:37:53 INFO mapreduce.MySQLDumpMapper: Performing import of table some_tweets from database tweetdb
14/04/03 15:37:53 INFO mapreduce.MySQLDumpMapper: Converting data to use specified delimiters.
14/04/03 15:37:53 INFO mapreduce.MySQLDumpMapper: (For the fastest possible import, use
14/04/03 15:37:53 INFO mapreduce.MySQLDumpMapper: --mysql-delimiters to specify the same field
14/04/03 15:37:53 INFO mapreduce.MySQLDumpMapper: delimiters as are used by mysqldump.)
14/04/03 15:37:53 INFO mapreduce.MySQLDumpMapper: mysqldump: Couldn't execute 'SET OPTION SQL_QUOTE_SHOW_CREATE=1/*!40102 ,SQL_MODE=concat(@@sql_mode, _utf8 ',NO_KEY_OPTIONS,NO_TABLE_OPTIONS,NO_FIELD_OPTIONS') */': You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'OPTION SQL_QUOTE_SHOW_CREATE=1/*!40102 ,SQL_MODE=concat(@@sql_mode, _utf8 ',NO_K' at line 1 (1064)
14/04/03 15:37:53 INFO mapreduce.MySQLDumpMapper: Transfer loop complete.
14/04/03 15:37:53 WARN mapred.LocalJobRunner: job_local_0001
java.io.IOException: mysqldump terminated with status 2
        at org.apache.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:485)
        at org.apache.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:49)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:212)



Full Bash Script (sensitive data omitted):
SQOOP_OPTS="-D fs.s3n.awsAccessKeyId=XXX -D fs.s3n.awsSecretAccessKey=XXX -D fs.s3.awsAccessKeyId=XXX -D fs.s3.awsSecretAccessKey=XXX"
OPTARGS=""
OPTARGS="$OPTARGS --username USERNAME"
OPTARGS="$OPTARGS --password PASSWORD"
OPTARGS="$OPTARGS --direct"
sqoop \
    import \
    $SQOOP_OPTS \
    -m 1 \
    --table some_tweets \
    --connect jdbc:mysql://myhost/mydb \
    --target-dir s3n://mybucket/somefolder \
    $OPTARGS
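As a side note, a minimal sketch of a safer way to accumulate the optional arguments (hypothetical, not from the script above): POSIX `set --` keeps each flag and value a separate argument word, avoiding the word-splitting pitfalls of growing a single `$OPTARGS` string.

```shell
# Hypothetical variant of the OPTARGS accumulation: build the option list
# with "set --" so each flag and its value stay separate argument words,
# even if a value contains spaces.
set -- --username USERNAME --password PASSWORD --direct
# The list can then be passed straight through, e.g.:
#   sqoop import $SQOOP_OPTS -m 1 --table some_tweets ... "$@"
for arg in "$@"; do
    printf '%s\n' "$arg"
done
```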


Software Versions:
OS: Ubuntu 12.04
Sqoop Version: 1.4.4
mysqldump Version: Ver 10.13 Distrib 5.1.69 (from Ubuntu deb package)
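For comparing the client against the server, the mysqldump banner above encodes the client version (Distrib 5.1.69). A small sketch of extracting it (the sample banner string is copied from this post; in practice you would capture `mysqldump --version` and compare against `SELECT VERSION()` on the server, since a much older client can issue syntax a newer server no longer accepts):

```shell
# Extract the Distrib (client) version from a mysqldump --version banner.
# The sample string matches the version reported in this post; in practice:
#   banner=$(mysqldump --version)
banner="mysqldump  Ver 10.13 Distrib 5.1.69, for debian-linux-gnu"
client_ver=$(printf '%s\n' "$banner" | sed 's/.*Distrib \([0-9.][0-9.]*\).*/\1/')
printf '%s\n' "$client_ver"   # prints 5.1.69
```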
