sqoop-dev mailing list archives

From "Boglarka Egyed (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SQOOP-3385) Error while connecting to S3 using Scoop
Date Wed, 19 Sep 2018 08:57:00 GMT

    [ https://issues.apache.org/jira/browse/SQOOP-3385?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16620328#comment-16620328 ]

Boglarka Egyed commented on SQOOP-3385:
---------------------------------------

Hi [~coolsm19],

Integration between Sqoop and AWS CLI is not supported.

The S3 connector is currently under development and testing on trunk, please see this
Epic JIRA for a better picture: SQOOP-3345. Please also note that the s3:// filesystem
is deprecated (see [https://hadoop.apache.org/docs/stable/hadoop-aws/tools/hadoop-aws/index.html#S3]), and
thus Sqoop will support only the s3a:// filesystem, which is under active development and
maintenance.

The currently tested (and supported) way to specify the AWS credentials on the latest trunk
version is to set them via the {{fs.s3a.access.key}} and {{fs.s3a.secret.key}} properties like
this:
{noformat}
$ sqoop import -Dfs.s3a.access.key=$AWS_ACCESS_KEY -Dfs.s3a.secret.key=$AWS_SECRET_KEY \
    --connect $CONN --username $USER --password $PWD --table $TABLENAME \
    --target-dir s3a://example-bucket/target-directory
{noformat}
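If you would rather not pass the keys on every command line, the same s3a properties can
typically also be set in Hadoop's core-site.xml. The snippet below is only a rough sketch with
placeholder values and has not been verified as part of the Sqoop S3 work on trunk:
{noformat}
<!-- Sketch of core-site.xml entries; the values are placeholders -->
<property>
  <name>fs.s3a.access.key</name>
  <value>YOUR_AWS_ACCESS_KEY</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>YOUR_AWS_SECRET_KEY</value>
</property>
{noformat}
Keeping the keys in the configuration also avoids exposing them in the shell history and
process list.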
Please let me know if you have any further questions.

Kind Regards,
Bogi

> Error while connecting to S3 using Scoop
> ----------------------------------------
>
>                 Key: SQOOP-3385
>                 URL: https://issues.apache.org/jira/browse/SQOOP-3385
>             Project: Sqoop
>          Issue Type: Bug
>          Components: connectors
>    Affects Versions: 1.4.7
>            Reporter: Suchit
>            Priority: Minor
>              Labels: S3
>
> I am facing an issue while trying to import a file from an on-prem DB to S3 using Sqoop.
> Things I am able to do:
> 1. I am connected to S3 and able to run aws s3 ls & other AWS CLI commands.
> 2. Able to generate a file by connecting to the DB from the local Unix box.
> But when I change the target directory to S3 instead of local I am getting the below error:
>
> "ERROR tool.ImportTool: Import failed: AWS Access Key ID and Secret Access Key must
> be specified as the username or password (respectively) of a s3 URL, or by setting the fs.s3.awsAccessKeyId
> or fs.s3.awsSecretAccessKey properties (respectively)."
>
> Ideally the Sqoop installation should be able to pick up the credentials from the credentials
> file inside the .aws directory of the user running the command, but is there a way I can specify
> the credentials?
>
> Thanks in advance.
>



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
