spark-user mailing list archives

From KhajaAsmath Mohammed <>
Subject S3 Access Issues - Spark
Date Tue, 18 May 2021 23:11:31 GMT

I have written a sample Spark job that reads data residing in HBase. I
keep getting the error below; any suggestions on how to resolve this, please?

Caused by: java.lang.IllegalArgumentException: AWS Access Key ID and Secret
Access Key must be specified by setting the fs.s3.awsAccessKeyId and
fs.s3.awsSecretAccessKey properties (respectively).
at org.apache.hadoop.fs.s3.S3Credentials.initialize(

      conf.set("fs.s3.impl", "org.apache.hadoop.fs.s3.S3FileSystem")
      conf.set("fs.s3.awsAccessKeyId", "ddd")
      conf.set("fs.s3.awsSecretAccessKey", "dddddd")

      conf.set("fs.s3n.awsAccessKeyId", "xxxxxxx")
      conf.set("fs.s3n.awsSecretAccessKey", "xxxx")

 I tried these settings in both the Spark config and the HBase config, but
none of them resolved my issue.
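For comparison, a minimal sketch of the more common approach today: use the s3a connector (from the hadoop-aws module) instead of the long-deprecated fs.s3 S3FileSystem, and pass the credentials through Spark's Hadoop configuration with the spark.hadoop.* prefix. The bucket name, path, and key values here are placeholders, not taken from the post:

```scala
import org.apache.spark.sql.SparkSession

// Sketch only: assumes hadoop-aws and its AWS SDK dependency are on the
// classpath. All key values and the bucket path below are placeholders.
val spark = SparkSession.builder()
  .appName("s3a-read-example")
  // s3a is the maintained S3 connector; fs.s3 / S3FileSystem is deprecated.
  .config("spark.hadoop.fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
  .config("spark.hadoop.fs.s3a.access.key", "YOUR_ACCESS_KEY")  // placeholder
  .config("spark.hadoop.fs.s3a.secret.key", "YOUR_SECRET_KEY")  // placeholder
  .getOrCreate()

// Use the s3a:// scheme so the fs.s3a.* settings above are actually consulted;
// paths beginning with s3:// are routed to the old fs.s3 filesystem instead.
val df = spark.read.parquet("s3a://your-bucket/some/path/")     // hypothetical path
```

Setting the properties on the Hadoop Configuration object directly (conf.set("fs.s3a.access.key", ...)) works as well; the important parts are matching the property prefix (fs.s3a.*) to the URL scheme (s3a://) being read.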

