Thanks for the suggestion, Steve. I'll try that out.

I read the long story last night while struggling with this :). I made sure that I don't have any '/' in my key.

On Saturday, May 16, 2015, Steve Loughran <stevel@hortonworks.com> wrote:

> On 15 May 2015, at 21:20, Mohammad Tariq <dontariq@gmail.com> wrote:
>
> Thank you, Ayan and Ted, for the prompt responses. It isn't working with s3n either.
>
> And I am able to download the file. In fact, I am able to read the same file using the S3 API without any issue.
>


Sounds like an S3n config problem. Check your configuration; you can test locally via the hdfs dfs command without even starting Spark.

Oh, and if there is a "/" in your secret key, you're going to need to generate a new one. Long story.
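
For reference, roughly what that explicit-credential setup looks like in a PySpark job (a sketch only: the fs.s3n.* property names are the standard Hadoop ones, but the app name, bucket, path and key strings here are placeholders, not taken from this thread):

    from pyspark import SparkConf, SparkContext

    # Placeholder app name; adjust for your job.
    conf = SparkConf().setAppName("s3n-read-test")
    sc = SparkContext(conf=conf)

    # Standard Hadoop s3n credential properties.
    hadoop_conf = sc._jsc.hadoopConfiguration()
    hadoop_conf.set("fs.s3n.awsAccessKeyId", "YOUR_ACCESS_KEY")
    hadoop_conf.set("fs.s3n.awsSecretAccessKey", "YOUR_SECRET_KEY")

    # Placeholder bucket/path; if this count succeeds, the s3n
    # connector and credentials are wired up correctly.
    lines = sc.textFile("s3n://your-bucket/path/to/file.txt")
    print(lines.count())

If the same keys work with hdfs dfs against the bucket but fail here, the problem is on the Spark side rather than in the credentials.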

