spark-user mailing list archives

From Adnan <>
Subject Re: Executing spark jobs with predefined Hadoop user
Date Thu, 10 Apr 2014 13:40:25 GMT
Then the problem is not on the Spark side. You have three options; choose any one:

1. Change the permissions on the /tmp/Iris folder from a shell on the NameNode with the "hdfs dfs -chmod" command.
2. Run your Hadoop service as the hdfs user.
3. Disable dfs.permissions in conf/hdfs-site.xml.
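For reference, options 1 and 3 might look roughly like the following. This is a sketch only: the /tmp/Iris path comes from the thread, the wide-open 777 mode is just for illustration (a chown to the submitting user is usually preferable), and on newer Hadoop releases the property key is dfs.permissions.enabled rather than dfs.permissions.

```shell
# Option 1: loosen permissions on the target directory in HDFS.
# Run from a shell on the NameNode (or any node with HDFS client config).
hdfs dfs -chmod -R 777 /tmp/Iris

# Option 3: turn off HDFS permission checking entirely (development
# clusters only). Add this to conf/hdfs-site.xml on the NameNode and
# restart it:
#   <property>
#     <name>dfs.permissions</name>
#     <value>false</value>
#   </property>
```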


avito wrote
> Thanks Adnan for the quick answer. You are absolutely right.
> We are indeed using the entire HDFS URI. Just for the post I have removed
> the name node details.
