spark-user mailing list archives

From Amit Hora <>
Subject RE: Unable to Access files in Hadoop HA enabled from using Spark
Date Wed, 13 Apr 2016 06:31:45 GMT
Finally I tried setting the configuration manually using:

And it worked. I don't know why these settings were not being read from the files under HADOOP_CONF_DIR.
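The snippet from the message did not survive in the archive, but for reference, HA settings can be applied programmatically on the SparkContext's Hadoop configuration. This is a minimal sketch only: the namenode aliases (nn1, nn2) and the default RPC port 8020 are assumptions; the hostnames hdp231 and ambarimaster are taken from this thread.

```scala
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("LDA Sample").setMaster("local[2]")
val sc = new SparkContext(conf)

// Mirror the HA entries normally read from hdfs-site.xml.
// Namenode aliases and ports below are assumptions, not the poster's exact values.
val hc = sc.hadoopConfiguration
hc.set("fs.defaultFS", "hdfs://hdpha")
hc.set("dfs.nameservices", "hdpha")
hc.set("dfs.ha.namenodes.hdpha", "nn1,nn2")
hc.set("dfs.namenode.rpc-address.hdpha.nn1", "hdp231:8020")
hc.set("dfs.namenode.rpc-address.hdpha.nn2", "ambarimaster:8020")
hc.set("dfs.client.failover.proxy.provider.hdpha",
  "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider")

val distFile = sc.textFile("hdfs://hdpha/mini_newsgroups/")
println(distFile.count())
```

With these client-side settings in place, the logical nameservice `hdpha` resolves through the failover proxy provider rather than DNS, which is why it does not need to map to a single host.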

-----Original Message-----
From: "Amit Hora" <>
Sent: 4/13/2016 11:41 AM
To: "Jörn Franke" <>
Cc: "" <>
Subject: RE: Unable to Access files in Hadoop HA enabled from using Spark

There are DNS entries for both of my namenodes.
Ambarimaster is the standby namenode and resolves to its IP correctly.
Hdp231 is the active namenode and also resolves to its IP.
Hdpha is the logical name of my Hadoop HA cluster (the nameservice),
and hdfs-site.xml has the entries for this configuration.
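For comparison, the HA-related entries in hdfs-site.xml for a nameservice like this typically look like the following sketch. The namenode aliases (nn1, nn2) and the default RPC port 8020 are assumptions; only the nameservice name and hostnames come from this thread.

```xml
<!-- Logical nameservice; clients address hdfs://hdpha instead of a host. -->
<property>
  <name>dfs.nameservices</name>
  <value>hdpha</value>
</property>
<!-- Aliases for the two namenodes (names here are illustrative). -->
<property>
  <name>dfs.ha.namenodes.hdpha</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.hdpha.nn1</name>
  <value>hdp231:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.hdpha.nn2</name>
  <value>ambarimaster:8020</value>
</property>
<!-- Lets HDFS clients fail over between the two namenodes. -->
<property>
  <name>dfs.client.failover.proxy.provider.hdpha</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
```

If the client cannot find these entries (for example, because HADOOP_CONF_DIR is not visible to the Spark driver's classpath), the nameservice name is treated as a hostname, which fails DNS resolution.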

From: Jörn Franke
Sent: 4/13/2016 11:37 AM
To: Amit Singh Hora
Subject: Re: Unable to Access files in Hadoop HA enabled from using Spark

Is the host in /etc/hosts ?

> On 13 Apr 2016, at 07:28, Amit Singh Hora <> wrote:
> I am trying to access a directory in Hadoop from my Spark code on my local
> machine. Hadoop is HA enabled.
> val conf = new SparkConf().setAppName("LDA Sample").setMaster("local[2]")
> val sc = new SparkContext(conf)
> val distFile = sc.textFile("hdfs://hdpha/mini_newsgroups/")
> println(distFile.count())
> but I am getting an error:
> hdpha
> hdpha does not resolve to a particular machine, as it is the logical name I have
> chosen for my HA Hadoop cluster. I have already copied all the Hadoop configuration
> files to my local machine and set the HADOOP_CONF_DIR environment variable, but
> still no success. Any suggestion would be a great help.
> Note: Hadoop HA itself is working properly, as I have tried uploading a file to
> Hadoop and it works.