spark-user mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: HDFS is undefined
Date Mon, 28 Sep 2015 07:04:07 GMT
For some reason Spark isn't picking up your Hadoop confs. Did you download a
Spark build compiled against the Hadoop version that you have on the cluster?
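A quick way to check is to print the resolved default filesystem from the
SparkContext. Here is a minimal sketch (the object name and the HDFS path are
just placeholders, and it assumes the job is launched with /etc/hadoop/conf on
the classpath):

import org.apache.spark.{SparkConf, SparkContext}

object HdfsCheck {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("HdfsCheck"))

    // If the Hadoop conf dir is being picked up, this prints the cluster
    // NameNode (e.g. hdfs://host:8020); if it prints file:///, Spark is
    // not seeing /etc/hadoop/conf.
    println(sc.hadoopConfiguration.get("fs.defaultFS"))

    // Reading through Spark's own API goes via Hadoop's FileSystem layer,
    // so it does not depend on a java.nio.file provider for "hdfs".
    val lines = sc.textFile("hdfs:///tmp/sample.txt")  // placeholder path
    println(lines.count())

    sc.stop()
  }
}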

Thanks
Best Regards

On Fri, Sep 25, 2015 at 7:43 PM, Angel Angel <areyouangel90@gmail.com>
wrote:

> Hello,
> I am running a Spark application.
>
> I have installed Cloudera Manager.
> It includes Spark version 1.2.0.
>
> But now I want to use Spark version 1.4.0.
>
> It is also working fine.
>
> But when I try to access HDFS from Spark 1.4.0 in Eclipse, I get
> the following error:
>
> "Exception in thread "main" java.nio.file.FileSystemNotFoundException:
> Provider "hdfs" not installed "
>
>
> My Spark 1.4.0 spark-env.sh file is:
>
> export HADOOP_CONF_DIR=/etc/hadoop/conf
> export SPARK_HOME=/root/spark-1.4.0
>
>
> export DEFAULT_HADOOP_HOME=/opt/cloudera/parcels/CDH-5.3.5-1.cdh5.3.5.p0.4/lib/hadoop
>
> Still I am getting the error.
>
> Please give me suggestions.
>
> Thanking You,
> Sagar Jadhav.
>
