spark-user mailing list archives

From chenjie <>
Subject Re: HiveContext is creating metastore warehouse locally instead of in hdfs
Date Fri, 01 Aug 2014 08:39:16 GMT
I used the web ui of spark and could see the conf directory is in CLASSPATH.
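Since the warehouse location is read from hive-site.xml in that conf directory, it may help to confirm what the file sets. A minimal sketch (the HDFS URI and port are assumptions; adjust to your NameNode):

```xml
<!-- conf/hive-site.xml -->
<configuration>
  <property>
    <!-- Where managed tables are created; must be an HDFS URI,
         otherwise the warehouse ends up on the local filesystem. -->
    <name>hive.metastore.warehouse.dir</name>
    <value>hdfs://namenode:9000/user/hive/warehouse</value>
  </property>
</configuration>
```

If this file is missing from the CLASSPATH, Hive falls back to its defaults, which can resolve to a local path.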
One abnormal thing is that when starting spark-shell I always get the following warning:
WARN NativeCodeLoader: Unable to load native-hadoop library for your
platform... using builtin-java classes where applicable
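For what it's worth, this warning usually means the JVM cannot find libhadoop.so rather than a version mismatch; `hadoop checknative -a` reports which native libraries a Hadoop 2.x build can load. A sketch of pointing the JVM at them before starting spark-shell (the install path is an assumption; adjust to your environment):

```shell
# Assumed Hadoop install path -- adjust to your environment.
export HADOOP_HOME=/usr/local/hadoop-2.4.1

# Make the native libraries visible to the JVM that spark-shell starts;
# libhadoop.so lives under lib/native in a standard Hadoop 2.x layout.
export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
echo "$LD_LIBRARY_PATH"
```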

At first, I thought it was because my Hadoop version was not compatible with the
pre-built Spark: my Hadoop version is 2.4.1 while the pre-built Spark is built
against Hadoop 2.2.0. So I built Spark from source against Hadoop 2.4.1.
However, I still got the warning above.

Besides, when I set log4j.rootCategory to DEBUG, I got an exception saying
"HADOOP_HOME or hadoop.home.dir are not set", even though I have set it.
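One possible explanation for that message: hadoop.home.dir is a JVM system property, not an environment variable, so exporting HADOOP_HOME in the shell is not always enough; the property can be passed to the driver JVM explicitly. A sketch, with the install path as an assumption:

```shell
# Assumed Hadoop install path -- adjust to your environment.
HADOOP_HOME=/usr/local/hadoop-2.4.1
export HADOOP_HOME

# hadoop.home.dir must reach the driver JVM as a -D system property;
# spark-shell forwards --driver-java-options to spark-submit.
# Print the command that would be run:
echo "bin/spark-shell --driver-java-options \"-Dhadoop.home.dir=$HADOOP_HOME\""
```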

alee526 wrote
> Could you enable the HistoryServer and provide the properties and CLASSPATH
> for the spark-shell? Also run 'env' to list your environment variables.
> By the way, what do the Spark logs say? Enable debug mode to see what's
> going on in spark-shell when it tries to interact with and init HiveContext.
