spark-user mailing list archives

From RajG <rjk...@gmail.com>
Subject Retrieving Spark Configuration properties
Date Fri, 17 Jul 2015 03:53:11 GMT
I am using this version of Spark: *spark-1.4.0-bin-hadoop2.6*. I want to
check a few default properties, so I entered the following statement in
spark-shell:

scala> sqlContext.getConf("spark.sql.hive.metastore.version")

I was expecting the call to getConf to return a value of 0.13.1, as
described at this link
<http://spark.apache.org/docs/latest/sql-programming-guide.html#interacting-with-different-versions-of-hive-metastore>

Instead, I got the exception below:

java.util.NoSuchElementException: spark.sql.hive.metastore.version
    at org.apache.spark.sql.SQLConf$$anonfun$getConf$1.apply(SQLConf.scala:283)
    at org.apache.spark.sql.SQLConf$$anonfun$getConf$1.apply(SQLConf.scala:283)

Am I retrieving the properties in the right way?
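[A hedged sketch of what may be going on, assuming Spark 1.4's SQLConf behavior: the single-argument getConf(key) throws NoSuchElementException for keys that were never *explicitly* set, even if the engine applies a built-in default internally. SQLContext also exposes a two-argument getConf(key, defaultValue) overload and getAllConfs, which only lists explicitly-set properties. The placeholder fallback string below is illustrative, not an API value.]

```scala
// In spark-shell (Spark 1.4.x), where sqlContext is provided automatically.

// Throws NoSuchElementException if the key was never explicitly set:
// sqlContext.getConf("spark.sql.hive.metastore.version")

// The two-argument overload returns the supplied fallback instead:
sqlContext.getConf("spark.sql.hive.metastore.version", "<not explicitly set>")

// Lists only properties that have been set explicitly in this session:
sqlContext.getAllConfs

// After setting the property, the single-argument form succeeds:
sqlContext.setConf("spark.sql.hive.metastore.version", "0.13.1")
sqlContext.getConf("spark.sql.hive.metastore.version")
```

If that reading is right, the 0.13.1 value in the documentation is the default the engine falls back to at runtime, not a value stored in the configuration map, which would explain why getConf cannot find it.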






