spark-user mailing list archives

From Mich Talebzadeh <mich.talebza...@gmail.com>
Subject issue accessing Phoenix table from Spark
Date Fri, 07 Oct 2016 08:27:53 GMT
Hi,

My code is trying to load a Phoenix table built on an HBase table.

import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.hbase.{HBaseConfiguration, HColumnDescriptor, HTableDescriptor}
import org.apache.hadoop.mapred.JobConf
import org.apache.hadoop.hbase.client.HBaseAdmin
import org.apache.spark.sql.types._
import org.apache.phoenix.spark._


This code follows the example at https://phoenix.apache.org/phoenix_spark.html

scala> val HiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
warning: there was one deprecation warning; re-run with -deprecation for
details
HiveContext: org.apache.spark.sql.hive.HiveContext =
org.apache.spark.sql.hive.HiveContext@533e8807
scala> val df = HiveContext.load(
     | "org.apache.phoenix.spark",
     | Map("table" -> "temptable", "zkUrl" -> "rhes564:2181")
     |   )

warning: there was one deprecation warning; re-run with -deprecation for
details
java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
  at org.apache.phoenix.spark.PhoenixRDD.getPhoenixConfiguration(PhoenixRDD.scala:71)
  at org.apache.phoenix.spark.PhoenixRDD.phoenixConf$lzycompute(PhoenixRDD.scala:39)
  at org.apache.phoenix.spark.PhoenixRDD.phoenixConf(PhoenixRDD.scala:38)
  at org.apache.phoenix.spark.PhoenixRDD.<init>(PhoenixRDD.scala:42)
  at org.apache.phoenix.spark.PhoenixRelation.schema(PhoenixRelation.scala:50)
  at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:40)
  at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(SparkSession.scala:382)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:143)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:122)
  at org.apache.spark.sql.SQLContext.load(SQLContext.scala:958)
  ... 55 elided
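Since the NoClassDefFoundError names org/apache/hadoop/hbase/HBaseConfiguration, it suggests the HBase classes are not visible to the driver's classloader at runtime. As a quick check (a minimal sketch of my own, not from the Phoenix docs; the class name is the one from the stack trace), I can probe the classpath from the same spark-shell session:

```scala
// Minimal diagnostic sketch: a NoClassDefFoundError at runtime usually means
// the named class is missing from the driver's classpath. This probes whether
// the shell's classloader can see the HBase configuration class at all.
import scala.util.Try

val hbaseVisible =
  Try(Class.forName("org.apache.hadoop.hbase.HBaseConfiguration")).isSuccess
println(s"HBaseConfiguration visible to this classloader: $hbaseVisible")
```

If this prints false, the HBase client jars need to be added to the Spark driver and executor classpaths.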
I also tried building this as a fat jar with Maven, but I still get the same
error.

For reference, the original example code looks like this:

val df = sqlContext.load(
  "org.apache.phoenix.spark",
  Map("table" -> "TABLE1", "zkUrl" -> "phoenix-server:2181")
)
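Since sqlContext.load is deprecated (hence the warnings above), a sketch of the equivalent call through the DataFrameReader API, keeping the same table and zkUrl options, would be:

```scala
// Sketch only: the non-deprecated DataFrameReader equivalent of the
// deprecated sqlContext.load call, with the same options as the example.
val df = sqlContext.read
  .format("org.apache.phoenix.spark")
  .options(Map("table" -> "TABLE1", "zkUrl" -> "phoenix-server:2181"))
  .load()
```

Note this only changes the API entry point; as the stack trace shows, SQLContext.load already delegates to DataFrameReader.load, so the same NoClassDefFoundError would persist until the HBase classes are on the classpath.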

Thanks

Dr Mich Talebzadeh

LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.
