phoenix-user mailing list archives

From Josh Mahonin <>
Subject Re: integration Phoenix and Spark
Date Tue, 29 Sep 2015 19:56:11 GMT
Make sure to double-check your imports. Note the following:

import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext
import org.apache.phoenix.spark._
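
The imports above can be exercised with a minimal save job. A sketch, assuming a hypothetical Phoenix table OUTPUT_TABLE with columns ID, COL1, COL2, a ZooKeeper quorum at localhost:2181, and the Phoenix client jar on the classpath:

```scala
import org.apache.spark.SparkContext
import org.apache.phoenix.spark._ // adds saveToPhoenix to RDDs of tuples

object SavingPhoenix {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local", "phoenix-test")
    // Hypothetical rows matching the OUTPUT_TABLE schema (ID, COL1, COL2)
    val dataSet = List((1L, "1", 1), (2L, "2", 2), (3L, "3", 3))
    sc.parallelize(dataSet)
      .saveToPhoenix("OUTPUT_TABLE", Seq("ID", "COL1", "COL2"),
        zkUrl = Some("localhost:2181"))
    sc.stop()
  }
}
```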

There's also a sample repository here:

From: Hardika Catur Sapta
Reply-To: "<>"
Date: Tuesday, September 29, 2015 at 5:28 AM
To: "<>"
Subject: Re: integration Phoenix and Spark

/spark/Project Spark$ scala SavingPhoenix.scala
/home/hduser/spark/Project Spark/SavingPhoenix.scala:1: error: object spark is not a member of package
/home/hduser/spark/Project Spark/SavingPhoenix.scala:4: error: not found: type SparkContext
val sc = new SparkContext("local", "phoenix-test")
two errors found
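
Those errors usually mean the Spark jars are not on the classpath: the plain `scala` command knows nothing about org.apache.spark. One sketch of a fix, assuming a standard Spark installation (the jar path is hypothetical, adjust for your install), is to load the script through spark-shell with the Phoenix client jar attached:

```
spark-shell --jars /path/to/phoenix-4.4.0-client.jar -i SavingPhoenix.scala
```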

2015-09-29 16:20 GMT+07:00 Konstantinos Kougios <<>>:

Just to add that, at least for hadoop-2.7.1 and Phoenix 4.5.2-HBase-1.1, the Guava library
bundled with Hadoop (under hadoop/share/hadoop/common/lib) has to be upgraded to 14.0.1,
otherwise Spark tasks may fail due to missing Guava methods.


On 29/09/15 10:17, Hardika Catur Sapta wrote:
Spark setup

  1.  Ensure that all requisite Phoenix / HBase platform dependencies are available on the
classpath for the Spark executors and drivers

  2.  One method is to add the phoenix-4.4.0-client.jar to ‘SPARK_CLASSPATH’, or to set both
‘spark.executor.extraClassPath’ and ‘spark.driver.extraClassPath’ to include it.

  3.  To help your IDE, you may want to add the following ‘provided’ dependency
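
Steps 2 and 3 can be sketched concretely. For step 2, the equivalent spark-defaults.conf entries would look like this (the jar path is hypothetical, adjust it for your installation):

```
spark.executor.extraClassPath  /path/to/phoenix-4.4.0-client.jar
spark.driver.extraClassPath    /path/to/phoenix-4.4.0-client.jar
```

For step 3, a ‘provided’-scoped Maven dependency along these lines (the version string is an assumption; match it to your Phoenix and HBase versions):

```xml
<dependency>
  <groupId>org.apache.phoenix</groupId>
  <artifactId>phoenix-spark</artifactId>
  <version>4.4.0-HBase-1.1</version>
  <scope>provided</scope>
</dependency>
```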

Sorry for my bad English.

How do I do steps 2 and 3?

Please explain step by step.

