spark-user mailing list archives

From Mich Talebzadeh <>
Subject Issue with compiling Scala with Spark 2
Date Sun, 14 Aug 2016 15:58:23 GMT

In Spark 2 I am using sbt or mvn to compile my Scala program. It used to
compile and run perfectly with Spark 1.6.1, but now it is throwing an error.

I believe the problem is in my build.sbt. I have:

name := "scala"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.0.0"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.5.1"

However, the error I am getting is:

[error] bad symbolic reference. A signature in HiveContext.class refers to
type Logging
[error] in package org.apache.spark which is not available.
[error] It may be completely missing from the current classpath, or the
version on
[error] the classpath might be incompatible with the version used when
compiling HiveContext.class.
[error] one error found
[error] (compile:compileIncremental) Compilation failed
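For reference, the version mismatch above (spark-hive 1.5.1 against spark-core/spark-sql 2.0.0) is the usual cause of this error: org.apache.spark.Logging was a public trait in Spark 1.x but is no longer publicly available in Spark 2.0, so the 1.5.1 HiveContext.class refers to a type that is missing from a 2.0.0 classpath. A version-consistent build.sbt would look like this sketch (the sparkVersion val is my addition, not from the original post):

```scala
// build.sbt — sketch: keep every Spark artifact on the same version
name := "scala"
version := "1.0"
scalaVersion := "2.11.7"

// Single source of truth for the Spark version, so the artifacts cannot drift apart
val sparkVersion = "2.0.0"

libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion
libraryDependencies += "org.apache.spark" %% "spark-sql"  % sparkVersion
libraryDependencies += "org.apache.spark" %% "spark-hive" % sparkVersion
```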

And this is the code

import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
import org.apache.spark.sql.Row
import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.types._
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
object ETL_scratchpad_dummy {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .set("spark.driver.allowMultipleContexts", "true")
    val sc = new SparkContext(conf)
    //import sqlContext.implicits._
    val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
    hiveContext.sql("use oraclehadoop")
  }
}
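As an aside, once the dependency versions line up, Spark 2 deprecates HiveContext in favour of SparkSession with Hive support enabled. A minimal sketch of the equivalent setup (the object name ETLWithSparkSession is a placeholder; running it requires a Spark 2.x runtime with a Hive metastore and the oraclehadoop database from the snippet above):

```scala
// Sketch only: assumes Spark 2.x on the classpath and a reachable Hive metastore.
import org.apache.spark.sql.SparkSession

object ETLWithSparkSession {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("ETL_scratchpad_dummy")
      .enableHiveSupport()        // replaces HiveContext in Spark 2
      .getOrCreate()

    spark.sql("use oraclehadoop") // same statement as issued via HiveContext

    spark.stop()
  }
}
```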

Has anyone come across this?

Dr Mich Talebzadeh

LinkedIn

Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.
