spark-dev mailing list archives

From Raymond Honderdors <>
Subject RE: Build with Thrift Server & Scala 2.11
Date Tue, 05 Apr 2016 13:22:45 GMT
I can see that the build is successful
(-Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -Dscala-2.11 -DskipTests
clean package)

However, the documentation page still says:

Building With Hive and JDBC Support
To enable Hive integration for Spark SQL along with its JDBC server and CLI, add the -Phive
and -Phive-thriftserver profiles to your existing build options. By default Spark will build
with Hive 0.13.1 bindings.

# Apache Hadoop 2.4.X with Hive 13 support
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver -DskipTests clean package
Building for Scala 2.11
To produce a Spark package compiled with Scala 2.11, use the -Dscala-2.11 property:

./dev/ 2.11
mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
Spark does not yet support its JDBC component for Scala 2.11.
Source:
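For reference, putting the two documented steps together, the full sequence would presumably look like this (a sketch only: the script name is my assumption, since the path quoted above is truncated, and the profiles are the ones from my build at the top):

```
# Switch the build to Scala 2.11 first (script name assumed; the docs path above is cut off)
./dev/change-scala-version.sh 2.11
# Then build with the Hive and JDBC-server profiles
mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -Dscala-2.11 -DskipTests clean package
```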

When I try to start the thrift server I get the following error:
16/04/05 16:09:11 INFO BlockManagerMaster: Registered BlockManager
16/04/05 16:09:12 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: namenode
                at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(
                at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(
                at org.apache.hadoop.hdfs.DFSClient.<init>(
                at org.apache.hadoop.hdfs.DFSClient.<init>(
                at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(
                at org.apache.hadoop.fs.FileSystem.createFileSystem(
                at org.apache.hadoop.fs.FileSystem.access$200(
                at org.apache.hadoop.fs.FileSystem$Cache.getInternal(
                at org.apache.hadoop.fs.FileSystem$Cache.get(
                at org.apache.hadoop.fs.FileSystem.get(
                at org.apache.spark.util.Utils$.getHadoopFileSystem(Utils.scala:1667)
                at org.apache.spark.scheduler.EventLoggingListener.<init>(EventLoggingListener.scala:67)
                at org.apache.spark.SparkContext.<init>(SparkContext.scala:517)
                at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:57)
                at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:77)
                at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
                at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                at sun.reflect.NativeMethodAccessorImpl.invoke(
                at sun.reflect.DelegatingMethodAccessorImpl.invoke(
                at java.lang.reflect.Method.invoke(
                at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:726)
                at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
                at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
                at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:122)
                at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: namenode
                ... 26 more
16/04/05 16:09:12 INFO SparkUI: Stopped Spark web UI at
16/04/05 16:09:12 INFO SparkDeploySchedulerBackend: Shutting down all executors
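For what it is worth, the trace above goes through EventLoggingListener and Utils.getHadoopFileSystem, so my guess is that the "namenode" in the IllegalArgumentException is the hostname from the event-log directory URI rather than a build problem. If spark-defaults.conf contains an entry like the following (hypothetical values), the host "namenode" would have to be resolvable from the machine starting the thrift server:

```
# spark-defaults.conf -- hypothetical values; "namenode" is the literal
# host from the error and must resolve (or be replaced by a real host)
spark.eventLog.enabled  true
spark.eventLog.dir      hdfs://namenode:8020/spark-logs
```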

Raymond Honderdors
Team Lead Analytics BI
Business Intelligence Developer
T +972.7325.3569

From: Reynold Xin
Sent: Tuesday, April 05, 2016 3:57 PM
To: Raymond Honderdors <>
Subject: Re: Build with Thrift Server & Scala 2.11

What do you mean? The Jenkins build for Spark uses 2.11 and also builds the thrift server.

On Tuesday, April 5, 2016, Raymond Honderdors wrote:
Is anyone looking into this one, Build with Thrift Server & Scala 2.11?
If so, when can we expect it?

Raymond Honderdors
Team Lead Analytics BI
Business Intelligence Developer
T +972.7325.3569

