spark-user mailing list archives

From Marcelo Vanzin <van...@cloudera.com>
Subject Re: Backporting spark 1.1.0 to CDH 5.1.3
Date Mon, 10 Nov 2014 19:52:01 GMT
Hello,

CDH 5.1.3 ships with a version of Hive that's not entirely the same as
the Hive version Spark 1.1 supports. So when building your custom Spark,
make sure you change all the dependency versions to point to the CDH
versions.

IIRC Spark depends on org.spark-project.hive:0.12.0; you'd have to
change it to something like org.apache.hive:0.12.0-cdh5.1.3. You'll
probably run into compilation errors at that point (you can check out
Cloudera's public repo for the patches needed to make Spark 1.0 compile
against CDH's Hive 0.12; see [1]).
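
To make that concrete, here is a rough sketch of the coordinate swap as
an sbt fragment (your test output looks like sbt); in Spark's own Maven
build the equivalent edit is the Hive groupId/version in the POMs. The
repository URL and the exact artifact list below are assumptions, not a
tested recipe:

    // Hypothetical build.sbt fragment -- a sketch only, assuming sbt 0.13 and
    // that Cloudera's public Maven repository is available at this URL.
    resolvers += "cloudera-repos" at "https://repository.cloudera.com/artifactory/cloudera-repos/"

    // Pull the CDH-built Hive JDBC client instead of the stock Apache artifact.
    libraryDependencies += "org.apache.hive" % "hive-jdbc" % "0.12.0-cdh5.1.3"

    // Force any transitively-resolved Hive modules to the same CDH version so
    // the classpath doesn't mix Hive builds.
    dependencyOverrides ++= Set(
      "org.apache.hive" % "hive-jdbc"      % "0.12.0-cdh5.1.3",
      "org.apache.hive" % "hive-exec"      % "0.12.0-cdh5.1.3",
      "org.apache.hive" % "hive-metastore" % "0.12.0-cdh5.1.3",
      "org.apache.hive" % "hive-service"   % "0.12.0-cdh5.1.3"
    )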

If you're still willing to go forward at this point, feel free to ask
questions, although CDH-specific questions would probably be better
asked on our mailing list instead (cdh-users@cloudera.org).

[1] https://github.com/cloudera/spark/commits/cdh5-1.0.0_5.1.0

On Mon, Nov 10, 2014 at 3:58 AM, Zalzberg, Idan (Agoda)
<Idan.Zalzberg@agoda.com> wrote:
> Hello,
>
> I have a big cluster running CDH 5.1.3, which I can’t upgrade to 5.2.0 at the
> current time.
>
> I would like to run Spark on YARN in that cluster.
>
>
>
> I tried to compile Spark against CDH 5.1.3 and got HDFS to work, but I am
> having problems with the connection to Hive:
>
>
>
> java.sql.SQLException: Could not establish connection to jdbc:hive2://localhost.localdomain:10000/:
> Required field 'serverProtocolVersion' is unset!
> Struct:TOpenSessionResp(status:TStatus(statusCode:SUCCESS_STATUS), serverProtocolVersion:null,
> sessionHandle:TSessionHandle(sessionId:THandleIdentifier(guid:C7 86 85 3D 38 91 41 A1 AF 02 83 DA 80 74 A5 B1,
> secret:62 80 00 99 D6 73 48 9B 81 13 FB D7 DB 32 32 26)), configuration:{})
> [info]   at org.apache.hive.jdbc.HiveConnection.openSession(HiveConnection.java:246)
> [info]   at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:132)
> [info]   at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
> [info]   at java.sql.DriverManager.getConnection(DriverManager.java:571)
> [info]   at java.sql.DriverManager.getConnection(DriverManager.java:215)
> [info]   at com.agoda.mse.hadooputils.HiveTools$.getHiveConnection(HiveTools.scala:135)
> [info]   at com.agoda.mse.hadooputils.HiveTools$.withConnection(HiveTools.scala:19)
> [info]   at com.agoda.mse.hadooputils.HiveTools$.withStatement(HiveTools.scala:30)
> [info]   at com.agoda.mse.hadooputils.HiveTools$.copyFileToHdfsThenRunQuery(HiveTools.scala:110)
> [info]   at SparkAssemblyTest$$anonfun$4.apply$mcV$sp(SparkAssemblyTest.scala:41)
>
> This happens when I try to create a Hive connection myself, using the
> hive-jdbc-cdh5.1.3 package (I can connect if I don’t have Spark on the
> classpath).
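
For reference, the connection attempt being described is roughly the
following minimal sketch; host and port are taken from the error above,
the user is a placeholder, and it assumes only the CDH 5.1.3 hive-jdbc
driver (plus its dependencies) on the classpath:

    // Minimal sketch of a plain HiveServer2 JDBC connection, independent of Spark.
    // Host/port come from the error message above; user/password are placeholders.
    import java.sql.DriverManager

    object HiveJdbcSmokeTest {
      def main(args: Array[String]): Unit = {
        Class.forName("org.apache.hive.jdbc.HiveDriver")
        val conn = DriverManager.getConnection(
          "jdbc:hive2://localhost.localdomain:10000/", "hive", "")
        try {
          // Any trivial statement is enough to exercise openSession().
          val rs = conn.createStatement().executeQuery("SHOW TABLES")
          while (rs.next()) println(rs.getString(1))
        } finally {
          conn.close()
        }
      }
    }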
>
>
>
> How can I get the Spark jar to be consistent with hive-jdbc for CDH 5.1.3?
>
> Thanks



-- 
Marcelo


