Hi Tom,

In Spark 1.4, we have decoupled the support for Hive's metastore from the other Hive-related parts (the parser, Hive UDFs, and Hive SerDes). The execution engine of Spark SQL in 1.4 always uses Hive 0.13.1. For the metastore connection, you can connect to either a Hive 0.12 or a Hive 0.13.1 metastore. We have removed the old shims and the build profiles for specifying the Hive version, since the execution engine always uses Hive 0.13.1 and the metastore client can be configured at runtime to talk to either a Hive 0.12 or a Hive 0.13.1 metastore.

You can take a look at https://spark.apache.org/docs/latest/sql-programming-guide.html#interacting-with-different-versions-of-hive-metastore for instructions on connecting to a Hive 0.12 metastore.
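For example, a minimal sketch of what that configuration looks like in spark-defaults.conf, based on the properties described in the linked guide (the exact jar-resolution strategy is up to you):

```
# Point the metastore client at Hive 0.12 (execution still uses Hive 0.13.1)
spark.sql.hive.metastore.version   0.12.0

# Resolve the matching Hive client jars from Maven; alternatively, set this
# to a classpath of locally available Hive 0.12 jars
spark.sql.hive.metastore.jars      maven
```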

Let me know if you have any questions.

Thanks,

Yin

On Wed, Jun 17, 2015 at 4:18 PM, Thomas Dudziak <tomdzk@gmail.com> wrote:
So I'm a little confused: has Hive 0.12 support disappeared in 1.4.0? The release notes didn't mention anything, but the documentation doesn't list a way to build for 0.12 anymore (http://spark.apache.org/docs/latest/building-spark.html#building-with-hive-and-jdbc-support; in fact it doesn't list anything other than 0.13), and I don't see any Maven profiles or code for 0.12.

Tom