spark-user mailing list archives

From Cheng Lian <lian.cs....@gmail.com>
Subject Re: Spark 1.3.1 SparkSQL metastore exceptions
Date Tue, 09 Jun 2015 15:17:33 GMT
Seems that you're using a DB2-backed Hive metastore? I'm not sure 
whether Hive 0.12.0 officially supports DB2, but probably not, since I 
didn't find any DB2 scripts under the metastore/scripts/upgrade folder 
in the Hive source tree. The nested DB2 error (SQLCODE=-206, 
SQLSTATE=42703) means the generated metastore query referenced an 
undefined column name (here SUBQ.A0.INPUT_FORMAT), which is what you'd 
expect if DataNucleus emits SQL that doesn't match the DB2 schema.
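
If you want to double-check which metastore databases a Hive release 
ships schema scripts for, listing the subfolders of that upgrade 
directory is a quick test. A minimal Scala sketch (runs in spark-shell; 
the source-tree path below is a hypothetical local path, adjust to 
wherever you unpacked Hive 0.12.0):

  import java.io.File

  // Hive ships metastore schema/upgrade scripts per supported database;
  // a missing subfolder is a strong hint that backend isn't supported.
  val upgradeDir = new File("apache-hive-0.12.0-src/metastore/scripts/upgrade")
  Option(upgradeDir.listFiles).getOrElse(Array.empty[File])
    .filter(_.isDirectory)
    .map(_.getName)
    .sorted
    .foreach(println)

On 0.12.0 I'd expect this to print entries like derby, mysql, oracle 
and postgres, with no db2 entry.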

Cheng

On 6/9/15 8:28 PM, Needham, Guy wrote:
> Hi,
> I’m using Spark 1.3.1 to insert into a Hive 0.12 table from a SparkSQL 
> query. The query is a very simple select from a dummy Hive table used 
> for benchmarking.
> I’m using a CREATE TABLE AS SELECT statement to do the insert. Whether 
> I do that or an INSERT OVERWRITE, I get the same Hive exception, 
> "unable to alter table", along with some Hive metastore errors.
> The data is inserted into the Hive table as expected; however, I get a 
> very long stack trace. Does anyone know what the stack trace means and 
> how I can avoid generating it every time I insert into a table?
> scala> hiveContext.sql("create table benchmarking.spark_logins_benchmark as select * from benchmarking.logins_benchmark limit 10")
> org.apache.hadoop.hive.ql.metadata.HiveException: Unable to alter table.
>         at org.apache.hadoop.hive.ql.metadata.Hive.alterTable(Hive.java:387)
>         at org.apache.hadoop.hive.ql.metadata.Hive.loadTable(Hive.java:1448)
>         at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.sideEffectResult$lzycompute(InsertIntoHiveTable.scala:235)
>         at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.sideEffectResult(InsertIntoHiveTable.scala:123)
>         at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.execute(InsertIntoHiveTable.scala:255)
>         at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:1099)
>         at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:1099)
>         at org.apache.spark.sql.hive.execution.CreateTableAsSelect.run(CreateTableAsSelect.scala:70)
>         at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:54)
>         at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:54)
>         at org.apache.spark.sql.execution.ExecutedCommand.execute(commands.scala:64)
>         at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:1099)
>         at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:1099)
>         at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:147)
>         at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:130)
>         at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
>         at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:101)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:24)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
>         at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
>         at $iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
>         at $iwC$$iwC$$iwC.<init>(<console>:37)
>         at $iwC$$iwC.<init>(<console>:39)
>         at $iwC.<init>(<console>:41)
>         at <init>(<console>:43)
>         at .<init>(<console>:47)
>         at .<clinit>(<console>)
>         at .<init>(<console>:7)
>         at .<clinit>(<console>)
>         at $print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
>         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
>         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
>         at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
>         at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
>         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: MetaException(message:javax.jdo.JDOException: Exception thrown when executing query
>         at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:596)
>         at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:252)
>         at org.apache.hadoop.hive.metastore.ObjectStore.listStorageDescriptorsWithCD(ObjectStore.java:2344)
>         at org.apache.hadoop.hive.metastore.ObjectStore.removeUnusedColumnDescriptor(ObjectStore.java:2290)
>         at org.apache.hadoop.hive.metastore.ObjectStore.copyMSD(ObjectStore.java:2258)
>         at org.apache.hadoop.hive.metastore.ObjectStore.alterTable(ObjectStore.java:2115)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:124)
>         at com.sun.proxy.$Proxy14.alterTable(Unknown Source)
>         at org.apache.hadoop.hive.metastore.HiveAlterHandler.alterTable(HiveAlterHandler.java:200)
>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.alter_table_with_environment_context(HiveMetaStore.java:2388)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:103)
>         at com.sun.proxy.$Proxy15.alter_table_with_environment_context(Unknown Source)
>         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.alter_table(HiveMetaStoreClient.java:215)
>         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.alter_table(HiveMetaStoreClient.java:210)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>         at com.sun.proxy.$Proxy16.alter_table(Unknown Source)
>         at org.apache.hadoop.hive.ql.metadata.Hive.alterTable(Hive.java:385)
>         at org.apache.hadoop.hive.ql.metadata.Hive.loadTable(Hive.java:1448)
>         at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.sideEffectResult$lzycompute(InsertIntoHiveTable.scala:235)
>         at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.sideEffectResult(InsertIntoHiveTable.scala:123)
>         at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.execute(InsertIntoHiveTable.scala:255)
>         at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:1099)
>         at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:1099)
>         at org.apache.spark.sql.hive.execution.CreateTableAsSelect.run(CreateTableAsSelect.scala:70)
>         at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:54)
>         at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:54)
>         at org.apache.spark.sql.execution.ExecutedCommand.execute(commands.scala:64)
>         at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:1099)
>         at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:1099)
>         at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:147)
>         at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:130)
>         at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
>         at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:101)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:24)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
>         at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
>         at $iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
>         at $iwC$$iwC$$iwC.<init>(<console>:37)
>         at $iwC$$iwC.<init>(<console>:39)
>         at $iwC.<init>(<console>:41)
>         at <init>(<console>:43)
>         at .<init>(<console>:47)
>         at .<clinit>(<console>)
>         at .<init>(<console>:7)
>         at .<clinit>(<console>)
>         at $print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
>         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
>         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
>         at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
>         at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
>         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> NestedThrowablesStackTrace:
> com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-206, SQLSTATE=42703, SQLERRMC=SUBQ.A0.INPUT_FORMAT, DRIVER=3.67.26
>         at com.ibm.db2.jcc.am.dd.a(dd.java:749)
>         at com.ibm.db2.jcc.am.dd.a(dd.java:66)
>         at com.ibm.db2.jcc.am.dd.a(dd.java:135)
>         at com.ibm.db2.jcc.am.po.c(po.java:2763)
>         at com.ibm.db2.jcc.am.po.d(po.java:2751)
>         at com.ibm.db2.jcc.am.po.a(po.java:2200)
>         at com.ibm.db2.jcc.am.qo.a(qo.java:7384)
>         at com.ibm.db2.jcc.t4.ab.h(ab.java:141)
>         at com.ibm.db2.jcc.t4.ab.b(ab.java:41)
>         at com.ibm.db2.jcc.t4.o.a(o.java:32)
>         at com.ibm.db2.jcc.t4.tb.i(tb.java:145)
>         at com.ibm.db2.jcc.am.po.ib(po.java:2169)
>         at com.ibm.db2.jcc.am.qo.tc(qo.java:3547)
>         at com.ibm.db2.jcc.am.qo.b(qo.java:4345)
>         at com.ibm.db2.jcc.am.qo.gc(qo.java:739)
>         at com.ibm.db2.jcc.am.qo.executeQuery(qo.java:708)
>         at com.jolbox.bonecp.PreparedStatementHandle.executeQuery(PreparedStatementHandle.java:172)
>         at org.datanucleus.store.rdbms.ParamLoggingPreparedStatement.executeQuery(ParamLoggingPreparedStatement.java:381)
>         at org.datanucleus.store.rdbms.SQLController.executeStatementQuery(SQLController.java:504)
>         at org.datanucleus.store.rdbms.query.JDOQLQuery.performExecute(JDOQLQuery.java:637)
>         at org.datanucleus.store.query.Query.executeQuery(Query.java:1786)
>         at org.datanucleus.store.query.Query.executeWithArray(Query.java:1672)
>         at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:243)
>         at org.apache.hadoop.hive.metastore.ObjectStore.listStorageDescriptorsWithCD(ObjectStore.java:2344)
>         at org.apache.hadoop.hive.metastore.ObjectStore.removeUnusedColumnDescriptor(ObjectStore.java:2290)
>         at org.apache.hadoop.hive.metastore.ObjectStore.copyMSD(ObjectStore.java:2258)
>         at org.apache.hadoop.hive.metastore.ObjectStore.alterTable(ObjectStore.java:2115)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:124)
>         at com.sun.proxy.$Proxy14.alterTable(Unknown Source)
>         at org.apache.hadoop.hive.metastore.HiveAlterHandler.alterTable(HiveAlterHandler.java:200)
>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.alter_table_with_environment_context(HiveMetaStore.java:2388)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:103)
>         at com.sun.proxy.$Proxy15.alter_table_with_environment_context(Unknown Source)
>         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.alter_table(HiveMetaStoreClient.java:215)
>         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.alter_table(HiveMetaStoreClient.java:210)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>         at com.sun.proxy.$Proxy16.alter_table(Unknown Source)
>         at org.apache.hadoop.hive.ql.metadata.Hive.alterTable(Hive.java:385)
>         at org.apache.hadoop.hive.ql.metadata.Hive.loadTable(Hive.java:1448)
>         at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.sideEffectResult$lzycompute(InsertIntoHiveTable.scala:235)
>         at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.sideEffectResult(InsertIntoHiveTable.scala:123)
>         at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.execute(InsertIntoHiveTable.scala:255)
>         at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:1099)
>         at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:1099)
>         at org.apache.spark.sql.hive.execution.CreateTableAsSelect.run(CreateTableAsSelect.scala:70)
>         at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:54)
>         at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:54)
>         at org.apache.spark.sql.execution.ExecutedCommand.execute(commands.scala:64)
>         at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:1099)
>         at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:1099)
>         at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:147)
>         at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:130)
>         at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
>         at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:101)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:24)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
>         at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
>         at $iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
>         at $iwC$$iwC$$iwC.<init>(<console>:37)
>         at $iwC$$iwC.<init>(<console>:39)
>         at $iwC.<init>(<console>:41)
>         at <init>(<console>:43)
>         at .<init>(<console>:47)
>         at .<clinit>(<console>)
>         at .<init>(<console>:7)
>         at .<clinit>(<console>)
>         at $print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
>         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
>         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
>         at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
>         at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
>         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> )
>         at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:150)
>         at com.sun.proxy.$Proxy15.alter_table_with_environment_context(Unknown Source)
>         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.alter_table(HiveMetaStoreClient.java:215)
>         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.alter_table(HiveMetaStoreClient.java:210)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>         at com.sun.proxy.$Proxy16.alter_table(Unknown Source)
>         at org.apache.hadoop.hive.ql.metadata.Hive.alterTable(Hive.java:385)
>         ... 62 more
> scala> hiveContext.sql("insert overwrite table benchmarking.spark_logins_benchmark select * from benchmarking.logins_benchmark limit 10")
> org.apache.hadoop.hive.ql.metadata.HiveException: Unable to alter table.
>         at org.apache.hadoop.hive.ql.metadata.Hive.alterTable(Hive.java:387)
>         at org.apache.hadoop.hive.ql.metadata.Hive.loadTable(Hive.java:1448)
>         at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.sideEffectResult$lzycompute(InsertIntoHiveTable.scala:235)
>         at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.sideEffectResult(InsertIntoHiveTable.scala:123)
>         at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.execute(InsertIntoHiveTable.scala:255)
>         at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:1099)
>         at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:1099)
>         at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:147)
>         at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:130)
>         at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
>         at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:101)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:24)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
>         at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
>         at $iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
>         at $iwC$$iwC$$iwC.<init>(<console>:37)
>         at $iwC$$iwC.<init>(<console>:39)
>         at $iwC.<init>(<console>:41)
>         at <init>(<console>:43)
>         at .<init>(<console>:47)
>         at .<clinit>(<console>)
>         at .<init>(<console>:7)
>         at .<clinit>(<console>)
>         at $print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
>         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
>         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
>         at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
>         at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
>         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: MetaException(message:javax.jdo.JDOException: Exception thrown when executing query
>         at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:596)
>         at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:252)
>         at org.apache.hadoop.hive.metastore.ObjectStore.listStorageDescriptorsWithCD(ObjectStore.java:2344)
>         at org.apache.hadoop.hive.metastore.ObjectStore.removeUnusedColumnDescriptor(ObjectStore.java:2290)
>         at org.apache.hadoop.hive.metastore.ObjectStore.copyMSD(ObjectStore.java:2258)
>         at org.apache.hadoop.hive.metastore.ObjectStore.alterTable(ObjectStore.java:2115)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:124)
>         at com.sun.proxy.$Proxy14.alterTable(Unknown Source)
>         at org.apache.hadoop.hive.metastore.HiveAlterHandler.alterTable(HiveAlterHandler.java:200)
>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.alter_table_with_environment_context(HiveMetaStore.java:2388)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:103)
>         at com.sun.proxy.$Proxy15.alter_table_with_environment_context(Unknown Source)
>         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.alter_table(HiveMetaStoreClient.java:215)
>         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.alter_table(HiveMetaStoreClient.java:210)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>         at com.sun.proxy.$Proxy16.alter_table(Unknown Source)
>         at org.apache.hadoop.hive.ql.metadata.Hive.alterTable(Hive.java:385)
>         at org.apache.hadoop.hive.ql.metadata.Hive.loadTable(Hive.java:1448)
>         at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.sideEffectResult$lzycompute(InsertIntoHiveTable.scala:235)
>         at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.sideEffectResult(InsertIntoHiveTable.scala:123)
>         at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.execute(InsertIntoHiveTable.scala:255)
>         at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:1099)
>         at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:1099)
>         at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:147)
>         at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:130)
>         at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
>         at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:101)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:24)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
>         at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
>         at $iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
>         at $iwC$$iwC$$iwC.<init>(<console>:37)
>         at $iwC$$iwC.<init>(<console>:39)
>         at $iwC.<init>(<console>:41)
>         at <init>(<console>:43)
>         at .<init>(<console>:47)
>         at .<clinit>(<console>)
>         at .<init>(<console>:7)
>         at .<clinit>(<console>)
>         at $print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
>         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
>         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
>         at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
>         at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
>         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> NestedThrowablesStackTrace:
> com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-206, SQLSTATE=42703, SQLERRMC=SUBQ.A0.INPUT_FORMAT, DRIVER=3.67.26
>         at com.ibm.db2.jcc.am.dd.a(dd.java:749)
>         at com.ibm.db2.jcc.am.dd.a(dd.java:66)
>         at com.ibm.db2.jcc.am.dd.a(dd.java:135)
>         at com.ibm.db2.jcc.am.po.c(po.java:2763)
>         at com.ibm.db2.jcc.am.po.d(po.java:2751)
>         at com.ibm.db2.jcc.am.po.a(po.java:2200)
>         at com.ibm.db2.jcc.am.qo.a(qo.java:7384)
>         at com.ibm.db2.jcc.t4.ab.h(ab.java:141)
>         at com.ibm.db2.jcc.t4.ab.b(ab.java:41)
>         at com.ibm.db2.jcc.t4.o.a(o.java:32)
>         at com.ibm.db2.jcc.t4.tb.i(tb.java:145)
>         at com.ibm.db2.jcc.am.po.ib(po.java:2169)
>         at com.ibm.db2.jcc.am.qo.tc(qo.java:3547)
>         at com.ibm.db2.jcc.am.qo.b(qo.java:4345)
>         at com.ibm.db2.jcc.am.qo.gc(qo.java:739)
>         at com.ibm.db2.jcc.am.qo.executeQuery(qo.java:708)
>         at com.jolbox.bonecp.PreparedStatementHandle.executeQuery(PreparedStatementHandle.java:172)
>         at org.datanucleus.store.rdbms.ParamLoggingPreparedStatement.executeQuery(ParamLoggingPreparedStatement.java:381)
>         at org.datanucleus.store.rdbms.SQLController.executeStatementQuery(SQLController.java:504)
>         at org.datanucleus.store.rdbms.query.JDOQLQuery.performExecute(JDOQLQuery.java:637)
>         at org.datanucleus.store.query.Query.executeQuery(Query.java:1786)
>         at org.datanucleus.store.query.Query.executeWithArray(Query.java:1672)
>         at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:243)
>         at org.apache.hadoop.hive.metastore.ObjectStore.listStorageDescriptorsWithCD(ObjectStore.java:2344)
>         at org.apache.hadoop.hive.metastore.ObjectStore.removeUnusedColumnDescriptor(ObjectStore.java:2290)
>         at org.apache.hadoop.hive.metastore.ObjectStore.copyMSD(ObjectStore.java:2258)
>         at org.apache.hadoop.hive.metastore.ObjectStore.alterTable(ObjectStore.java:2115)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:124)
>         at com.sun.proxy.$Proxy14.alterTable(Unknown Source)
>         at org.apache.hadoop.hive.metastore.HiveAlterHandler.alterTable(HiveAlterHandler.java:200)
>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.alter_table_with_environment_context(HiveMetaStore.java:2388)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:103)
>         at com.sun.proxy.$Proxy15.alter_table_with_environment_context(Unknown Source)
>         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.alter_table(HiveMetaStoreClient.java:215)
>         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.alter_table(HiveMetaStoreClient.java:210)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>         at com.sun.proxy.$Proxy16.alter_table(Unknown Source)
>         at org.apache.hadoop.hive.ql.metadata.Hive.alterTable(Hive.java:385)
>         at org.apache.hadoop.hive.ql.metadata.Hive.loadTable(Hive.java:1448)
>         at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.sideEffectResult$lzycompute(InsertIntoHiveTable.scala:235)
>         at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.sideEffectResult(InsertIntoHiveTable.scala:123)
>         at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.execute(InsertIntoHiveTable.scala:255)
>         at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:1099)
>         at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:1099)
>         at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:147)
>         at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:130)
>         at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
>         at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:101)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:24)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
>         at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
>         at $iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
>         at $iwC$$iwC$$iwC.<init>(<console>:37)
>         at $iwC$$iwC.<init>(<console>:39)
>         at $iwC.<init>(<console>:41)
>         at <init>(<console>:43)
>         at .<init>(<console>:47)
>         at .<clinit>(<console>)
>         at .<init>(<console>:7)
>         at .<clinit>(<console>)
>         at $print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
>         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
>         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
>         at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
>         at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
>         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> )
>         at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:150)
>         at com.sun.proxy.$Proxy15.alter_table_with_environment_context(Unknown Source)
>         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.alter_table(HiveMetaStoreClient.java:215)
>         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.alter_table(HiveMetaStoreClient.java:210)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>         at com.sun.proxy.$Proxy16.alter_table(Unknown Source)
>         at org.apache.hadoop.hive.ql.metadata.Hive.alterTable(Hive.java:385)
>         ... 56 more
> Regards,
> Guy Needham | Data Discovery
> Virgin Media   | Technology and Transformation | Data
> Bartley Wood Business Park, Hook, Hampshire RG27 9UP
> D 01256 75 3362
> I welcome VSRE emails. Learn more at http://vsre.info/

