hive-issues mailing list archives

From "Gopal Vijayaraghavan (Jira)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-22442) Datanucleus cannot map MTableColumnStatistics.bitVector when writing to HiveMetastore
Date Fri, 01 Nov 2019 18:04:00 GMT

    [ https://issues.apache.org/jira/browse/HIVE-22442?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16965014#comment-16965014 ]

Gopal Vijayaraghavan commented on HIVE-22442:
---------------------------------------------

Must be the driver bug workaround in DataNucleus's MySQLAdapter:

https://github.com/datanucleus/datanucleus-rdbms/blob/master/src/main/java/org/datanucleus/store/rdbms/adapter/MySQLAdapter.java#L221
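For reference, the adapter DataNucleus picks (and any driver-specific workaround it applies there) is keyed off what the JDBC driver reports about the database. A minimal sketch for printing those values, assuming a plain JDBC connection to the metastore database (the URL, user, and password are placeholders, not values from this issue):

{noformat}
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;

public class MetastoreDbInfo {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; point these at the actual metastore database.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://metastore-host:3306/hive", "hiveuser", "hivepass")) {
            DatabaseMetaData md = conn.getMetaData();
            // DataNucleus selects its datastore adapter (e.g. MySQLAdapter) and any
            // driver-version-dependent workarounds based on these reported values.
            System.out.println("Product name:    " + md.getDatabaseProductName());
            System.out.println("Product version: " + md.getDatabaseProductVersion());
            System.out.println("Driver name:     " + md.getDriverName());
            System.out.println("Driver version:  " + md.getDriverVersion());
        }
    }
}
{noformat}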


> Datanucleus cannot map MTableColumnStatistics.bitVector when writing to HiveMetastore

> --------------------------------------------------------------------------------------
>
>                 Key: HIVE-22442
>                 URL: https://issues.apache.org/jira/browse/HIVE-22442
>             Project: Hive
>          Issue Type: Bug
>          Components: Database/Schema, Metastore, SQL, Standalone Metastore
>    Affects Versions: 3.1.2
>            Reporter: Rentao Wu
>            Priority: Major
>
> I'm seeing insert statements fail in StatsTask: while persisting MTableColumnStatistics into the metastore, the new bitVector column (defined to map to the BLOB SQL type) fails to be mapped, and the insert is aborted. Any ideas on how to fix this issue? (An illustrative declaration of this shape is sketched after the stack trace below.)
>  
> I'm using:
> Hive 3.1.2
> Hadoop 3.2.1
> Hive Metastore: MariaDB 5.5.64
> DataNucleus: (default versions defined in pom.xml)
> /usr/lib/hive/lib/datanucleus-api-jdo-4.2.4.jar
> /usr/lib/hive/lib/datanucleus-core-4.1.17.jar
> /usr/lib/hive/lib/datanucleus-rdbms-4.1.19.jar
>  
> {noformat}
> 2019-10-29T19:01:11,799 ERROR [aba24ff8-5560-411d-a1ee-141871cd4b4b main([])]: exec.StatsTask ()) - Failed to run stats task
>  org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Failed to generate new Mapping of type org.datanucleus.store.rdbms.mapping.java.ArrayMapping, exception : JDBC type BLOB declared for field "org.apache.hadoop.hive.metastore.model.MTableColumnStatistics.bitVector" of java type java.io.Serializable cant be mapped for this datastore.
>  JDBC type BLOB declared for field "org.apache.hadoop.hive.metastore.model.MTableColumnStatistics.bitVector" of java type java.io.Serializable cant be mapped for this datastore.
>  org.datanucleus.exceptions.NucleusException: JDBC type BLOB declared for field "org.apache.hadoop.hive.metastore.model.MTableColumnStatistics.bitVector" of java type java.io.Serializable cant be mapped for this datastore.
>  at org.datanucleus.store.rdbms.mapping.RDBMSMappingManager.getDatastoreMappingClass(RDBMSMappingManager.java:1386)
>  at org.datanucleus.store.rdbms.mapping.RDBMSMappingManager.createDatastoreMapping(RDBMSMappingManager.java:1616)
>  at org.datanucleus.store.rdbms.mapping.java.SingleFieldMapping.prepareDatastoreMapping(SingleFieldMapping.java:59)
>  at org.datanucleus.store.rdbms.mapping.java.AbstractContainerMapping.prepareDatastoreMapping(AbstractContainerMapping.java:99)
>  at org.datanucleus.store.rdbms.mapping.java.SingleFieldMapping.initialize(SingleFieldMapping.java:48)
>  at org.datanucleus.store.rdbms.mapping.java.AbstractContainerMapping.initialize(AbstractContainerMapping.java:67)
>  at org.datanucleus.store.rdbms.mapping.RDBMSMappingManager.getMapping(RDBMSMappingManager.java:482)
>  at org.datanucleus.store.rdbms.table.ClassTable.manageMembers(ClassTable.java:536)
>  at org.datanucleus.store.rdbms.table.ClassTable.manageClass(ClassTable.java:442)
>  at org.datanucleus.store.rdbms.table.ClassTable.initializeForClass(ClassTable.java:1270)
>  at org.datanucleus.store.rdbms.table.ClassTable.initialize(ClassTable.java:276)
>  at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.initializeClassTables(RDBMSStoreManager.java:3279)
>  at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2889)
>  at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:119)
>  at org.datanucleus.store.rdbms.RDBMSStoreManager.manageClasses(RDBMSStoreManager.java:1627)
>  at org.datanucleus.store.rdbms.RDBMSStoreManager.getDatastoreClass(RDBMSStoreManager.java:672)
>  at org.datanucleus.store.rdbms.RDBMSStoreManager.getPropertiesForGenerator(RDBMSStoreManager.java:2088)
>  at org.datanucleus.store.AbstractStoreManager.getStrategyValue(AbstractStoreManager.java:1271)
>  at org.datanucleus.ExecutionContextImpl.newObjectId(ExecutionContextImpl.java:3760)
>  at org.datanucleus.state.StateManagerImpl.setIdentity(StateManagerImpl.java:2267)
>  at org.datanucleus.state.StateManagerImpl.initialiseForPersistentNew(StateManagerImpl.java:484)
>  at org.datanucleus.state.StateManagerImpl.initialiseForPersistentNew(StateManagerImpl.java:120)
>  at org.datanucleus.state.ObjectProviderFactoryImpl.newForPersistentNew(ObjectProviderFactoryImpl.java:218)
>  at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2079)
>  at org.datanucleus.ExecutionContextImpl.persistObjectWork(ExecutionContextImpl.java:1923)
>  at org.datanucleus.ExecutionContextImpl.persistObject(ExecutionContextImpl.java:1778)
>  at org.datanucleus.ExecutionContextThreadedImpl.persistObject(ExecutionContextThreadedImpl.java:217)
>  at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:724)
>  at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:749)
>  at org.apache.hadoop.hive.metastore.ObjectStore.writeMTableColumnStatistics(ObjectStore.java:8152)
>  at org.apache.hadoop.hive.metastore.ObjectStore.updateTableColumnStatistics(ObjectStore.java:8252)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:498)
>  at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:97)
>  at com.sun.proxy.$Proxy25.updateTableColumnStatistics(Unknown Source)
>  at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.update_table_column_statistics(HiveMetaStore.java:5759)
>  at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.set_aggr_stats_for(HiveMetaStore.java:7386)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:498)
>  at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:147)
>  at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:108)
>  at com.sun.proxy.$Proxy34.set_aggr_stats_for(Unknown Source)
>  at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$set_aggr_stats_for.getResult(ThriftHiveMetastore.java:17017)
>  at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$set_aggr_stats_for.getResult(ThriftHiveMetastore.java:17001)
>  at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>  at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:111)
>  at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:107)
>  at java.security.AccessController.doPrivileged(Native Method)
>  at javax.security.auth.Subject.doAs(Subject.java:422)
>  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
>  at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:119)
>  at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
>  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>  at java.lang.Thread.run(Thread.java:748)
>  )
>  at org.apache.hadoop.hive.ql.metadata.Hive.setPartitionColumnStatistics(Hive.java:4401) ~[hive-exec-3.1.2.jar:3.1.2]
>  at org.apache.hadoop.hive.ql.stats.ColStatsProcessor.persistColumnStats(ColStatsProcessor.java:179) ~[hive-exec-3.1.2.jar:3.1.2]
>  at org.apache.hadoop.hive.ql.stats.ColStatsProcessor.process(ColStatsProcessor.java:83) ~[hive-exec-3.1.2.jar:3.1.2]
>  at org.apache.hadoop.hive.ql.exec.StatsTask.execute(StatsTask.java:108) ~[hive-exec-3.1.2.jar:3.1.2]
>  at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205) ~[hive-exec-3.1.2.jar:3.1.2]
>  at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97) ~[hive-exec-3.1.2.jar:3.1.2]
>  at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664) ~[hive-exec-3.1.2.jar:3.1.2]
>  at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335) ~[hive-exec-3.1.2.jar:3.1.2]
>  at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011) ~[hive-exec-3.1.2.jar:3.1.2]
>  at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709) ~[hive-exec-3.1.2.jar:3.1.2]
>  at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703) ~[hive-exec-3.1.2.jar:3.1.2]
>  at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157) ~[hive-exec-3.1.2.jar:3.1.2]
>  at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218) ~[hive-exec-3.1.2.jar:3.1.2]
>  at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239) ~[hive-cli-3.1.2.jar:3.1.2]
>  at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188) ~[hive-cli-3.1.2.jar:3.1.2]
>  at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402) ~[hive-cli-3.1.2.jar:3.1.2]
>  at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:335) ~[hive-cli-3.1.2.jar:3.1.2]
>  at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:471) ~[hive-cli-3.1.2.jar:3.1.2]
>  at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:487) ~[hive-cli-3.1.2.jar:3.1.2]
>  at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:793) ~[hive-cli-3.1.2.jar:3.1.2]
>  at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) ~[hive-cli-3.1.2.jar:3.1.2]
>  at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683) ~[hive-cli-3.1.2.jar:3.1.2]
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_232]
>  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_232]
>  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_232]
>  at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_232]
>  at org.apache.hadoop.util.RunJar.run(RunJar.java:323) ~[hadoop-common-3.2.1.jar:?]
>  at org.apache.hadoop.util.RunJar.main(RunJar.java:236) ~[hadoop-common-3.2.1.jar:?]
> {noformat}
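The field that fails to map above, MTableColumnStatistics.bitVector, is a binary field declared to map to the BLOB JDBC type (per the description). As an illustration only, a hypothetical annotated equivalent of a declaration of that shape (Hive's actual mapping lives in the metastore's package.jdo XML, not in annotations):

{noformat}
import javax.jdo.annotations.Column;
import javax.jdo.annotations.PersistenceCapable;
import javax.jdo.annotations.Persistent;

// Hypothetical class, for illustration; not Hive code.
@PersistenceCapable
public class ColumnStatsExample {
    // A byte[] field forced to the BLOB JDBC type. DataNucleus must find a BLOB
    // datastore mapping for the selected adapter, or it fails as in the trace above.
    @Persistent
    @Column(name = "BIT_VECTOR", jdbcType = "BLOB", allowsNull = "true")
    private byte[] bitVector;
}
{noformat}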



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
