hive-issues mailing list archives

From "Matt McCline (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (HIVE-19689) Not able to do insert into table belonging to a non default namespace - HDFS federated cluster
Date Mon, 16 Jul 2018 09:28:00 GMT

     [ https://issues.apache.org/jira/browse/HIVE-19689?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Matt McCline resolved HIVE-19689.
---------------------------------
    Resolution: Duplicate

Duplicate of HIVE-20091.

> Not able to do insert into table belonging to a non default namespace - HDFS federated cluster
> ----------------------------------------------------------------------------------------------
>
>                 Key: HIVE-19689
>                 URL: https://issues.apache.org/jira/browse/HIVE-19689
>             Project: Hive
>          Issue Type: Bug
>          Components: HiveServer2
>            Reporter: Supreeth Sharma
>            Priority: Blocker
>
> Unable to insert into a table belonging to a non-default namespace in an HDFS federated cluster.
> Steps to reproduce:
> 1) Create an HDFS federated cluster with two namespaces (a hedged config sketch follows the steps below).
> 2) Create an external table whose location is in the non-default namespace.
> {code:java}
> CREATE EXTERNAL TABLE test_ext_tbl2 (id int, name string, dept string) PARTITIONED BY (year int) LOCATION 'hdfs://ns2/tmp/test_ext_tbl2';
> {code}
> 3) Try to insert a row into the newly created table.
> {code:java}
> INSERT INTO test_ext_tbl2 PARTITION (year=2016) VALUES (8,'Henry','CSE');
> {code}
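>
> For orientation (an editorial sketch, not part of the original report): the two-namespace federation from step 1 comes down to a handful of HDFS settings. The nameservice IDs (ns1, ns2) and hostnames below are hypothetical, and this is simplified: a real deployment that addresses namespaces by logical URI also needs client-side resolution for those names (e.g. an HA failover proxy provider per nameservice).
> {code:java}
> // Hedged sketch of a two-namespace federated setup, expressed as a
> // programmatic Hadoop Configuration instead of hdfs-site.xml.
> // All IDs and hosts are assumptions for illustration.
> import org.apache.hadoop.conf.Configuration;
>
> public class FederationConfSketch {
>   public static Configuration federatedConf() {
>     Configuration conf = new Configuration();
>     conf.set("fs.defaultFS", "hdfs://ns1");          // ns1 is the default namespace
>     conf.set("dfs.nameservices", "ns1,ns2");         // two federated namespaces
>     conf.set("dfs.namenode.rpc-address.ns1", "nn1.example.com:8020");
>     conf.set("dfs.namenode.rpc-address.ns2", "nn2.example.com:8020");
>     return conf;
>   }
> }
> {code}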
> The insert from step 3 hangs, and after some time it fails with the error below:
> {code:java}
> ERROR : Vertex failed, vertexName=Map 1, vertexId=vertex_1527031638037_0017_1_00, diagnostics=[Task failed, taskId=task_1527031638037_0017_1_00_000000, diagnostics=[TaskAttempt 0 failed, info=[Error: Error while running task ( failure ) : attempt_1527031638037_0017_1_00_000000_0:java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing writable (null)
> 	at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:296)
> 	at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:250)
> 	at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:374)
> 	at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:73)
> 	at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:61)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1682)
> 	at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:61)
> 	at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:37)
> 	at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
> 	at com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:108)
> 	at com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:41)
> 	at com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:77)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> 	at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing writable (null)
> 	at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:101)
> 	at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.pushRecord(MapRecordSource.java:76)
> 	at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.run(MapRecordProcessor.java:419)
> 	at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:267)
> 	... 16 more
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing writable (null)
> 	at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:563)
> 	at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:92)
> 	... 19 more
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.ql.metadata.HiveException: java.io.IOException: DestHost:destPort ctr-e138-1518143905142-326063-01-000006.hwx.site:8020 , LocalHost:localPort ctr-e138-1518143905142-326063-01-000006.hwx.site/172.27.67.65:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
> 	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:708)
> 	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:863)
> 	at org.apache.hadoop.hive.ql.exec.Operator.baseForward(Operator.java:985)
> 	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:931)
> 	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:918)
> 	at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:95)
> 	at org.apache.hadoop.hive.ql.exec.Operator.baseForward(Operator.java:985)
> 	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:931)
> 	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:918)
> 	at org.apache.hadoop.hive.ql.exec.UDTFOperator.forwardUDTFOutput(UDTFOperator.java:133)
> 	at org.apache.hadoop.hive.ql.udf.generic.UDTFCollector.collect(UDTFCollector.java:45)
> 	at org.apache.hadoop.hive.ql.udf.generic.GenericUDTF.forward(GenericUDTF.java:110)
> 	at org.apache.hadoop.hive.ql.udf.generic.GenericUDTFInline.process(GenericUDTFInline.java:64)
> 	at org.apache.hadoop.hive.ql.exec.UDTFOperator.process(UDTFOperator.java:116)
> 	at org.apache.hadoop.hive.ql.exec.Operator.baseForward(Operator.java:985)
> 	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:931)
> 	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:918)
> 	at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:95)
> 	at org.apache.hadoop.hive.ql.exec.Operator.baseForward(Operator.java:985)
> 	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:931)
> 	at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:125)
> 	at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:148)
> 	at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:550)
> 	... 20 more
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.io.IOException: DestHost:destPort ctr-e138-1518143905142-326063-01-000006.hwx.site:8020 , LocalHost:localPort ctr-e138-1518143905142-326063-01-000006.hwx.site/172.27.67.65:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
> 	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketForFileIdx(FileSinkOperator.java:766)
> 	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:697)
> 	... 42 more
> Caused by: java.io.IOException: DestHost:destPort ctr-e138-1518143905142-326063-01-000006.hwx.site:8020 , LocalHost:localPort ctr-e138-1518143905142-326063-01-000006.hwx.site/172.27.67.65:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> 	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:831)
> 	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:806)
> 	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1503)
> 	at org.apache.hadoop.ipc.Client.call(Client.java:1445)
> 	at org.apache.hadoop.ipc.Client.call(Client.java:1355)
> 	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
> 	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
> 	at com.sun.proxy.$Proxy21.getFileInfo(Unknown Source)
> 	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:900)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
> 	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
> 	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
> 	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
> 	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
> 	at com.sun.proxy.$Proxy22.getFileInfo(Unknown Source)
> 	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1654)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1577)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1574)
> 	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1589)
> 	at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1734)
> 	at org.apache.hadoop.fs.FileSystem.deleteOnExit(FileSystem.java:1677)
> 	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketForFileIdx(FileSinkOperator.java:729)
> 	... 43 more
> Caused by: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
> 	at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:756)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1682)
> 	at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:719)
> 	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:812)
> 	at org.apache.hadoop.ipc.Client$Connection.access$3600(Client.java:410)
> 	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1560)
> 	at org.apache.hadoop.ipc.Client.call(Client.java:1391)
> 	... 66 more
> Caused by: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
> 	at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:173)
> 	at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:390)
> 	at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:614)
> 	at org.apache.hadoop.ipc.Client$Connection.access$2300(Client.java:410)
> 	at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:799)
> 	at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:795)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1682)
> 	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:795)
> 	... 69 more
> {code}
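>
> Editorial note on the trace (my reading, hedged): the FileSinkOperator's write to the hdfs://ns2 table location dies in the RPC layer with "Client cannot authenticate via:[TOKEN, KERBEROS]", which on a Kerberized cluster typically means the Tez task holds a delegation token for the default namenode but not for the second namespace. The duplicate issue, HIVE-20091, tracks the fix on the Hive side. Below is a minimal sketch of the generic mechanism only, using Hadoop's TokenCache API; the paths are hypothetical and this is not the actual Hive patch.
> {code:java}
> // Hedged sketch: obtain delegation tokens for every namenode a job touches,
> // including the non-default ns2 that backs the table location.
> // Paths and nameservice IDs are assumptions for illustration.
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.Path;
> import org.apache.hadoop.mapreduce.security.TokenCache;
> import org.apache.hadoop.security.Credentials;
>
> public class FederatedTokenSketch {
>   public static void main(String[] args) throws Exception {
>     Configuration conf = new Configuration();
>     Credentials creds = new Credentials();
>     // If only the ns1 path were listed here, a task writing to ns2 would fail
>     // exactly as in the trace above: no token, so SASL falls through TOKEN
>     // and KERBEROS and the connection is rejected.
>     Path[] outputs = {
>         new Path("hdfs://ns1/tmp/scratch"),        // default namespace
>         new Path("hdfs://ns2/tmp/test_ext_tbl2")   // table location (non-default)
>     };
>     TokenCache.obtainTokensForNamenodes(creds, outputs, conf);
>   }
> }
> {code}
> On the configuration side, the usual knob for pre-fetching tokens for extra filesystems is mapreduce.job.hdfs-servers (and, I believe, tez.job.fs-servers for Tez), though whether that works around this particular code path is untested here.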



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
