spark-issues mailing list archives

From "Shridhar Ramachandran (JIRA)" <j...@apache.org>
Subject [jira] [Comment Edited] (SPARK-5159) Thrift server does not respect hive.server2.enable.doAs=true
Date Wed, 01 Mar 2017 07:29:46 GMT

    [ https://issues.apache.org/jira/browse/SPARK-5159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15889677#comment-15889677 ]

Shridhar Ramachandran edited comment on SPARK-5159 at 3/1/17 7:29 AM:
----------------------------------------------------------------------

I am facing this issue as well, on both 1.6 and 2.0. Some suggested workarounds set
hive.metastore.execute.setugi to true on both the metastore and the Thrift server, but this
did not help.
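For reference, a minimal hive-site.xml sketch of the two impersonation settings discussed in this thread (both are real Hive/Spark configuration properties; the values shown are the workaround attempted above, which did not resolve the problem here):

```xml
<!-- hive-site.xml: impersonation-related settings mentioned in this issue -->
<configuration>
  <!-- HiveServer2 / Thrift server: execute queries as the submitting user
       instead of the service user -->
  <property>
    <name>hive.server2.enable.doAs</name>
    <value>true</value>
  </property>
  <!-- Metastore client/server: propagate the calling user's ugi (user/group
       identity) on metastore operations -->
  <property>
    <name>hive.metastore.execute.setugi</name>
    <value>true</value>
  </property>
</configuration>
```

The bug report is precisely that the Spark SQL Thrift server ignores the first property, so data access still happens as the service user regardless of this configuration.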


was (Author: shridharama):
I have faced this issue as well, on both 1.6 and 2.0. Some solutions have indicated setting
hive.metastore.execute.setugi to true on the metastore as well as the thrift server, but this
did not help.

> Thrift server does not respect hive.server2.enable.doAs=true
> ------------------------------------------------------------
>
>                 Key: SPARK-5159
>                 URL: https://issues.apache.org/jira/browse/SPARK-5159
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.2.0
>            Reporter: Andrew Ray
>         Attachments: spark_thrift_server_log.txt
>
>
> I'm currently testing the Spark SQL Thrift server on a Kerberos-secured cluster in YARN
> mode. Currently any user can access any table regardless of HDFS permissions, because all data
> is read as the hive user. In HiveServer2 the property hive.server2.enable.doAs=true causes all
> access to be performed as the submitting user. We should do the same.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

