phoenix-dev mailing list archives

From "Stas Sukhanov (JIRA)" <>
Subject [jira] [Commented] (PHOENIX-3460) Phoenix Spark plugin cannot find table with a Namespace prefix
Date Wed, 13 Sep 2017 10:43:00 GMT


Stas Sukhanov commented on PHOENIX-3460:

I wrote a message to the dev mailing list with more details and suggestions
a few days ago (I haven't checked whether there has been any feedback yet). The origin of the problem
is that the methods getSchemaNameFromFullName and getTableNameFromFullName in class org.apache.phoenix.util.SchemaUtil
do not respect the IS_NAMESPACE_MAPPING_ENABLED flag. I am quite sure the bug is still there
(those methods have not changed), but unfortunately reproducing the problem on the
latest version is a bit hard for me, so I am not going to do that.
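For context, here is a minimal sketch (illustrative only, not the actual Phoenix source) of why splitting a full table name only on '.' breaks namespace-mapped tables, whose physical HBase names use ':' as the schema separator:

```java
// Illustrative sketch of the reported SchemaUtil behavior: the split
// helpers only look for '.', so a namespace-mapped physical name like
// ACME:ENDPOINT_STATUS is treated as a schema-less table name and the
// subsequent metadata lookup fails. Method bodies are assumptions for
// illustration, not copied from Phoenix.
public class FullNameSplit {
    static String getSchemaNameFromFullName(String fullName) {
        int idx = fullName.indexOf('.'); // only '.' is recognized
        return idx < 0 ? "" : fullName.substring(0, idx);
    }

    static String getTableNameFromFullName(String fullName) {
        int idx = fullName.indexOf('.');
        return idx < 0 ? fullName : fullName.substring(idx + 1);
    }

    public static void main(String[] args) {
        // Logical SQL name: split works as expected.
        System.out.println(getSchemaNameFromFullName("ACME.ENDPOINT_STATUS")); // ACME
        // Namespace-mapped physical name: ':' is not recognized, so the
        // whole string is returned as the table name with no schema.
        System.out.println(getSchemaNameFromFullName("ACME:ENDPOINT_STATUS")); // ""
        System.out.println(getTableNameFromFullName("ACME:ENDPOINT_STATUS"));
    }
}
```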

> Phoenix Spark plugin cannot find table with a Namespace prefix
> --------------------------------------------------------------
>                 Key: PHOENIX-3460
>                 URL:
>             Project: Phoenix
>          Issue Type: Bug
>    Affects Versions: 4.8.0
>         Environment: HDP 2.5
>            Reporter: Xindian Long
>              Labels: namespaces, phoenix, spark
>             Fix For: 4.7.0
> I am testing some code that uses the Phoenix Spark plug-in to read a Phoenix table with a namespace
prefix in the table name (the table is created as a Phoenix table, not an HBase table), but
it returns a TableNotFoundException.
> The table is obviously there, because I can query it with plain Phoenix SQL through SQuirreL.
In addition, querying it with Spark SQL works without a problem.
> I am running on the HDP 2.5 platform, with phoenix
> The problem does not exist at all when I run the same code on an HDP 2.4 cluster,
with Phoenix 4.4.
> Neither does the problem occur when I query a table without a namespace prefix in the
DB table name, on HDP 2.5.
> The log is in the attached file: tableNoFound.txt
> My testing code is also attached.
> The weird thing is that, in the attached code, if I run testSpark alone it gives the above
exception, but if I run testJdbc first, followed by testSpark, both of them work.
>  After changing to create table by using
> The phoenix-spark plug-in seems to work. I also found some weird behavior.
> If I do both of the following:
> create table ACME.ENDPOINT_STATUS ...
> create table "ACME:ENDPOINT_STATUS" ...
> Both tables show up in Phoenix: the first with schema ACME and table name ENDPOINT_STATUS,
and the latter with no schema and table name ACME:ENDPOINT_STATUS.
> However, in HBase I see only one table, ACME:ENDPOINT_STATUS. In addition, upserts into
the table ACME.ENDPOINT_STATUS show up in the other table, and vice versa.
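A small sketch of why the two definitions collide (assumed mapping behavior, not the actual Phoenix implementation): with namespace mapping enabled, the schema-qualified name ACME.ENDPOINT_STATUS maps to the physical HBase name ACME:ENDPOINT_STATUS, which is exactly the quoted table name of the second CREATE TABLE, so both logical tables end up backed by the same HBase table:

```java
// Illustrative sketch: how a logical Phoenix name could resolve to a
// physical HBase table name. The helper is an assumption made for
// illustration, not Phoenix source code.
public class PhysicalNameDemo {
    static String physicalName(String schema, String table, boolean namespaceMapped) {
        if (schema == null || schema.isEmpty()) {
            return table; // quoted "ACME:ENDPOINT_STATUS" has no schema
        }
        // With namespace mapping the schema becomes an HBase namespace (':');
        // without it, the full name keeps the '.' separator.
        return schema + (namespaceMapped ? ":" : ".") + table;
    }

    public static void main(String[] args) {
        // CREATE TABLE ACME.ENDPOINT_STATUS with namespace mapping enabled:
        String mapped = physicalName("ACME", "ENDPOINT_STATUS", true);
        // CREATE TABLE "ACME:ENDPOINT_STATUS" (no schema, quoted name):
        String quoted = physicalName("", "ACME:ENDPOINT_STATUS", true);
        // Both resolve to the same physical table, so writes cross over.
        System.out.println(mapped.equals(quoted));
    }
}
```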

This message was sent by Atlassian JIRA
