phoenix-user mailing list archives

From James Heather <james.heat...@mendeley.com>
Subject Re: Issues while running psql.py localhost command
Date Mon, 21 Sep 2015 08:45:21 GMT
I don't know for certain what that parameter does, but it sounds a bit scary to me...

On 21/09/15 09:41, rajeshbabu@apache.org wrote:
> You can try adding the property below to hbase-site.xml and restarting HBase:
> <property>
>   <name>hbase.table.sanity.checks</name>
>   <value>false</value>
> </property>
>
> Thanks,
> Rajeshbabu.
>
> On Mon, Sep 21, 2015 at 12:51 PM, Ashutosh Sharma
> <ashu.sharma.india@gmail.com> wrote:
>
>     I am running into issues while running the Phoenix psql.py command
>     against my local HBase instance.
>
>     HBase itself is running perfectly fine. Any help?
>
>     root@ashu-HP-ENVY-15-Notebook-PC:/phoenix-4.5.2-HBase-1.1-bin/bin# ./psql.py localhost /phoenix-4.5.2-HBase-1.1-src/examples/STOCK_SYMBOL.sql
>     15/09/21 00:19:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>     org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: Class org.apache.phoenix.coprocessor.MetaDataEndpointImpl cannot be loaded Set hbase.table.sanity.checks to false at conf or table descriptor if you want to bypass sanity checks
>     at org.apache.hadoop.hbase.master.HMaster.warnOrThrowExceptionForFailure(HMaster.java:1597)
>     at org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1529)
>     at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1448)
>     at org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:422)
>     at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:48502)
>     at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
>     at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
>     at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>     at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>     at java.lang.Thread.run(Thread.java:745)
>
>     at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:108)
>     at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:889)
>     at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1223)
>     at org.apache.phoenix.query.DelegateConnectionQueryServices.createTable(DelegateConnectionQueryServices.java:113)
>     at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1937)
>     at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:751)
>     at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:186)
>     at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:320)
>     at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:312)
>     at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
>     at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:310)
>     at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1422)
>     at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1927)
>     at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1896)
>     at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:77)
>     at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1896)
>     at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:180)
>     at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:132)
>     at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:151)
>     at java.sql.DriverManager.getConnection(DriverManager.java:664)
>     at java.sql.DriverManager.getConnection(DriverManager.java:208)
>     at org.apache.phoenix.util.PhoenixRuntime.main(PhoenixRuntime.java:192)
>     Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: Class org.apache.phoenix.coprocessor.MetaDataEndpointImpl cannot be loaded Set hbase.table.sanity.checks to false at conf or table descriptor if you want to bypass sanity checks
>     at org.apache.hadoop.hbase.master.HMaster.warnOrThrowExceptionForFailure(HMaster.java:1597)
>     at org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1529)
>     at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1448)
>     at org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:422)
>     at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:48502)
>     at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
>     at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
>     at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>     at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>     at java.lang.Thread.run(Thread.java:745)
>
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>     at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
>     at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
>     at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:226)
>     at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:240)
>     at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:140)
>     at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3917)
>     at org.apache.hadoop.hbase.client.HBaseAdmin.createTableAsyncV2(HBaseAdmin.java:636)
>     at org.apache.hadoop.hbase.client.HBaseAdmin.createTable(HBaseAdmin.java:557)
>     at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:850)
>     ... 20 more
>     Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.DoNotRetryIOException): org.apache.hadoop.hbase.DoNotRetryIOException: Class org.apache.phoenix.coprocessor.MetaDataEndpointImpl cannot be loaded Set hbase.table.sanity.checks to false at conf or table descriptor if you want to bypass sanity checks
>     at org.apache.hadoop.hbase.master.HMaster.warnOrThrowExceptionForFailure(HMaster.java:1597)
>     at org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1529)
>     at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1448)
>     at org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:422)
>     at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:48502)
>     at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
>     at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
>     at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>     at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>     at java.lang.Thread.run(Thread.java:745)
>
>     at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1196)
>     at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
>     at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
>     at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.createTable(MasterProtos.java:51086)
>     at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$4.createTable(ConnectionManager.java:1802)
>     at org.apache.hadoop.hbase.client.HBaseAdmin$4.call(HBaseAdmin.java:641)
>     at org.apache.hadoop.hbase.client.HBaseAdmin$4.call(HBaseAdmin.java:637)
>     at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
>     ... 24 more
>     root@ashu-HP-ENVY-15-Notebook-PC:/phoenix-4.5.2-HBase-1.1-bin/bin#
>
>
>     -- 
>     With best Regards:
>     Ashutosh Sharma
>
>
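
For anyone trying to reproduce this outside of psql.py: the bottom of the client-side trace shows that psql.py is a thin wrapper around org.apache.phoenix.util.PhoenixRuntime, so a plain JDBC connection triggers the same failure. Phoenix creates its SYSTEM tables on first connect, and that is the HBaseAdmin.createTable call the master rejects. A minimal sketch, assuming the Phoenix client jar is on the classpath; the class name and the STOCK_SYMBOL columns (taken from the bundled example SQL) are illustrative:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class PhoenixConnectTest {
    public static void main(String[] args) throws Exception {
        // "localhost" is the ZooKeeper quorum, as in the psql.py run above.
        // Opening the connection is enough to hit the error: Phoenix tries to
        // create SYSTEM.CATALOG, which is the createTable call in the trace.
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost");
             Statement stmt = conn.createStatement()) {
            stmt.execute("CREATE TABLE IF NOT EXISTS STOCK_SYMBOL "
                + "(SYMBOL VARCHAR NOT NULL PRIMARY KEY, COMPANY VARCHAR)");
        }
    }
}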
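
On what the parameter does: judging from the server-side trace, hbase.table.sanity.checks guards HMaster.sanityCheckTableDescriptor, the pre-flight validation the master runs before creating or altering a table; here it fails because the master cannot load the Phoenix MetaDataEndpointImpl coprocessor class named in the descriptor. The exception message also mentions a per-table alternative to the cluster-wide setting quoted above. A minimal sketch against the HBase 1.1 client API, with EXAMPLE and cf as placeholder names; note this only covers tables you create yourself, not the SYSTEM tables whose descriptors Phoenix builds internally:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;

public class CreateTableBypassingSanityChecks {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Admin admin = connection.getAdmin()) {
            // Placeholder table and column family names.
            HTableDescriptor desc = new HTableDescriptor(TableName.valueOf("EXAMPLE"));
            desc.addFamily(new HColumnDescriptor("cf"));
            // Disable the master's sanity checks for this table only, per the
            // hint in the DoNotRetryIOException above.
            desc.setConfiguration("hbase.table.sanity.checks", "false");
            admin.createTable(desc);
        }
    }
}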

