lens-user mailing list archives

From "amareshwarisr ." <amareshw...@gmail.com>
Subject Re: Htrace Error
Date Wed, 03 Jun 2015 04:53:05 GMT
On Tue, Jun 2, 2015 at 10:27 PM, Srinivasan Hariharan <
srinivasan.hariharan@outlook.com> wrote:

>
> Yes, I am trying to connect to hiveserver through JDBC.
>
We have HiveDriver, which connects to HiveServer2 through the thrift API.
Connecting to HiveServer2 through JDBCDriver is something that has not been
tried out so far. Note that HiveDriver uses RetryingThriftClient, which
JDBCDriver will not be using.
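
For illustration, a quick way to sanity-check that HiveServer2 is reachable over plain JDBC is beeline. This is only a sketch; the host, port and credentials are placeholders, not values from this thread:

----
# Hypothetical HiveServer2 endpoint and credentials; replace with your own.
beeline -u "jdbc:hive2://hs2-host:10000/default" -n hiveuser -p hivepass -e "show tables;"
----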


> Could you please let me know how to populate the necessary tables in case
> we are using mysql as the lens server database instead of the embedded
> derby DB. I couldn't find any scripts to create schemas for the tables used
> to store executed queries.
>
Using mysql DB as the lens server DB should be fine. We are using the same in
our production. The FinishedQueries table gets created when the lens server
starts up, if it is not already there. The schema for the same is here -
https://github.com/apache/incubator-lens/blob/master/lens-server/src/main/java/org/apache/lens/server/query/LensServerDAO.java#L94
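
For illustration, once the server is started against MySQL, the auto-created table can be checked from the mysql client. A minimal sketch; the database name lensserver and the table name finished_queries are assumptions, so verify them against the DAO linked above:

----
# Hypothetical database name; verify the table name against LensServerDAO.
mysql -u lensuser -p -e "SHOW TABLES FROM lensserver LIKE 'finished_queries';"
----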

Thanks
Amareshwari

> ------------------------------
> From: yash360@gmail.com
> Date: Tue, 2 Jun 2015 10:25:13 +0530
> Subject: Re: Htrace Error
> To: user@lens.incubator.apache.org
>
>
> For hadoop 2.6, you might find the htrace jar in these locations -
>
> ${HADOOP_HOME}/share/hadoop/common/lib/htrace-core-3.0.4.jar
> ${HADOOP_HOME}/share/hadoop/tools/lib/htrace-core-3.0.4.jar
>
> You can add them to classpath as suggested by Amareshwari.
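>
> For example, the jar could be passed explicitly when bringing up the server. A sketch only; the lens-ctl start script name is an assumption, adjust to however you start the LensServer:
>
> ----
> # Hypothetical invocation; --classpath is the option Amareshwari mentions.
> ${LENS_HOME}/bin/lens-ctl start --classpath ${HADOOP_HOME}/share/hadoop/common/lib/htrace-core-3.0.4.jar
> ----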
>
> On Tue, Jun 2, 2015 at 10:17 AM, amareshwarisr . <amareshwari@gmail.com>
> wrote:
>
> Hello Srinivasan,
>
> NoClassDefFoundError means a class could not be loaded at runtime. See
>
> http://stackoverflow.com/questions/34413/why-am-i-getting-a-noclassdeffounderror-in-java
>
> Lens loads protobuf and htrace from the paths below:
>
> ----
>     LIB_JARS=$LIB_JARS:`ls ${HADOOP_HOME}/lib/commons-configuration-*.jar 2>/dev/null | tr "\n" ':' 2>/dev/null`
>     LIB_JARS=$LIB_JARS:`ls ${HADOOP_HOME}/lib/protobuf-java-*.jar 2>/dev/null | tr "\n" ':' 2>/dev/null`
>     LIB_JARS=$LIB_JARS:`ls ${HADOOP_HOME}/lib/htrace-core-*.jar 2>/dev/null | tr "\n" ':' 2>/dev/null`
>     LIB_JARS=$LIB_JARS:`ls ${HADOOP_HOME}/share/hadoop/common/lib/commons-configuration-*.jar 2>/dev/null | tr "\n" ':' 2>/dev/null`
>     LIB_JARS=$LIB_JARS:`ls ${HADOOP_HOME}/share/hadoop/hdfs/lib/protobuf-java-*.jar 2>/dev/null | tr "\n" ':' 2>/dev/null`
> ----
> Can you check if you have them in those paths? Also, check if there is
> more than one htrace or protobuf jar in the classpath.
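>
> For example, a quick way to spot missing or duplicate copies (a sketch using standard find):
>
> ----
> # Lists every htrace and protobuf jar under the hadoop install; more than one
> # version of either on the classpath can also cause trouble.
> find ${HADOOP_HOME} -name 'htrace-core-*.jar' -o -name 'protobuf-java-*.jar'
> ----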
>
> If not, you can pass the correct jar by passing --classpath <pathtojar>
> while bringing up the LensServer.
>
> Just curious, are you trying to connect to HiveServer through JDBC?
>
> Thanks
> Amareshwari
>
> On Mon, Jun 1, 2015 at 11:33 PM, Srinivasan Hariharan <
> srinivasan.hariharan@outlook.com> wrote:
>
> Adding to that, I am getting this error while using the hivedriver-site
> configuration below. I am not getting this error while using the default config.
>
> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>
> <configuration>
>   <property>
>     <name>lens.driver.jdbc.driver.class</name>
>     <value>com.mysql.jdbc.Driver</value>
>   </property>
>   <property>
>     <name>lens.driver.jdbc.db.uri</name>
>     <value>jdbc:mysql://192.168.118.128/lensdriver;MODE=MYSQL</value>
>   </property>
>   <property>
>     <name>lens.driver.jdbc.db.user</name>
>     <value>xxxx</value>
>   </property>
>   <property>
>      <name>lens.driver.jdbc.db.password</name>
>      <value>xxxx</value>
>   </property>
>   <property>
>     <name>lens.cube.query.driver.supported.storages</name>
>     <value>mydb</value>
>     <final>true</final>
>   </property>
>   <property>
>     <name>lens.driver.jdbc.query.rewriter</name>
>     <value>org.apache.lens.driver.jdbc.ColumnarSQLRewriter</value>
>   </property>
>   <property>
>     <name>lens.driver.jdbc.explain.keyword</name>
>     <value>explain plan for </value>
>   </property>
> </configuration>
>
>
>
>
> ------------------------------
> From: srinivasan.hariharan@outlook.com
> To: user@lens.incubator.apache.org
> Subject: RE: Htrace Error
> Date: Mon, 1 Jun 2015 23:24:30 +0530
>
>
> Hadoop 2.6
>
> Date: Mon, 1 Jun 2015 22:57:56 +0530
> Subject: Re: Htrace Error
> From: arshad.matin@inmobi.com
> To: user@lens.incubator.apache.org
>
> which hadoop version are you using?
>
> On Mon, Jun 1, 2015 at 9:28 PM, Srinivasan Hariharan <
> srinivasan.hariharan@outlook.com> wrote:
>
>
>
> Hi,
>
> I am getting the below error message when trying to create a storage from
> the lens client shell. Could anyone help troubleshoot this error?
>
> ERROR org.apache.hadoop.hive.metastore.RetryingHMSHandler  - java.lang.NoClassDefFoundError: org/htrace/Trace
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:214)
>         at com.sun.proxy.$Proxy59.getFileInfo(Unknown Source)
>         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:752)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>         at com.sun.proxy.$Proxy60.getFileInfo(Unknown Source)
>         at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1988)
>         at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1118)
>         at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114)
>         at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>         at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
>         at org.apache.hadoop.hive.metastore.Warehouse.isDir(Warehouse.java:485)
>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1238)
>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1295)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
>         at com.sun.proxy.$Proxy8.create_table_with_environment_context(Unknown Source)
>         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:558)
>         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:547)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>         at com.sun.proxy.$Proxy9.createTable(Unknown Source)
>         at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:613)
>         at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:576)
>         at org.apache.lens.cube.metadata.CubeMetastoreClient.createCubeHiveTable(CubeMetastoreClient.java:421)
>         at org.apache.lens.cube.metadata.CubeMetastoreClient.createStorage(CubeMetastoreClient.java:431)
>         at org.apache.lens.server.metastore.CubeMetastoreServiceImpl.createStorage(CubeMetastoreServiceImpl.java:938)
>         at org.apache.lens.server.metastore.MetastoreResource.createNewStorage(MetastoreResource.java:462)
>
>
>
>
>
>
>
