lens-user mailing list archives

From <Sayantan.R...@cognizant.com>
Subject RE: Error in installing Lens: BUILD error when trying to compile hive-release-2.1.3-inm-fix on CDH 5.8 on AWS
Date Thu, 15 Sep 2016 13:07:59 GMT
Hi,

Thanks for all your support; most issues are now sorted out. I still have a couple of small
issues, though.

Issue 1:
When I try to log in to Lens via the UI, it asks for a user email and password. Is it possible
to disable security and allow anyone to log in without providing credentials? If not, how and
where do I set up the user credentials?

Issue 2:
The run-examples script is not working properly: it fails to create the data partitions and
fails to insert data / build the cubes.

Can you please help?

Regards
Sayantan

From: Raha, Sayantan (Cognizant)
Sent: Thursday, September 15, 2016 1:03 PM
To: 'user@lens.apache.org'
Subject: RE: Error in installing Lens: BUILD error when trying to compile hive-release-2.1.3-inm-fix
on CDH 5.8 on AWS

Hi,

Thanks to all of you, the Lens server is up now.
Adding the Hive URIs and adding all the HBase jars to the classpath sorted out the remaining issues.
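For anyone following along, a minimal sketch of the classpath fix described above (the HBase lib directory is an assumed CDH location, not taken from this thread; adjust for your install):

```shell
# Sketch only: append every HBase jar to the classpath before starting Lens.
# HBASE_LIB is an assumed CDH path; adjust it for your installation.
HBASE_LIB=${HBASE_LIB:-/opt/cloudera/parcels/CDH/lib/hbase/lib}
for jar in "$HBASE_LIB"/*.jar; do
  CLASSPATH="${CLASSPATH:+$CLASSPATH:}$jar"
done
export CLASSPATH
```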

I still have one issue: although Lens is up, it is not accepting URL requests. When I try
to open baseurl/session I don't see anything, and the log shows a 404 response. Any
idea?

I have the following URI entry in lens-site.xml:

<property>
  <name>lens.server.ui.base.uri</name>
  <value>http://servername:19999</value>
  <description>Thrift URI for the remote metastore. Used by metastore client to connect
to remote metastore.</description>
</property>

15 Sep 2016 07:00:19 [1d932b51-0423-426d-8e05-0a26d3494209] [grizzly-http-server-0] INFO 
org.glassfish.jersey.filter.LoggingFilter - 1 * Server has received a request on thread grizzly-http-server-0
1 > GET http://localhost:19999/session
1 > accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
1 > accept-encoding: gzip, deflate, sdch
1 > accept-language: en-GB,en-US;q=0.8,en;q=0.6
1 > connection: keep-alive
1 > host: localhost:19999
1 > upgrade-insecure-requests: 1
1 > user-agent: Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/50.0.2661.94
Safari/537.36

15 Sep 2016 07:00:19 [3e26cf93-33f4-45ab-a005-1f7dcb58d90a] [grizzly-http-server-0] INFO 
org.glassfish.jersey.filter.LoggingFilter - 1 * Server responded with a response on thread
grizzly-http-server-0
1 < 404

Also, when I run run-examples.sh the table creation works fine (I can see the tables
from Hive), but the data population and the subsequent queries all fail. Something is still
not right.

Thanks and regards
Sayantan

From: Raha, Sayantan (Cognizant)
Sent: Thursday, September 15, 2016 11:22 AM
To: user@lens.apache.org
Subject: RE: Error in installing Lens: BUILD error when trying to compile hive-release-2.1.3-inm-fix
on CDH 5.8 on AWS

Hi,

I understand what you are pointing to; it might well be the issue. Can you kindly help me
resolve it? I will elaborate on the current setup and what's going on.

We have two Hive metastores running (one default CDH Hive 1.1, and another one I installed and
configured yesterday for Lens).
The default Hive is on Derby and uses an embedded metastore. The new Hive 0.13 uses a MySQL-backed
remote configuration. I believe Lens is referring to the first Hive metastore and not the newly
set-up one.

I have set HIVE_HOME in lens_config.sh.

I have checked the following link for lens-site.xml config: <https://lens.apache.org/admin/config.html>
I am not finding anything there to point to the hive.metastore.uris you are referring to.


hive-site.xml does have hive.metastore.uris set up, and I can connect to Hive using beeline/hive.

<property>
  <name>hive.metastore.uris</name>
  <value>thrift://xxxxxxx:9083</value>
  <description>Thrift URI for the remote metastore. Used by metastore client to connect
to remote metastore.</description>
</property>
Please help; I believe I am close, just missing a few steps maybe.

Regards
Sayantan
From: Rajat Khandelwal [mailto:rajatgupta59@gmail.com]
Sent: Wednesday, September 14, 2016 5:48 PM
To: user@lens.apache.org
Subject: Re: Error in installing Lens: BUILD error when trying to compile hive-release-2.1.3-inm-fix
on CDH 5.8 on AWS

Not sure, but it's possible that you are connecting to the Hive metastore running the 2.1.x
version. Are you passing hive.metastore.uris in lens-site.xml?
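For reference, passing the metastore URI in lens-site.xml would look something like the following sketch (the host and port are illustrative, not from this thread; the property name is the standard Hive key):

```xml
<!-- Sketch: point Lens at the remote Hive metastore via lens-site.xml.
     thrift://metastore-host:9083 is an illustrative value. -->
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://metastore-host:9083</value>
  <description>Thrift URI for the remote metastore used by Lens.</description>
</property>
```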

On Wed, Sep 14, 2016 at 4:04 PM <Sayantan.Raha@cognizant.com>
wrote:
Hi,

Thanks, I have made some progress:
1. Downloaded and compiled Hive (0.13.4)
2. Pointed to it as HIVE_HOME
3. Started the Lens server
I get the following exception. Can you please help?

n mysql: Lexical error at line 1, column 5.  Encountered: "@" (64), after : "".
14 Sep 2016 10:32:50 [28addf5e-2587-491d-99db-95741409eae0] [main] INFO  DataNucleus.Datastore
- The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only"
so does not have its own datastore table.
14 Sep 2016 10:32:50 [28addf5e-2587-491d-99db-95741409eae0] [main] INFO  DataNucleus.Datastore
- The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so
does not have its own datastore table.
14 Sep 2016 10:32:50 [28addf5e-2587-491d-99db-95741409eae0] [main] INFO  DataNucleus.Datastore
- The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only"
so does not have its own datastore table.
14 Sep 2016 10:32:50 [28addf5e-2587-491d-99db-95741409eae0] [main] INFO  DataNucleus.Datastore
- The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so
does not have its own datastore table.
14 Sep 2016 10:32:50 [28addf5e-2587-491d-99db-95741409eae0] [main] INFO  org.apache.hadoop.hive.metastore.ObjectStore
- Initialized ObjectStore
14 Sep 2016 10:32:50 [28addf5e-2587-491d-99db-95741409eae0] [main] ERROR org.apache.lens.server.LensServer
- Error while creating Lens server
java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:349) ~[hive-exec-0.13.4-inm.jar:0.13.4-inm]
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:293) ~[hive-exec-0.13.4-inm.jar:0.13.4-inm]
        at org.apache.hive.service.cli.session.SessionManager.applyAuthorizationConfigPolicy(SessionManager.java:128)
~[hive-service-0.13.4-inm.jar:0.13.4-inm]
        at org.apache.hive.service.cli.session.SessionManager.init(SessionManager.java:76)
~[hive-service-0.13.4-inm.jar:0.13.4-inm]
        at org.apache.hive.service.CompositeService.init(CompositeService.java:59) ~[hive-service-0.13.4-inm.jar:0.13.4-inm]
        at org.apache.hive.service.cli.CLIService.init(CLIService.java:112) ~[hive-service-0.13.4-inm.jar:0.13.4-inm]
        at org.apache.hive.service.CompositeService.init(CompositeService.java:59) ~[hive-service-0.13.4-inm.jar:0.13.4-inm]
        at org.apache.lens.server.LensServices.init(LensServices.java:235) ~[classes/:na]
        at org.apache.lens.server.LensServer.startServices(LensServer.java:134) ~[classes/:na]
        at org.apache.lens.server.LensServer.<init>(LensServer.java:85) ~[classes/:na]
        at org.apache.lens.server.LensServer.createLensServer(LensServer.java:74) ~[classes/:na]
        at org.apache.lens.server.LensServer.main(LensServer.java:190) ~[classes/:na]
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412)
~[hive-metastore-0.13.4-inm.jar:0.13.4-inm]
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
~[hive-metastore-0.13.4-inm.jar:0.13.4-inm]
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
~[hive-metastore-0.13.4-inm.jar:0.13.4-inm]
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2472) ~[hive-exec-0.13.4-inm.jar:0.13.4-inm]
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2484) ~[hive-exec-0.13.4-inm.jar:0.13.4-inm]
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:343) ~[hive-exec-0.13.4-inm.jar:0.13.4-inm]
        ... 11 common frames omitted
Caused by: java.lang.reflect.InvocationTargetException: null
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.7.0_67]
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
~[na:1.7.0_67]
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
~[na:1.7.0_67]
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526) ~[na:1.7.0_67]
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
~[hive-metastore-0.13.4-inm.jar:0.13.4-inm]
        ... 16 common frames omitted

Regards
Sayantan
From: Rajat Khandelwal [mailto:rajatgupta59@gmail.com]
Sent: Wednesday, September 14, 2016 2:37 PM

To: user@lens.apache.org
Subject: Re: Error in installing Lens: BUILD error when trying to compile hive-release-2.1.3-inm-fix
on CDH 5.8 on AWS

Hi Sayantan

Lens 2.5 works with a forked version of hive. You can get it from here: https://github.com/InMobi/hive/releases/tag/hive-release-0.13.4-inm

On Wed, Sep 14, 2016 at 2:25 PM <Sayantan.Raha@cognizant.com>
wrote:
We have Lens 2.5. With any other version of Apache Hive (we have tried 0.13 and 1.1, the default
distribution with CDH 5.8) we get the following exception, and the Lens server does not come
up either.
vi lensserver.out.2016091408511473843076
Exception in thread "main" java.lang.NoSuchFieldError: HIVE_SESSION_IMPL_CLASSNAME
        at org.apache.lens.server.LensServices.init(LensServices.java:183)
        at org.apache.lens.server.LensServer.startServices(LensServer.java:134)
        at org.apache.lens.server.LensServer.<init>(LensServer.java:85)
        at org.apache.lens.server.LensServer.createLensServer(LensServer.java:74)
        at org.apache.lens.server.LensServer.main(LensServer.java:190)

Thanks and regards
Sayantan


From: amareshwarisr . [mailto:amareshwari@gmail.com]
Sent: Wednesday, September 14, 2016 2:17 PM

To: user@lens.apache.org
Subject: Re: Error in installing Lens: BUILD error when trying to compile hive-release-2.1.3-inm-fix
on CDH 5.8 on AWS

Sayantan,

Lens releases <= 2.5 work with hive-0.13.*; 2.6 (which is in the works) onwards will work
with hive-2.1.x.

The 2.6 release should be out in a couple of weeks; till then you can build Lens from source,
if required.

Thanks

On Wed, Sep 14, 2016 at 2:07 PM, <Sayantan.Raha@cognizant.com>
wrote:
I am using Java version:

java version "1.7.0_67"
Java(TM) SE Runtime Environment (build 1.7.0_67-b01)
Java HotSpot(TM) 64-Bit Server VM (build 24.65-b04, mixed mode)

I tried using the Apache distribution of Hive 2.1.0 and pointed HIVE_HOME to the install directory.
Then, when I tried to bring up Lens, I got the following exception in "lensserver.out..":
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/ubuntu/apache-lens-2.5.0-beta-bin/server/webapp/lens-server/WEB-INF/lib/logback-classic-1.1.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/ubuntu/apache-hive-2.1.0-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [ch.qos.logback.classic.util.ContextSelectorStaticBinder]
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hive.service.cli.CLIService:
method <init>()V not found
        at org.apache.lens.server.LensServices.init(LensServices.java:186)
        at org.apache.lens.server.LensServer.startServices(LensServer.java:134)
        at org.apache.lens.server.LensServer.<init>(LensServer.java:85)
        at org.apache.lens.server.LensServer.createLensServer(LensServer.java:74)
        at org.apache.lens.server.LensServer.main(LensServer.java:190)
The server is not coming up, and the other logs have no further details either. Hence I tried to
recompile Hive, which is also not working.
I have tried with commons-collections 3.2.2 as well; I get the same error.

I ran the following command, which confirms that 'UnmodifiableMap.class' is in the jar:

$ jar tf commons-collections-3.2.2.jar | grep 'UnmodifiableMap'
org/apache/commons/collections/keyvalue/UnmodifiableMapEntry.class
org/apache/commons/collections/map/UnmodifiableMap.class
org/apache/commons/collections/iterators/UnmodifiableMapIterator.class

Please let me know if there is a way to fix this, or whether we need to wait till the Lens release
that works with Apache Hive distributions comes out.

Also, can you kindly clarify whether CDH is a supported platform for Lens? I ask because CDH
changes the install path and directory structure of the Hadoop distribution, so lens-ctl, which
has paths to various distribution jars, doesn't work at all on CDH. I had to manually change all
of those to reflect the proper CDH paths (not sure whether I should do this).

Thanks for your help.

Regards
Sayantan

From: amareshwarisr . [mailto:amareshwari@gmail.com]
Sent: Wednesday, September 14, 2016 1:42 PM
To: user@lens.apache.org
Subject: Re: Error in installing Lens: BUILD error when trying to compile hive-release-2.1.3-inm-fix
on CDH 5.8 on AWS

Not sure which version of java you are using. Downloading hive release 2.1.0 directly from
apache should work for you.

On Wed, Sep 14, 2016 at 12:47 PM, Puneet Gupta <puneet.gupta@inmobi.com>
wrote:
Hi Sayantan

I can see commons-collections4 on your classpath, which does not have "org/apache/commons/collections/map/UnmodifiableMap".
This class is present in commons-collections3.
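A quick way to check any suspect jar for the class is sketched below. The helper name and example path are hypothetical; the trick works because zip archives store entry names uncompressed, so a plain grep on the raw jar file can detect whether a class entry is present:

```shell
# Quick-and-dirty check (hypothetical helper): zip archives store entry names
# uncompressed, so grep on the raw jar detects whether a class entry is present.
jar_has_class() {
  jar="$1"; class="$2"
  if grep -q "$class" "$jar"; then
    echo "$jar: contains $class"
  else
    echo "$jar: does NOT contain $class"
  fi
}

# Example usage (path illustrative):
# jar_has_class commons-collections-3.2.2.jar org/apache/commons/collections/map/UnmodifiableMap.class
```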

I generally compile the lens/hive code with HDP hadoop-2.6.0.2.2.0.0-2041 and above.

PS: We are moving to Apache Hive in Lens version 2.6, which should be out in a few weeks.
You can find more info about it on the lens user mailing list.


Thanks,
Puneet Gupta

On Wed, Sep 14, 2016 at 12:09 PM, <Sayantan.Raha@cognizant.com>
wrote:
I have tried all the following versions of Hive, and I see the same issue with all of them:

1. hive-release-2.1.3-inm-fix
2. hive-release-2.1.3-inm
3. hive-release-2.1.0-inm

Thanks and regards
Sayantan Raha


From: Rajat Khandelwal [mailto:rajatgupta59@gmail.com]
Sent: Wednesday, September 14, 2016 12:03 PM
To: user@lens.apache.org
Subject: Re: Error in installing Lens: BUILD error when trying to compile hive-release-2.1.3-inm-fix
on CDH 5.8 on AWS

Please download version 2.1.3-inm instead of 2.1.3-inm-fix

On Wed, Sep 14, 2016 at 11:55 AM <Sayantan.Raha@cognizant.com>
wrote:
Hi,

I am facing a small issue when trying to compile "hive-release-2.1.3-inm-fix" on
Cloudera Distribution 5.8.

Steps:

1. Download
2. tar -xvf hive-release-2.1.3-inm-fix.tar.gz
3. cd hive-hive-release-2.1.3-inm-fix/
4. mvn clean package -DskipTests -Phadoop-2,dist

Error received:

[INFO] ------------------------------------------------------------------------
[ERROR] BUILD ERROR
[INFO] ------------------------------------------------------------------------
[INFO] An Ant BuildException has occured: java.lang.NoClassDefFoundError: org/apache/commons/collections/map/UnmodifiableMap
around Ant part ...<templategen templateFile="/home/ubuntu/hive-hive-release-2.1.3-inm-fix/common/../conf/hive-default.xml.template"/>...
@ 6:118 in /home/ubuntu/hive-hive-release-2.1.3-inm-fix/common/target/antrun/build-main.xml

org.apache.commons.collections.map.UnmodifiableMap

Complete Build Log:
[INFO] Scanning for projects...
[WARNING]
        Profile with id: 'hadoop-2' has not been activated.

[INFO] Reactor build order:
[INFO]   Hive
[INFO]   Hive Shims Common
[INFO]   Hive Shims 0.23
[INFO]   Hive Shims Scheduler
[INFO]   Hive Shims
[INFO]   Hive Storage API
[INFO]   Hive ORC
[INFO]   Hive Common
[INFO]   Hive Service RPC
[INFO]   Hive Serde
[INFO]   Hive Metastore
[INFO]   Hive Ant Utilities
[INFO]   Hive Llap Common
[INFO]   Hive Llap Client
[INFO]   Hive Llap Tez
[INFO]   Spark Remote Client
[INFO]   Hive Query Language
[INFO]   Hive Llap Server
[INFO]   Hive Service
[INFO]   Hive Accumulo Handler
[INFO]   Hive JDBC
[INFO]   Hive Beeline
[INFO]   Hive CLI
[INFO]   Hive Contrib
[INFO]   Hive HBase Handler
[INFO]   Hive HCatalog
[INFO]   Hive HCatalog Core
[INFO]   Hive HCatalog Pig Adapter
[INFO]   Hive HCatalog Server Extensions
[INFO]   Hive HCatalog Webhcat Java Client
[INFO]   Hive HCatalog Webhcat
[INFO]   Hive HCatalog Streaming
[INFO]   Hive HPL/SQL
[INFO]   Hive HWI
[INFO]   Hive Llap External Client
[INFO]   Hive Shims Aggregator
[INFO]   Hive TestUtils
[INFO]   Hive Packaging
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive
[INFO]    task-segment: [clean, package]
[INFO] ------------------------------------------------------------------------
[INFO] [clean:clean {execution: default-clean}]
[INFO] Deleting /home/ubuntu/hive-hive-release-2.1.3-inm-fix (includes = [datanucleus.log,
derby.log], excludes = [])
[INFO] [enforcer:enforce {execution: enforce-no-snapshots}]
-----------------------------------------------------
this realm = app0.child-container[org.apache.maven.plugins:maven-enforcer-plugin:1.3.1]
urls[0] = file:/home/ubuntu/.m2/repository/org/apache/maven/plugins/maven-enforcer-plugin/1.3.1/maven-enforcer-plugin-1.3.1.jar
urls[1] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-utils/1.5.8/plexus-utils-1.5.8.jar
urls[2] = file:/home/ubuntu/.m2/repository/commons-lang/commons-lang/2.3/commons-lang-2.3.jar
urls[3] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-api/1.3.1/enforcer-api-1.3.1.jar
urls[4] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-rules/1.3.1/enforcer-rules-1.3.1.jar
urls[5] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-common-artifact-filters/1.4/maven-common-artifact-filters-1.4.jar
urls[6] = file:/home/ubuntu/.m2/repository/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar
urls[7] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-dependency-tree/2.1/maven-dependency-tree-2.1.jar
urls[8] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
urls[9] = file:/home/ubuntu/.m2/repository/org/eclipse/aether/aether-util/0.9.0.M2/aether-util-0.9.0.M2.jar
urls[10] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-i18n/1.0-beta-6/plexus-i18n-1.0-beta-6.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185
import: org.codehaus.classworlds.Entry@3fee8e37
import: org.codehaus.classworlds.Entry@3fee19d8


this realm = plexus.core
urls[0] = file:/usr/share/maven2/lib/maven-debian-uber.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185
import: org.codehaus.classworlds.Entry@3fee8e37
import: org.codehaus.classworlds.Entry@3fee19d8
-----------------------------------------------------
[INFO] [remote-resources:process {execution: default}]
[INFO] [antrun:run {execution: define-classpath}]
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] [antrun:run {execution: setup-test-dirs}]
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/target/tmp
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/target/warehouse
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/target/tmp/conf
     [copy] Copying 15 files to /home/ubuntu/hive-hive-release-2.1.3-inm-fix/target/tmp/conf
[INFO] Executed tasks
[INFO] [site:attach-descriptor {execution: default-attach-descriptor}]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims Common
[INFO]    task-segment: [clean, package]
[INFO] ------------------------------------------------------------------------
[INFO] [clean:clean {execution: default-clean}]
[INFO] Deleting /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common (includes = [datanucleus.log,
derby.log], excludes = [])
[INFO] [enforcer:enforce {execution: enforce-no-snapshots}]
-----------------------------------------------------
this realm = app0.child-container[org.apache.maven.plugins:maven-enforcer-plugin:1.3.1]
urls[0] = file:/home/ubuntu/.m2/repository/org/apache/maven/plugins/maven-enforcer-plugin/1.3.1/maven-enforcer-plugin-1.3.1.jar
urls[1] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-utils/1.5.8/plexus-utils-1.5.8.jar
urls[2] = file:/home/ubuntu/.m2/repository/commons-lang/commons-lang/2.3/commons-lang-2.3.jar
urls[3] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-api/1.3.1/enforcer-api-1.3.1.jar
urls[4] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-rules/1.3.1/enforcer-rules-1.3.1.jar
urls[5] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-common-artifact-filters/1.4/maven-common-artifact-filters-1.4.jar
urls[6] = file:/home/ubuntu/.m2/repository/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar
urls[7] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-dependency-tree/2.1/maven-dependency-tree-2.1.jar
urls[8] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
urls[9] = file:/home/ubuntu/.m2/repository/org/eclipse/aether/aether-util/0.9.0.M2/aether-util-0.9.0.M2.jar
urls[10] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-i18n/1.0-beta-6/plexus-i18n-1.0-beta-6.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185
import: org.codehaus.classworlds.Entry@3fee8e37
import: org.codehaus.classworlds.Entry@3fee19d8


this realm = plexus.core
urls[0] = file:/usr/share/maven2/lib/maven-debian-uber.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185
import: org.codehaus.classworlds.Entry@3fee8e37
import: org.codehaus.classworlds.Entry@3fee19d8
-----------------------------------------------------
[INFO] [remote-resources:process {execution: default}]
[INFO] [resources:resources {execution: default-resources}]
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/src/main/resources
[INFO] Copying 3 resources
[INFO] [antrun:run {execution: define-classpath}]
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] [compiler:compile {execution: default-compile}]
[INFO] Compiling 30 source files to /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/target/classes
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/src/main/java/org/apache/hadoop/hive/thrift/HadoopThriftAuthBridge.java:
Some input files use or override a deprecated API.
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/src/main/java/org/apache/hadoop/hive/thrift/HadoopThriftAuthBridge.java:
Recompile with -Xlint:deprecation for details.
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/src/main/java/org/apache/hadoop/hive/shims/Utils.java:
Some input files use unchecked or unsafe operations.
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/src/main/java/org/apache/hadoop/hive/shims/Utils.java:
Recompile with -Xlint:unchecked for details.
[INFO] [resources:testResources {execution: default-testResources}]
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/src/test/resources
[INFO] Copying 3 resources
[INFO] [antrun:run {execution: setup-test-dirs}]
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/target/tmp
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/target/warehouse
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/target/tmp/conf
     [copy] Copying 15 files to /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/target/tmp/conf
[INFO] Executed tasks
[INFO] [compiler:testCompile {execution: default-testCompile}]
[INFO] No sources to compile
[INFO] [surefire:test {execution: default-test}]
[INFO] Tests are skipped.
[INFO] [jar:jar {execution: default-jar}]
[INFO] Building jar: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/target/hive-shims-common-2.1.3-inm-fix.jar
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims 0.23
[INFO]    task-segment: [clean, package]
[INFO] ------------------------------------------------------------------------
[INFO] [clean:clean {execution: default-clean}]
[INFO] Deleting /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23 (includes = [datanucleus.log,
derby.log], excludes = [])
[INFO] [enforcer:enforce {execution: enforce-no-snapshots}]
-----------------------------------------------------
this realm = app0.child-container[org.apache.maven.plugins:maven-enforcer-plugin:1.3.1]
urls[0] = file:/home/ubuntu/.m2/repository/org/apache/maven/plugins/maven-enforcer-plugin/1.3.1/maven-enforcer-plugin-1.3.1.jar
urls[1] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-utils/1.5.8/plexus-utils-1.5.8.jar
urls[2] = file:/home/ubuntu/.m2/repository/commons-lang/commons-lang/2.3/commons-lang-2.3.jar
urls[3] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-api/1.3.1/enforcer-api-1.3.1.jar
urls[4] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-rules/1.3.1/enforcer-rules-1.3.1.jar
urls[5] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-common-artifact-filters/1.4/maven-common-artifact-filters-1.4.jar
urls[6] = file:/home/ubuntu/.m2/repository/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar
urls[7] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-dependency-tree/2.1/maven-dependency-tree-2.1.jar
urls[8] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
urls[9] = file:/home/ubuntu/.m2/repository/org/eclipse/aether/aether-util/0.9.0.M2/aether-util-0.9.0.M2.jar
urls[10] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-i18n/1.0-beta-6/plexus-i18n-1.0-beta-6.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185
import: org.codehaus.classworlds.Entry@3fee8e37
import: org.codehaus.classworlds.Entry@3fee19d8


this realm = plexus.core
urls[0] = file:/usr/share/maven2/lib/maven-debian-uber.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185
import: org.codehaus.classworlds.Entry@3fee8e37
import: org.codehaus.classworlds.Entry@3fee19d8
-----------------------------------------------------
[INFO] [remote-resources:process {execution: default}]
[INFO] [resources:resources {execution: default-resources}]
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/src/main/resources
[INFO] Copying 3 resources
[INFO] [antrun:run {execution: define-classpath}]
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] [compiler:compile {execution: default-compile}]
[INFO] Compiling 5 source files to /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/target/classes
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java uses or overrides a deprecated API.
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java: Recompile with -Xlint:deprecation for details.
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java uses unchecked or unsafe operations.
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java: Recompile with -Xlint:unchecked for details.
[INFO] [resources:testResources {execution: default-testResources}]
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/src/test/resources
[INFO] Copying 3 resources
[INFO] [antrun:run {execution: setup-test-dirs}]
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/target/tmp
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/target/warehouse
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/target/tmp/conf
     [copy] Copying 15 files to /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/target/tmp/conf
[INFO] Executed tasks
[INFO] [compiler:testCompile {execution: default-testCompile}]
[INFO] No sources to compile
[INFO] [surefire:test {execution: default-test}]
[INFO] Tests are skipped.
[INFO] [jar:jar {execution: default-jar}]
[INFO] Building jar: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/target/hive-shims-0.23-2.1.3-inm-fix.jar
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims Scheduler
[INFO]    task-segment: [clean, package]
[INFO] ------------------------------------------------------------------------
[INFO] [clean:clean {execution: default-clean}]
[INFO] Deleting /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/scheduler (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] [enforcer:enforce {execution: enforce-no-snapshots}]
-----------------------------------------------------
this realm = app0.child-container[org.apache.maven.plugins:maven-enforcer-plugin:1.3.1]
urls[0] = file:/home/ubuntu/.m2/repository/org/apache/maven/plugins/maven-enforcer-plugin/1.3.1/maven-enforcer-plugin-1.3.1.jar
urls[1] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-utils/1.5.8/plexus-utils-1.5.8.jar
urls[2] = file:/home/ubuntu/.m2/repository/commons-lang/commons-lang/2.3/commons-lang-2.3.jar
urls[3] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-api/1.3.1/enforcer-api-1.3.1.jar
urls[4] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-rules/1.3.1/enforcer-rules-1.3.1.jar
urls[5] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-common-artifact-filters/1.4/maven-common-artifact-filters-1.4.jar
urls[6] = file:/home/ubuntu/.m2/repository/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar
urls[7] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-dependency-tree/2.1/maven-dependency-tree-2.1.jar
urls[8] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
urls[9] = file:/home/ubuntu/.m2/repository/org/eclipse/aether/aether-util/0.9.0.M2/aether-util-0.9.0.M2.jar
urls[10] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-i18n/1.0-beta-6/plexus-i18n-1.0-beta-6.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185
import: org.codehaus.classworlds.Entry@3fee8e37
import: org.codehaus.classworlds.Entry@3fee19d8


this realm = plexus.core
urls[0] = file:/usr/share/maven2/lib/maven-debian-uber.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185