lens-user mailing list archives

From <Sayantan.R...@cognizant.com>
Subject RE: Error in installing Lens: BUILD error when trying to compile hive-release-2.1.3-inm-fix on CDH 5.8 on AWS
Date Wed, 14 Sep 2016 08:55:28 GMT
We have Lens 2.5. For any other version of Apache Hive (we have tried 0.13 and 1.1, the default distribution shipped with CDH 5.8) we get the following exception, and the Lens Server does not come up either.
vi lensserver.out.2016091408511473843076
Exception in thread "main" java.lang.NoSuchFieldError: HIVE_SESSION_IMPL_CLASSNAME
        at org.apache.lens.server.LensServices.init(LensServices.java:183)
        at org.apache.lens.server.LensServer.startServices(LensServer.java:134)
        at org.apache.lens.server.LensServer.<init>(LensServer.java:85)
        at org.apache.lens.server.LensServer.createLensServer(LensServer.java:74)
        at org.apache.lens.server.LensServer.main(LensServer.java:190)
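
A quick way to check whether the Hive build on the classpath actually defines this field is to look inside the compiled HiveConf$ConfVars class. A minimal sketch, assuming HIVE_HOME points at the single Hive install that Lens picks up:

$ ls "$HIVE_HOME"/lib/hive-common-*.jar
$ unzip -p "$HIVE_HOME"/lib/hive-common-*.jar 'org/apache/hadoop/hive/conf/HiveConf$ConfVars.class' | strings | grep -c HIVE_SESSION_IMPL_CLASSNAME
# a count of 0 means the Hive on the classpath does not define HIVE_SESSION_IMPL_CLASSNAME, which matches the NoSuchFieldError above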

Thanks and regards
Sayantan


From: amareshwarisr . [mailto:amareshwari@gmail.com]
Sent: Wednesday, September 14, 2016 2:17 PM
To: user@lens.apache.org
Subject: Re: Error in installing Lens: BUILD error when trying to compile hive-release-2.1.3-inm-fix on CDH 5.8 on AWS

Sayantan,

Lens releases <= 2.5 work with hive-0.13.*; 2.6 (which is in the works) and onwards will work with hive-2.1.x.

The 2.6 release should be out in a couple of weeks; until then you can build Lens from source, if required.
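
For reference, building Lens from source follows the usual Maven flow. A rough sketch (the repository URL and the lens-dist module name are my assumptions, not something stated in this thread):

$ git clone https://github.com/apache/lens.git   # assumed mirror location
$ cd lens
$ mvn clean install -DskipTests                  # the packaged server should land under lens-dist/target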

Thanks

On Wed, Sep 14, 2016 at 2:07 PM, <Sayantan.Raha@cognizant.com> wrote:
I am using Java version:

java version "1.7.0_67"
Java(TM) SE Runtime Environment (build 1.7.0_67-b01)
Java HotSpot(TM) 64-Bit Server VM (build 24.65-b04, mixed mode)

I tried using the Apache distribution of Hive 2.1.0 and pointed HIVE_HOME to the install directory. Then, when I tried to bring up Lens, I got the following exception in “lensserver.out..”:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/ubuntu/apache-lens-2.5.0-beta-bin/server/webapp/lens-server/WEB-INF/lib/logback-classic-1.1.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/ubuntu/apache-hive-2.1.0-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [ch.qos.logback.classic.util.ContextSelectorStaticBinder]
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hive.service.cli.CLIService: method <init>()V not found
        at org.apache.lens.server.LensServices.init(LensServices.java:186)
        at org.apache.lens.server.LensServer.startServices(LensServer.java:134)
        at org.apache.lens.server.LensServer.<init>(LensServer.java:85)
        at org.apache.lens.server.LensServer.createLensServer(LensServer.java:74)
        at org.apache.lens.server.LensServer.main(LensServer.java:190)
The server is not coming up, and the other logs have no further details either. Hence I tried to recompile Hive, which is also not working.
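
One way to see which constructors the CLIService class on the classpath actually exposes is javap. A sketch, assuming HIVE_HOME is the apache-hive-2.1.0-bin directory shown in the log above:

$ javap -classpath "$HIVE_HOME"/lib/hive-service-2.1.0.jar org.apache.hive.service.cli.CLIService | grep CLIService
# if no no-argument constructor is listed, that matches the NoSuchMethodError for <init>()V above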
I have tried with commons-collections 3.2.2 as well and get the same error.

I ran the following command, which confirms that 'UnmodifiableMap.class' is in the jar:

$jar tf commons-collections-3.2.2.jar |grep 'UnmodifiableMap'
org/apache/commons/collections/keyvalue/UnmodifiableMapEntry.class
org/apache/commons/collections/map/UnmodifiableMap.class
org/apache/commons/collections/iterators/UnmodifiableMapIterator.class

Please let me know if there is a way to fix this, or whether we need to wait until the Lens release that works with these Hive distributions comes out.

Also, can you kindly clarify whether CDH is a supported platform for Lens? I am asking because CDH changes the install path and directory structure of the Hadoop distribution, so lens-ctl, which has paths to various distribution jars, does not work at all on CDH. I had to manually change all of those to reflect the proper paths in CDH (I am not sure whether I should be doing this).

Thanks for your help.

Regards
Sayantan

From: amareshwarisr . [mailto:amareshwari@gmail.com]
Sent: Wednesday, September 14, 2016 1:42 PM
To: user@lens.apache.org
Subject: Re: Error in installing Lens: BUILD error when trying to compile hive-release-2.1.3-inm-fix on CDH 5.8 on AWS

Not sure which version of Java you are using. Downloading the Hive 2.1.0 release directly from Apache should work for you.
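
For completeness, the steps as a sketch (the archive URL is the standard Apache archive location and is my assumption):

$ wget https://archive.apache.org/dist/hive/hive-2.1.0/apache-hive-2.1.0-bin.tar.gz
$ tar -xzf apache-hive-2.1.0-bin.tar.gz
$ export HIVE_HOME=$PWD/apache-hive-2.1.0-bin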

On Wed, Sep 14, 2016 at 12:47 PM, Puneet Gupta <puneet.gupta@inmobi.com> wrote:
Hi Sayantan

I can see commons-collections4 on your classpath, which does not have "org/apache/commons/collections/map/UnmodifiableMap". This class is present in commons-collections 3.
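
The difference is visible in the jar contents: collections4 repackaged everything under org.apache.commons.collections4, so it cannot satisfy a reference to the old package. A quick illustration, using the jar names mentioned elsewhere in this thread:

$ jar tf commons-collections4-4.1.jar | grep 'map/UnmodifiableMap.class'
# expected: org/apache/commons/collections4/map/UnmodifiableMap.class  (note the "collections4" package)
$ jar tf commons-collections-3.2.2.jar | grep 'collections/map/UnmodifiableMap.class'
# expected: org/apache/commons/collections/map/UnmodifiableMap.class   (the package named in the NoClassDefFoundError)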

I generally compile the Lens/Hive code with HDP hadoop-2.6.0.2.2.0.0-2041 and above.

PS: We are moving to Apache Hive in Lens version 2.6, which should be out in a few weeks. You can find more info about it on the Lens user mailing list.


Thanks,
Puneet Gupta

On Wed, Sep 14, 2016 at 12:09 PM, <Sayantan.Raha@cognizant.com> wrote:
I have tried all of the following versions of Hive and have the same issue with all of them:

1. hive-release-2.1.3-inm-fix
2. hive-release-2.1.3-inm
3. hive-release-2.1.0-inm

Thanks and regards
Sayantan Raha


From: Rajat Khandelwal [mailto:rajatgupta59@gmail.com]
Sent: Wednesday, September 14, 2016 12:03 PM
To: user@lens.apache.org
Subject: Re: Error in installing Lens: BUILD error when trying to compile hive-release-2.1.3-inm-fix on CDH 5.8 on AWS

Please download version 2.1.3-inm instead of 2.1.3-inm-fix

On Wed, Sep 14, 2016 at 11:55 AM <Sayantan.Raha@cognizant.com> wrote:
Hi,

I am facing a small issue when I am trying to compile “hive-release-2.1.3-inm-fix” on Cloudera Distribution 5.8.

Steps:

1. Download
2. tar -xvf hive-release-2.1.3-inm-fix.tar.gz
3. cd hive-hive-release-2.1.3-inm-fix/
4. mvn clean package -DskipTests -Phadoop-2,dist

Error received:

[INFO] ------------------------------------------------------------------------
[ERROR] BUILD ERROR
[INFO] ------------------------------------------------------------------------
[INFO] An Ant BuildException has occured: java.lang.NoClassDefFoundError: org/apache/commons/collections/map/UnmodifiableMap
around Ant part ...<templategen templateFile="/home/ubuntu/hive-hive-release-2.1.3-inm-fix/common/../conf/hive-default.xml.template"/>... @ 6:118 in /home/ubuntu/hive-hive-release-2.1.3-inm-fix/common/target/antrun/build-main.xml

org.apache.commons.collections.map.UnmodifiableMap
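
A common workaround for this particular failure is to make sure a commons-collections 3.x jar (not 4.x) is visible to the build. A sketch, assuming the template-generation Ant task sees the jars from HADOOP_CLASSPATH (as the rest of this thread suggests) and assuming the 3.2.2 jar has been downloaded to the home directory:

$ export HADOOP_CLASSPATH=/home/ubuntu/commons-collections-3.2.2/commons-collections-3.2.2.jar:$HADOOP_CLASSPATH
$ mvn clean package -DskipTests -Phadoop-2,dist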

Complete Build Log:
[INFO] Scanning for projects...
[WARNING]
        Profile with id: 'hadoop-2' has not been activated.

[INFO] Reactor build order:
[INFO]   Hive
[INFO]   Hive Shims Common
[INFO]   Hive Shims 0.23
[INFO]   Hive Shims Scheduler
[INFO]   Hive Shims
[INFO]   Hive Storage API
[INFO]   Hive ORC
[INFO]   Hive Common
[INFO]   Hive Service RPC
[INFO]   Hive Serde
[INFO]   Hive Metastore
[INFO]   Hive Ant Utilities
[INFO]   Hive Llap Common
[INFO]   Hive Llap Client
[INFO]   Hive Llap Tez
[INFO]   Spark Remote Client
[INFO]   Hive Query Language
[INFO]   Hive Llap Server
[INFO]   Hive Service
[INFO]   Hive Accumulo Handler
[INFO]   Hive JDBC
[INFO]   Hive Beeline
[INFO]   Hive CLI
[INFO]   Hive Contrib
[INFO]   Hive HBase Handler
[INFO]   Hive HCatalog
[INFO]   Hive HCatalog Core
[INFO]   Hive HCatalog Pig Adapter
[INFO]   Hive HCatalog Server Extensions
[INFO]   Hive HCatalog Webhcat Java Client
[INFO]   Hive HCatalog Webhcat
[INFO]   Hive HCatalog Streaming
[INFO]   Hive HPL/SQL
[INFO]   Hive HWI
[INFO]   Hive Llap External Client
[INFO]   Hive Shims Aggregator
[INFO]   Hive TestUtils
[INFO]   Hive Packaging
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive
[INFO]    task-segment: [clean, package]
[INFO] ------------------------------------------------------------------------
[INFO] [clean:clean {execution: default-clean}]
[INFO] Deleting /home/ubuntu/hive-hive-release-2.1.3-inm-fix (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] [enforcer:enforce {execution: enforce-no-snapshots}]
-----------------------------------------------------
this realm = app0.child-container[org.apache.maven.plugins:maven-enforcer-plugin:1.3.1]
urls[0] = file:/home/ubuntu/.m2/repository/org/apache/maven/plugins/maven-enforcer-plugin/1.3.1/maven-enforcer-plugin-1.3.1.jar
urls[1] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-utils/1.5.8/plexus-utils-1.5.8.jar
urls[2] = file:/home/ubuntu/.m2/repository/commons-lang/commons-lang/2.3/commons-lang-2.3.jar
urls[3] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-api/1.3.1/enforcer-api-1.3.1.jar
urls[4] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-rules/1.3.1/enforcer-rules-1.3.1.jar
urls[5] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-common-artifact-filters/1.4/maven-common-artifact-filters-1.4.jar
urls[6] = file:/home/ubuntu/.m2/repository/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar
urls[7] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-dependency-tree/2.1/maven-dependency-tree-2.1.jar
urls[8] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
urls[9] = file:/home/ubuntu/.m2/repository/org/eclipse/aether/aether-util/0.9.0.M2/aether-util-0.9.0.M2.jar
urls[10] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-i18n/1.0-beta-6/plexus-i18n-1.0-beta-6.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185
import: org.codehaus.classworlds.Entry@3fee8e37
import: org.codehaus.classworlds.Entry@3fee19d8


this realm = plexus.core
urls[0] = file:/usr/share/maven2/lib/maven-debian-uber.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185
import: org.codehaus.classworlds.Entry@3fee8e37
import: org.codehaus.classworlds.Entry@3fee19d8
-----------------------------------------------------
[INFO] [remote-resources:process {execution: default}]
[INFO] [antrun:run {execution: define-classpath}]
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] [antrun:run {execution: setup-test-dirs}]
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/target/tmp
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/target/warehouse
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/target/tmp/conf
     [copy] Copying 15 files to /home/ubuntu/hive-hive-release-2.1.3-inm-fix/target/tmp/conf
[INFO] Executed tasks
[INFO] [site:attach-descriptor {execution: default-attach-descriptor}]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims Common
[INFO]    task-segment: [clean, package]
[INFO] ------------------------------------------------------------------------
[INFO] [clean:clean {execution: default-clean}]
[INFO] Deleting /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] [enforcer:enforce {execution: enforce-no-snapshots}]
-----------------------------------------------------
this realm = app0.child-container[org.apache.maven.plugins:maven-enforcer-plugin:1.3.1]
urls[0] = file:/home/ubuntu/.m2/repository/org/apache/maven/plugins/maven-enforcer-plugin/1.3.1/maven-enforcer-plugin-1.3.1.jar
urls[1] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-utils/1.5.8/plexus-utils-1.5.8.jar
urls[2] = file:/home/ubuntu/.m2/repository/commons-lang/commons-lang/2.3/commons-lang-2.3.jar
urls[3] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-api/1.3.1/enforcer-api-1.3.1.jar
urls[4] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-rules/1.3.1/enforcer-rules-1.3.1.jar
urls[5] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-common-artifact-filters/1.4/maven-common-artifact-filters-1.4.jar
urls[6] = file:/home/ubuntu/.m2/repository/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar
urls[7] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-dependency-tree/2.1/maven-dependency-tree-2.1.jar
urls[8] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
urls[9] = file:/home/ubuntu/.m2/repository/org/eclipse/aether/aether-util/0.9.0.M2/aether-util-0.9.0.M2.jar
urls[10] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-i18n/1.0-beta-6/plexus-i18n-1.0-beta-6.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185
import: org.codehaus.classworlds.Entry@3fee8e37
import: org.codehaus.classworlds.Entry@3fee19d8


this realm = plexus.core
urls[0] = file:/usr/share/maven2/lib/maven-debian-uber.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185
import: org.codehaus.classworlds.Entry@3fee8e37
import: org.codehaus.classworlds.Entry@3fee19d8
-----------------------------------------------------
[INFO] [remote-resources:process {execution: default}]
[INFO] [resources:resources {execution: default-resources}]
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/src/main/resources
[INFO] Copying 3 resources
[INFO] [antrun:run {execution: define-classpath}]
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] [compiler:compile {execution: default-compile}]
[INFO] Compiling 30 source files to /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/target/classes
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/src/main/java/org/apache/hadoop/hive/thrift/HadoopThriftAuthBridge.java: Some input files use or override a deprecated API.
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/src/main/java/org/apache/hadoop/hive/thrift/HadoopThriftAuthBridge.java: Recompile with -Xlint:deprecation for details.
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/src/main/java/org/apache/hadoop/hive/shims/Utils.java: Some input files use unchecked or unsafe operations.
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/src/main/java/org/apache/hadoop/hive/shims/Utils.java: Recompile with -Xlint:unchecked for details.
[INFO] [resources:testResources {execution: default-testResources}]
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/src/test/resources
[INFO] Copying 3 resources
[INFO] [antrun:run {execution: setup-test-dirs}]
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/target/tmp
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/target/warehouse
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/target/tmp/conf
     [copy] Copying 15 files to /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/target/tmp/conf
[INFO] Executed tasks
[INFO] [compiler:testCompile {execution: default-testCompile}]
[INFO] No sources to compile
[INFO] [surefire:test {execution: default-test}]
[INFO] Tests are skipped.
[INFO] [jar:jar {execution: default-jar}]
[INFO] Building jar: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/common/target/hive-shims-common-2.1.3-inm-fix.jar
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims 0.23
[INFO]    task-segment: [clean, package]
[INFO] ------------------------------------------------------------------------
[INFO] [clean:clean {execution: default-clean}]
[INFO] Deleting /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23 (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] [enforcer:enforce {execution: enforce-no-snapshots}]
-----------------------------------------------------
this realm = app0.child-container[org.apache.maven.plugins:maven-enforcer-plugin:1.3.1]
urls[0] = file:/home/ubuntu/.m2/repository/org/apache/maven/plugins/maven-enforcer-plugin/1.3.1/maven-enforcer-plugin-1.3.1.jar
urls[1] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-utils/1.5.8/plexus-utils-1.5.8.jar
urls[2] = file:/home/ubuntu/.m2/repository/commons-lang/commons-lang/2.3/commons-lang-2.3.jar
urls[3] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-api/1.3.1/enforcer-api-1.3.1.jar
urls[4] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-rules/1.3.1/enforcer-rules-1.3.1.jar
urls[5] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-common-artifact-filters/1.4/maven-common-artifact-filters-1.4.jar
urls[6] = file:/home/ubuntu/.m2/repository/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar
urls[7] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-dependency-tree/2.1/maven-dependency-tree-2.1.jar
urls[8] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
urls[9] = file:/home/ubuntu/.m2/repository/org/eclipse/aether/aether-util/0.9.0.M2/aether-util-0.9.0.M2.jar
urls[10] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-i18n/1.0-beta-6/plexus-i18n-1.0-beta-6.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185
import: org.codehaus.classworlds.Entry@3fee8e37
import: org.codehaus.classworlds.Entry@3fee19d8


this realm = plexus.core
urls[0] = file:/usr/share/maven2/lib/maven-debian-uber.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185
import: org.codehaus.classworlds.Entry@3fee8e37
import: org.codehaus.classworlds.Entry@3fee19d8
-----------------------------------------------------
[INFO] [remote-resources:process {execution: default}]
[INFO] [resources:resources {execution: default-resources}]
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/src/main/resources
[INFO] Copying 3 resources
[INFO] [antrun:run {execution: define-classpath}]
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] [compiler:compile {execution: default-compile}]
[INFO] Compiling 5 source files to /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/target/classes
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java uses or overrides a deprecated API.
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java: Recompile with -Xlint:deprecation for details.
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java uses unchecked or unsafe operations.
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java: Recompile with -Xlint:unchecked for details.
[INFO] [resources:testResources {execution: default-testResources}]
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/src/test/resources
[INFO] Copying 3 resources
[INFO] [antrun:run {execution: setup-test-dirs}]
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/target/tmp
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/target/warehouse
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/target/tmp/conf
     [copy] Copying 15 files to /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/target/tmp/conf
[INFO] Executed tasks
[INFO] [compiler:testCompile {execution: default-testCompile}]
[INFO] No sources to compile
[INFO] [surefire:test {execution: default-test}]
[INFO] Tests are skipped.
[INFO] [jar:jar {execution: default-jar}]
[INFO] Building jar: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/0.23/target/hive-shims-0.23-2.1.3-inm-fix.jar
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims Scheduler
[INFO]    task-segment: [clean, package]
[INFO] ------------------------------------------------------------------------
[INFO] [clean:clean {execution: default-clean}]
[INFO] Deleting /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/scheduler (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] [enforcer:enforce {execution: enforce-no-snapshots}]
-----------------------------------------------------
this realm = app0.child-container[org.apache.maven.plugins:maven-enforcer-plugin:1.3.1]
urls[0] = file:/home/ubuntu/.m2/repository/org/apache/maven/plugins/maven-enforcer-plugin/1.3.1/maven-enforcer-plugin-1.3.1.jar
urls[1] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-utils/1.5.8/plexus-utils-1.5.8.jar
urls[2] = file:/home/ubuntu/.m2/repository/commons-lang/commons-lang/2.3/commons-lang-2.3.jar
urls[3] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-api/1.3.1/enforcer-api-1.3.1.jar
urls[4] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-rules/1.3.1/enforcer-rules-1.3.1.jar
urls[5] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-common-artifact-filters/1.4/maven-common-artifact-filters-1.4.jar
urls[6] = file:/home/ubuntu/.m2/repository/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar
urls[7] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-dependency-tree/2.1/maven-dependency-tree-2.1.jar
urls[8] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
urls[9] = file:/home/ubuntu/.m2/repository/org/eclipse/aether/aether-util/0.9.0.M2/aether-util-0.9.0.M2.jar
urls[10] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-i18n/1.0-beta-6/plexus-i18n-1.0-beta-6.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185
import: org.codehaus.classworlds.Entry@3fee8e37
import: org.codehaus.classworlds.Entry@3fee19d8


this realm = plexus.core
urls[0] = file:/usr/share/maven2/lib/maven-debian-uber.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185
import: org.codehaus.classworlds.Entry@3fee8e37
import: org.codehaus.classworlds.Entry@3fee19d8
-----------------------------------------------------
[INFO] [remote-resources:process {execution: default}]
[INFO] [resources:resources {execution: default-resources}]
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/scheduler/src/main/resources
[INFO] Copying 3 resources
[INFO] [antrun:run {execution: define-classpath}]
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] [compiler:compile {execution: default-compile}]
[INFO] Compiling 1 source file to /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/scheduler/target/classes
[INFO] [resources:testResources {execution: default-testResources}]
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/scheduler/src/test/resources
[INFO] Copying 3 resources
[INFO] [antrun:run {execution: setup-test-dirs}]
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/scheduler/target/tmp
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/scheduler/target/warehouse
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/scheduler/target/tmp/conf
     [copy] Copying 15 files to /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/scheduler/target/tmp/conf
[INFO] Executed tasks
[INFO] [compiler:testCompile {execution: default-testCompile}]
[INFO] No sources to compile
[INFO] [surefire:test {execution: default-test}]
[INFO] Tests are skipped.
[INFO] [jar:jar {execution: default-jar}]
[INFO] Building jar: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/scheduler/target/hive-shims-scheduler-2.1.3-inm-fix.jar
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims
[INFO]    task-segment: [clean, package]
[INFO] ------------------------------------------------------------------------
[INFO] [clean:clean {execution: default-clean}]
[INFO] Deleting /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/aggregator (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] [enforcer:enforce {execution: enforce-no-snapshots}]
-----------------------------------------------------
this realm = app0.child-container[org.apache.maven.plugins:maven-enforcer-plugin:1.3.1]
urls[0] = file:/home/ubuntu/.m2/repository/org/apache/maven/plugins/maven-enforcer-plugin/1.3.1/maven-enforcer-plugin-1.3.1.jar
urls[1] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-utils/1.5.8/plexus-utils-1.5.8.jar
urls[2] = file:/home/ubuntu/.m2/repository/commons-lang/commons-lang/2.3/commons-lang-2.3.jar
urls[3] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-api/1.3.1/enforcer-api-1.3.1.jar
urls[4] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-rules/1.3.1/enforcer-rules-1.3.1.jar
urls[5] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-common-artifact-filters/1.4/maven-common-artifact-filters-1.4.jar
urls[6] = file:/home/ubuntu/.m2/repository/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar
urls[7] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-dependency-tree/2.1/maven-dependency-tree-2.1.jar
urls[8] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
urls[9] = file:/home/ubuntu/.m2/repository/org/eclipse/aether/aether-util/0.9.0.M2/aether-util-0.9.0.M2.jar
urls[10] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-i18n/1.0-beta-6/plexus-i18n-1.0-beta-6.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185
import: org.codehaus.classworlds.Entry@3fee8e37
import: org.codehaus.classworlds.Entry@3fee19d8


this realm = plexus.core
urls[0] = file:/usr/share/maven2/lib/maven-debian-uber.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185
import: org.codehaus.classworlds.Entry@3fee8e37
import: org.codehaus.classworlds.Entry@3fee19d8
-----------------------------------------------------
[INFO] [remote-resources:process {execution: default}]
[INFO] [resources:resources {execution: default-resources}]
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/aggregator/src/main/resources
[INFO] Copying 3 resources
[INFO] [antrun:run {execution: define-classpath}]
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] [compiler:compile {execution: default-compile}]
[INFO] No sources to compile
[INFO] [resources:testResources {execution: default-testResources}]
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/aggregator/src/test/resources
[INFO] Copying 3 resources
[INFO] [antrun:run {execution: setup-test-dirs}]
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/aggregator/target/tmp
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/aggregator/target/warehouse
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/aggregator/target/tmp/conf
     [copy] Copying 15 files to /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/aggregator/target/tmp/conf
[INFO] Executed tasks
[INFO] [compiler:testCompile {execution: default-testCompile}]
[INFO] No sources to compile
[INFO] [surefire:test {execution: default-test}]
[INFO] Tests are skipped.
[INFO] [jar:jar {execution: default-jar}]
[INFO] Building jar: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/shims/aggregator/target/hive-shims-2.1.3-inm-fix.jar
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Storage API
[INFO]    task-segment: [clean, package]
[INFO] ------------------------------------------------------------------------
[INFO] [clean:clean {execution: default-clean}]
[INFO] Deleting /home/ubuntu/hive-hive-release-2.1.3-inm-fix/storage-api (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] [enforcer:enforce {execution: enforce-no-snapshots}]
-----------------------------------------------------
this realm = app0.child-container[org.apache.maven.plugins:maven-enforcer-plugin:1.3.1]
urls[0] = file:/home/ubuntu/.m2/repository/org/apache/maven/plugins/maven-enforcer-plugin/1.3.1/maven-enforcer-plugin-1.3.1.jar
urls[1] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-utils/1.5.8/plexus-utils-1.5.8.jar
urls[2] = file:/home/ubuntu/.m2/repository/commons-lang/commons-lang/2.3/commons-lang-2.3.jar
urls[3] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-api/1.3.1/enforcer-api-1.3.1.jar
urls[4] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-rules/1.3.1/enforcer-rules-1.3.1.jar
urls[5] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-common-artifact-filters/1.4/maven-common-artifact-filters-1.4.jar
urls[6] = file:/home/ubuntu/.m2/repository/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar
urls[7] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-dependency-tree/2.1/maven-dependency-tree-2.1.jar
urls[8] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
urls[9] = file:/home/ubuntu/.m2/repository/org/eclipse/aether/aether-util/0.9.0.M2/aether-util-0.9.0.M2.jar
urls[10] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-i18n/1.0-beta-6/plexus-i18n-1.0-beta-6.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185
import: org.codehaus.classworlds.Entry@3fee8e37
import: org.codehaus.classworlds.Entry@3fee19d8


this realm = plexus.core
urls[0] = file:/usr/share/maven2/lib/maven-debian-uber.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185
import: org.codehaus.classworlds.Entry@3fee8e37
import: org.codehaus.classworlds.Entry@3fee19d8
-----------------------------------------------------
[INFO] [remote-resources:process {execution: default}]
[INFO] [resources:resources {execution: default-resources}]
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/ubuntu/hive-hive-release-2.1.3-inm-fix/storage-api/src/main/resources
[INFO] Copying 3 resources
[INFO] [antrun:run {execution: define-classpath}]
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] [compiler:compile {execution: default-compile}]
[INFO] Compiling 37 source files to /home/ubuntu/hive-hive-release-2.1.3-inm-fix/storage-api/target/classes
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/storage-api/src/java/org/apache/hadoop/hive/common/type/HiveIntervalDayTime.java:[30,25] sun.util.calendar.BaseCalendar is internal proprietary API and may be removed in a future release
[INFO] [resources:testResources {execution: default-testResources}]
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/ubuntu/hive-hive-release-2.1.3-inm-fix/storage-api/src/test/resources
[INFO] Copying 3 resources
[INFO] [antrun:run {execution: setup-test-dirs}]
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/storage-api/target/tmp
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/storage-api/target/warehouse
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/storage-api/target/tmp/conf
     [copy] Copying 15 files to /home/ubuntu/hive-hive-release-2.1.3-inm-fix/storage-api/target/tmp/conf
[INFO] Executed tasks
[INFO] [compiler:testCompile {execution: default-testCompile}]
[INFO] Compiling 7 source files to /home/ubuntu/hive-hive-release-2.1.3-inm-fix/storage-api/target/test-classes
[INFO] [surefire:test {execution: default-test}]
[INFO] Tests are skipped.
[INFO] [jar:jar {execution: default-jar}]
[INFO] Building jar: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/storage-api/target/hive-storage-api-2.1.3-inm-fix.jar
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive ORC
[INFO]    task-segment: [clean, package]
[INFO] ------------------------------------------------------------------------
[INFO] [clean:clean {execution: default-clean}]
[INFO] Deleting /home/ubuntu/hive-hive-release-2.1.3-inm-fix/orc (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] [enforcer:enforce {execution: enforce-no-snapshots}]
-----------------------------------------------------
this realm = app0.child-container[org.apache.maven.plugins:maven-enforcer-plugin:1.3.1]
urls[0] = file:/home/ubuntu/.m2/repository/org/apache/maven/plugins/maven-enforcer-plugin/1.3.1/maven-enforcer-plugin-1.3.1.jar
urls[1] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-utils/1.5.8/plexus-utils-1.5.8.jar
urls[2] = file:/home/ubuntu/.m2/repository/commons-lang/commons-lang/2.3/commons-lang-2.3.jar
urls[3] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-api/1.3.1/enforcer-api-1.3.1.jar
urls[4] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-rules/1.3.1/enforcer-rules-1.3.1.jar
urls[5] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-common-artifact-filters/1.4/maven-common-artifact-filters-1.4.jar
urls[6] = file:/home/ubuntu/.m2/repository/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar
urls[7] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-dependency-tree/2.1/maven-dependency-tree-2.1.jar
urls[8] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
urls[9] = file:/home/ubuntu/.m2/repository/org/eclipse/aether/aether-util/0.9.0.M2/aether-util-0.9.0.M2.jar
urls[10] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-i18n/1.0-beta-6/plexus-i18n-1.0-beta-6.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185
import: org.codehaus.classworlds.Entry@3fee8e37
import: org.codehaus.classworlds.Entry@3fee19d8


this realm = plexus.core
urls[0] = file:/usr/share/maven2/lib/maven-debian-uber.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185
import: org.codehaus.classworlds.Entry@3fee8e37
import: org.codehaus.classworlds.Entry@3fee19d8
-----------------------------------------------------
[INFO] [build-helper:add-source {execution: add-source}]
[INFO] Source directory: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/orc/src/gen/protobuf-java added.
[INFO] [remote-resources:process {execution: default}]
[INFO] [resources:resources {execution: default-resources}]
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/ubuntu/hive-hive-release-2.1.3-inm-fix/orc/src/main/resources
[INFO] Copying 3 resources
[INFO] [antrun:run {execution: define-classpath}]
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] [compiler:compile {execution: default-compile}]
[INFO] Compiling 71 source files to /home/ubuntu/hive-hive-release-2.1.3-inm-fix/orc/target/classes
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/orc/src/java/org/apache/orc/impl/ReaderImpl.java: Some input files use or override a deprecated API.
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/orc/src/java/org/apache/orc/impl/ReaderImpl.java: Recompile with -Xlint:deprecation for details.
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/orc/src/java/org/apache/orc/impl/RecordReaderImpl.java: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/orc/src/java/org/apache/orc/impl/RecordReaderImpl.java uses unchecked or unsafe operations.
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/orc/src/java/org/apache/orc/impl/RecordReaderImpl.java: Recompile with -Xlint:unchecked for details.
[INFO] [resources:testResources {execution: default-testResources}]
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 7 resources
[INFO] Copying 3 resources
[INFO] [antrun:run {execution: setup-test-dirs}]
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/orc/target/tmp
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/orc/target/warehouse
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/orc/target/tmp/conf
     [copy] Copying 15 files to /home/ubuntu/hive-hive-release-2.1.3-inm-fix/orc/target/tmp/conf
[INFO] Executed tasks
[INFO] [compiler:testCompile {execution: default-testCompile}]
[INFO] Compiling 31 source files to /home/ubuntu/hive-hive-release-2.1.3-inm-fix/orc/target/test-classes
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/orc/src/test/org/apache/orc/TestOrcTimezone1.java: Some input files use or override a deprecated API.
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/orc/src/test/org/apache/orc/TestOrcTimezone1.java: Recompile with -Xlint:deprecation for details.
[INFO] [surefire:test {execution: default-test}]
[INFO] Tests are skipped.
[INFO] [jar:jar {execution: default-jar}]
[INFO] Building jar: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/orc/target/hive-orc-2.1.3-inm-fix.jar
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Common
[INFO]    task-segment: [clean, package]
[INFO] ------------------------------------------------------------------------
[INFO] [clean:clean {execution: default-clean}]
[INFO] Deleting /home/ubuntu/hive-hive-release-2.1.3-inm-fix/common (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] [enforcer:enforce {execution: enforce-no-snapshots}]
-----------------------------------------------------
this realm = app0.child-container[org.apache.maven.plugins:maven-enforcer-plugin:1.3.1]
urls[0] = file:/home/ubuntu/.m2/repository/org/apache/maven/plugins/maven-enforcer-plugin/1.3.1/maven-enforcer-plugin-1.3.1.jar
urls[1] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-utils/1.5.8/plexus-utils-1.5.8.jar
urls[2] = file:/home/ubuntu/.m2/repository/commons-lang/commons-lang/2.3/commons-lang-2.3.jar
urls[3] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-api/1.3.1/enforcer-api-1.3.1.jar
urls[4] = file:/home/ubuntu/.m2/repository/org/apache/maven/enforcer/enforcer-rules/1.3.1/enforcer-rules-1.3.1.jar
urls[5] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-common-artifact-filters/1.4/maven-common-artifact-filters-1.4.jar
urls[6] = file:/home/ubuntu/.m2/repository/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar
urls[7] = file:/home/ubuntu/.m2/repository/org/apache/maven/shared/maven-dependency-tree/2.1/maven-dependency-tree-2.1.jar
urls[8] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
urls[9] = file:/home/ubuntu/.m2/repository/org/eclipse/aether/aether-util/0.9.0.M2/aether-util-0.9.0.M2.jar
urls[10] = file:/home/ubuntu/.m2/repository/org/codehaus/plexus/plexus-i18n/1.0-beta-6/plexus-i18n-1.0-beta-6.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185
import: org.codehaus.classworlds.Entry@3fee8e37
import: org.codehaus.classworlds.Entry@3fee19d8


this realm = plexus.core
urls[0] = file:/usr/share/maven2/lib/maven-debian-uber.jar
Number of imports: 10
import: org.codehaus.classworlds.Entry@a6c57a42
import: org.codehaus.classworlds.Entry@12f43f3b
import: org.codehaus.classworlds.Entry@20025374
import: org.codehaus.classworlds.Entry@f8e44ca4
import: org.codehaus.classworlds.Entry@92758522
import: org.codehaus.classworlds.Entry@ebf2705b
import: org.codehaus.classworlds.Entry@bb25e54
import: org.codehaus.classworlds.Entry@bece5185
import: org.codehaus.classworlds.Entry@3fee8e37
import: org.codehaus.classworlds.Entry@3fee19d8
-----------------------------------------------------
[INFO] [antrun:run {execution: generate-version-annotation}]
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] [build-helper:add-source {execution: add-source}]
[INFO] Source directory: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/common/src/gen added.
[INFO] [remote-resources:process {execution: default}]
[INFO] [resources:resources {execution: default-resources}]
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 2 resources
[INFO] Copying 3 resources
[INFO] [antrun:run {execution: define-classpath}]
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] [compiler:compile {execution: default-compile}]
[INFO] Compiling 85 source files to /home/ubuntu/hive-hive-release-2.1.3-inm-fix/common/target/classes
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/common/src/java/org/apache/hadoop/hive/common/JvmPauseMonitor.java: Some input files use or override a deprecated API.
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/common/src/java/org/apache/hadoop/hive/common/JvmPauseMonitor.java: Recompile with -Xlint:deprecation for details.
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/common/src/java/org/apache/hadoop/hive/common/ObjectPair.java: Some input files use unchecked or unsafe operations.
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/common/src/java/org/apache/hadoop/hive/common/ObjectPair.java: Recompile with -Xlint:unchecked for details.
[INFO] [resources:testResources {execution: default-testResources}]
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 3 resources
[INFO] Copying 3 resources
[INFO] [antrun:run {execution: setup-test-dirs}]
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/common/target/tmp
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/common/target/warehouse
    [mkdir] Created dir: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/common/target/tmp/conf
     [copy] Copying 15 files to /home/ubuntu/hive-hive-release-2.1.3-inm-fix/common/target/tmp/conf
[INFO] Executed tasks
[INFO] [compiler:testCompile {execution: default-testCompile}]
[INFO] Compiling 25 source files to /home/ubuntu/hive-hive-release-2.1.3-inm-fix/common/target/test-classes
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/common/src/test/org/apache/hadoop/hive/conf/TestVariableSubstitution.java: Some input files use or override a deprecated API.
[WARNING] /home/ubuntu/hive-hive-release-2.1.3-inm-fix/common/src/test/org/apache/hadoop/hive/conf/TestVariableSubstitution.java: Recompile with -Xlint:deprecation for details.
[INFO] [surefire:test {execution: default-test}]
[INFO] Tests are skipped.
[INFO] [jar:jar {execution: default-jar}]
[INFO] Building jar: /home/ubuntu/hive-hive-release-2.1.3-inm-fix/common/target/hive-common-2.1.3-inm-fix.jar
[INFO] [antrun:run {execution: generate-template}]
[INFO] Executing tasks

main:
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
log4j:WARN No appenders could be found for logger (org.apache.hadoop.util.Shell).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
[INFO] ------------------------------------------------------------------------
[ERROR] BUILD ERROR
[INFO] ------------------------------------------------------------------------
[INFO] An Ant BuildException has occured: java.lang.NoClassDefFoundError: org/apache/commons/collections/map/UnmodifiableMap
around Ant part ...<templategen templateFile="/home/ubuntu/hive-hive-release-2.1.3-inm-fix/common/../conf/hive-default.xml.template"/>... @ 6:118 in /home/ubuntu/hive-hive-release-2.1.3-inm-fix/common/target/antrun/build-main.xml

org.apache.commons.collections.map.UnmodifiableMap
[INFO] ------------------------------------------------------------------------
[INFO] For more information, run Maven with the -e switch
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 15 seconds
[INFO] Finished at: Wed Sep 14 06:19:09 UTC 2016
[INFO] Final Memory: 121M/1394M
[INFO] ------------------------------------------------------------------------
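
For what it is worth, the log's own hint can be followed here: re-running only the failing common module with the -e (and -X) switches should print the full BuildException stack trace, which would make it clearer which classloader cannot find UnmodifiableMap. This is only a sketch; the directory and -DskipTests match the output above, while the install goal is an assumption about the original command:

cd /home/ubuntu/hive-hive-release-2.1.3-inm-fix/common
# Re-run only this module with full error output (-e) and debug logging (-X),
# skipping tests as in the log above; 'install' is an assumed goal.
mvn -e -X -DskipTests install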


Further inputs:

The Cloudera distribution does not keep the standard folder structure for Hadoop, and it does not set the HADOOP_CLASSPATH and HADOOP_HOME environment variables. I had set HADOOP_CLASSPATH manually to the following (a shorter way to build it is sketched after the export):
export HADOOP_CLASSPATH=/home/ubuntu/commons-collections4-4.1/commons-collections4-4.1.jar:/usr/lib/jvm/java-7-oracle-cloudera/lib/tools.jar:hadoop-core-2.6.0-mr1-cdh5.8.0.jar::/opt/cloudera/parcels/CDH/jars/guava-11.0.2.jar:/opt/cloudera/parcels/CDH/jars/guava-11.0.jar:/opt/cloudera/parcels/CDH/jars/guava-12.0.1.jar:/opt/cloudera/parcels/CDH/jars/guava-14.0.1.jar:/opt/cloudera/parcels/CDH/jars/guava-14.0.jar:/opt/cloudera/parcels/CDH/jars/guava-15.0.jar::/opt/cloudera/parcels/CDH/jars/commons-configuration-1.6.jar:/opt/cloudera/parcels/CDH/jars/commons-configuration-1.7.jar::/opt/cloudera/parcels/CDH/jars/protobuf-java-2.4.1-shaded.jar:/opt/cloudera/parcels/CDH/jars/protobuf-java-2.5.0.jar::/opt/cloudera/parcels/CDH/jars/htrace-core-3.0.4.jar:/opt/cloudera/parcels/CDH/jars/htrace-core-3.2.0-incubating.jar::/opt/cloudera/parcels/CDH/jars/commons-configuration-1.6.jar:/opt/cloudera/parcels/CDH/jars/commons-configuration-1.7.jar::/opt/cloudera/parcels/CDH/jars/protobuf-java-2.4.1-shaded.jar:/opt/cloudera/parcels/CDH/jars/protobuf-java-2.5.0.jar::/opt/cloudera/parcels/CDH/jars/hadoop-annotations-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-ant-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-ant-2.6.0-mr1-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-archive-logs-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-archives-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-auth-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-aws-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-azure-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-capacity-scheduler-2.6.0-mr1-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-common-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-common-2.6.0-cdh5.8.0-tests.jar:/opt/cloudera/parcels/CDH/jars/hadoop-core-2.6.0-mr1-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-datajoin-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-distcp-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-examples-2.6.0-mr1-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-examples.jar:/opt/cloudera/parcels/CDH/jars/hadoop-extras-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-fairscheduler-2.6.0-mr1-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-gridmix-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-gridmix-2.6.0-mr1-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-hdfs-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-hdfs-2.6.0-cdh5.8.0-tests.jar:/opt/cloudera/parcels/CDH/jars/hadoop-hdfs-nfs-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-kms-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-mapreduce-client-app-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-mapreduce-client-common-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-mapreduce-client-core-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-mapreduce-client-hs-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-mapreduce-client-hs-plugins-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-mapreduce-client-jobclient-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-mapreduce-client-jobclient-2.6.0-cdh5.8.0-tests.jar:/opt/cloudera/parcels/CDH/jars/hadoop-mapreduce-client-nativetask-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-mapreduce-client-shuffle-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-mapreduce-examples-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-nfs-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-o
penstack-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-rumen-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-sls-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-streaming-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-streaming-2.6.0-mr1-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-test-2.6.0-mr1-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-tools-2.6.0-mr1-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-api-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-applications-distributedshell-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-client-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-common-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-registry-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-server-applicationhistoryservice-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-server-common-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-server-nodemanager-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-server-resourcemanager-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-server-tests-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-server-web-proxy-2.6.0-cdh5.8.0.jar::/opt/cloudera/parcels/CDH/jars/hadoop-common-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-common-2.6.0-cdh5.8.0-tests.jar::/opt/cloudera/parcels/CDH/jars/hadoop-annotations-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-ant-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-ant-2.6.0-mr1-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-archive-logs-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-archives-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-auth-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-aws-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-azure-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-capacity-scheduler-2.6.0-mr1-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-common-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-common-2.6.0-cdh5.8.0-tests.jar:/opt/cloudera/parcels/CDH/jars/hadoop-core-2.6.0-mr1-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-datajoin-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-distcp-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-examples-2.6.0-mr1-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-examples.jar:/opt/cloudera/parcels/CDH/jars/hadoop-extras-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-fairscheduler-2.6.0-mr1-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-gridmix-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-gridmix-2.6.0-mr1-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-hdfs-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-hdfs-2.6.0-cdh5.8.0-tests.jar:/opt/cloudera/parcels/CDH/jars/hadoop-hdfs-nfs-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-kms-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-mapreduce-client-app-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-mapreduce-client-common-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-mapreduce-client-core-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-mapreduce-client-hs-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-mapreduce-client-hs-plugins-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-mapreduce-client-jobclien
t-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-mapreduce-client-jobclient-2.6.0-cdh5.8.0-tests.jar:/opt/cloudera/parcels/CDH/jars/hadoop-mapreduce-client-nativetask-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-mapreduce-client-shuffle-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-mapreduce-examples-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-nfs-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-openstack-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-rumen-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-sls-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-streaming-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-streaming-2.6.0-mr1-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-test-2.6.0-mr1-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-tools-2.6.0-mr1-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-api-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-applications-distributedshell-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-client-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-common-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-registry-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-server-applicationhistoryservice-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-server-common-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-server-nodemanager-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-server-resourcemanager-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-server-tests-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/jars/hadoop-yarn-server-web-proxy-2.6.0-cdh5.8.0.jar::/opt/cloudera/parcels/CDH/lib/hadoop-hdfs/hadoop-hdfs-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-hdfs/hadoop-hdfs-2.6.0-cdh5.8.0-tests.jar:/opt/cloudera/parcels/CDH/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/opt/cloudera/parcels/CDH/lib/hadoop-hdfs/hadoop-hdfs-tests.jar:::/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-ant-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-ant.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-archive-logs-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-archive-logs.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-archives-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-archives.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-auth-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-auth.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-azure-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-azure.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-datajoin-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-datajoin.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-distcp-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-distcp.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-extras-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-extras.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-gridmix-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-gridmix.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/had
oop-mapreduce-client-app-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-cdh5.8.0-tests.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-tests.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-client-nativetask-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-client-nativetask.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-openstack-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-openstack.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-rumen-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-rumen.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-sls-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-sls.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-streaming-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-streaming.jar:::/opt/cloudera/parcels/CDH/lib/hadoop/hadoop-annotations-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop/hadoop-annotations.jar:/opt/cloudera/parcels/CDH/lib/hadoop/hadoop-auth-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop/hadoop-auth.jar:/opt/cloudera/parcels/CDH/lib/hadoop/hadoop-aws-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop/hadoop-aws.jar:/opt/cloudera/parcels/CDH/lib/hadoop/hadoop-common-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop/hadoop-common-2.6.0-cdh5.8.0-tests.jar:/opt/cloudera/parcels/CDH/lib/hadoop/hadoop-common.jar:/opt/cloudera/parcels/CDH/lib/hadoop/hadoop-common-tests.jar:/opt/cloudera/parcels/CDH/lib/hadoop/hadoop-nfs-2.6.0-cdh5.8.0.jar:/opt/cloudera/parcels/CDH/lib/hadoop/hadoop-nfs.jar
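
A caveat about the export above: the only commons-collections jar on it is commons-collections4-4.1.jar, and commons-collections4 moved every class to the org.apache.commons.collections4 package, so it cannot provide org.apache.commons.collections.map.UnmodifiableMap; that class exists only in the 3.x line of the library. It is not clear from the log whether the templategen Ant task reads HADOOP_CLASSPATH at all, but as a rough sketch the 3.x jar could be placed first and the CDH portion built with a loop instead of listing each jar by hand (the commons-collections-3.2.2 location below is an assumption; adjust it and the glob to the actual layout):

# Sketch only: CC3_JAR is an assumed download location for commons-collections 3.2.2;
# the CDH parcel path matches the export above. The glob covers only the hadoop-* jars,
# so extend it (guava, protobuf, htrace, commons-configuration, ...) as needed.
CC3_JAR=/home/ubuntu/commons-collections-3.2.2/commons-collections-3.2.2.jar
CDH_JARS=/opt/cloudera/parcels/CDH/jars
HADOOP_CLASSPATH="$CC3_JAR"
for j in "$CDH_JARS"/hadoop-*.jar; do
  HADOOP_CLASSPATH="$HADOOP_CLASSPATH:$j"
done
export HADOOP_CLASSPATH
# Quick sanity check that the 3.x jar sits at the front of the path:
echo "$HADOOP_CLASSPATH" | tr ':' '\n' | head -3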


Any help will be greatly appreciated.

Thanks and regards
Sayantan Raha

This e-mail and any files transmitted with it are for the sole use of the intended recipient(s) and may contain confidential and privileged information. If you are not the intended recipient(s), please reply to the sender and destroy all copies of the original message. Any unauthorized review, use, disclosure, dissemination, forwarding, printing or copying of this email, and/or any action taken in reliance on the contents of this e-mail is strictly prohibited and may be unlawful. Where permitted by applicable law, this e-mail and other e-mail communications sent to and from Cognizant e-mail addresses may be monitored.