carbondata-commits mailing list archives

From Apache Jenkins Server <jenk...@builds.apache.org>
Subject Build failed in Jenkins: CarbonData-master #415
Date Thu, 23 Mar 2017 10:02:07 GMT
See <https://builds.apache.org/job/CarbonData-master/415/display/redirect?page=changes>

Changes:

[ramana.gollamudi] If table block size if specified for its max value 2048, then while

------------------------------------------
[...truncated 242.70 KB...]
[INFO] --- maven-resources-plugin:2.7:resources (default-resources) @ carbondata-spark-common ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory <https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/src/resources>
[INFO] Copying 0 resource
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-scala-plugin:2.15.2:compile (default) @ carbondata-spark-common ---
[INFO] Checking for multiple versions of scala
[INFO] includes = [**/*.java,**/*.scala,]
[INFO] excludes = []
[INFO] <https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/src/main/java>:-1: info: compiling
[INFO] <https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/src/main/scala>:-1: info: compiling
[INFO] Compiling 66 source files to <https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/target/classes> at 1490261252627
[WARNING] <https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/src/main/scala/org/apache/carbondata/spark/rdd/UpdateCoalescedRDD.scala>:23: warning: imported `CoalescedRDDPartition' is permanently hidden by definition of object CoalescedRDDPartition in package rdd
[INFO] import org.apache.spark.rdd.{CoalescedRDDPartition, DataLoadPartitionCoalescer, RDD}
[INFO]                              ^
[WARNING] <https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/src/main/scala/org/apache/carbondata/spark/rdd/UpdateCoalescedRDD.scala>:23: warning: imported `DataLoadPartitionCoalescer' is permanently hidden by definition of object DataLoadPartitionCoalescer in package rdd
[INFO] import org.apache.spark.rdd.{CoalescedRDDPartition, DataLoadPartitionCoalescer, RDD}
[INFO]                                                     ^
[WARNING] <https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/src/main/scala/org/apache/carbondata/spark/rdd/UpdateCoalescedRDD.scala>:23: warning: imported `RDD' is permanently hidden by definition of object RDD in package rdd
[INFO] import org.apache.spark.rdd.{CoalescedRDDPartition, DataLoadPartitionCoalescer, RDD}
[INFO]                                                                                 ^
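[Editor's note] The three warnings above arise because UpdateCoalescedRDD.scala is declared inside package org.apache.spark.rdd, so importing names from that same package is dead code: the package's own definitions always win. A minimal sketch with invented names (demo, other, Widget — none of these are CarbonData identifiers) showing the warning scenario and the usual fix, an import rename:

```scala
// Two packages each define an object Widget.
package demo {
  object Widget { val origin = "demo" }
}

package other {
  object Widget { val origin = "other" }
}

package demo {
  object ShadowFix {
    // A plain `import other.Widget` here would be "permanently hidden" by
    // demo.Widget (the enclosing package's definition); renaming the import
    // keeps both names visible and silences the warning.
    import other.{Widget => OtherWidget}
    def origins: (String, String) = (Widget.origin, OtherWidget.origin)
  }
}

object ShadowDemoMain {
  def main(args: Array[String]): Unit =
    println(demo.ShadowFix.origins) // (demo,other)
}
```

When the imported name is never actually needed (as with an import of the enclosing package's own members), simply deleting the import also clears the warning.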
[WARNING] <https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/src/main/scala/org/apache/carbondata/spark/util/CarbonScalaUtil.scala>:181: warning: non-variable type argument Any in type pattern scala.collection.Map[Any,Any] is unchecked since it is eliminated by erasure
[INFO]         case m: scala.collection.Map[Any, Any] =>
[INFO]                                  ^
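[Editor's note] The erasure warning above means the type arguments in the pattern `scala.collection.Map[Any, Any]` cannot be checked at runtime — only the Map part is tested. A minimal sketch (invented names, not CarbonData code) showing the idiomatic wildcard form that makes this explicit:

```scala
object ErasureDemo {
  // Matching on Map[_, _] states up front that the element types are not
  // checked, which is all the JVM can verify after erasure anyway.
  def describe(value: Any): String = value match {
    case m: scala.collection.Map[_, _] => s"map with ${m.size} entries"
    case other                         => s"not a map: $other"
  }

  def main(args: Array[String]): Unit = {
    println(describe(Map("a" -> 1))) // map with 1 entries
    println(describe(42))            // not a map: 42
  }
}
```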
[WARNING] <https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/src/main/scala/org/apache/carbondata/spark/rdd/DataLoadPartitionCoalescer.scala>:193:
warning: match may not be exhaustive.
[INFO] It would fail on the following input: None
[INFO]                 hostMapPartitionIds.get(loc) match {
[INFO]                                        ^
[WARNING] <https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/src/main/scala/org/apache/carbondata/spark/rdd/DataLoadPartitionCoalescer.scala>:190:
warning: match may not be exhaustive.
[INFO] It would fail on the following input: None
[INFO]           partitionIdMapHosts.get(partitionId) match {
[INFO]                                  ^
[WARNING] <https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/src/main/scala/org/apache/carbondata/spark/util/CarbonScalaUtil.scala>:72:
warning: match may not be exhaustive.
[INFO] It would fail on the following inputs: ARRAY, FLOAT, MAP, NULL, STRUCT
[INFO]     dataType match {
[INFO]     ^
[WARNING] <https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/src/main/scala/org/apache/carbondata/spark/util/DataTypeConverterUtil.scala>:72:
warning: match may not be exhaustive.
[INFO] It would fail on the following inputs: BOOLEAN, MAP, NULL
[INFO]     dataType match {
[INFO]     ^
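[Editor's note] The "match may not be exhaustive" warnings above are more than style: a match on `Map#get` that only handles `Some(...)` throws scala.MatchError at runtime when the key is absent. A minimal sketch — `hostMap` is an invented stand-in for structures like hostMapPartitionIds, not the actual CarbonData code:

```scala
object ExhaustiveDemo {
  private val hostMap = Map("host-a" -> Seq(0, 1))

  // Handling None explicitly removes both the compiler warning and the
  // MatchError risk the warning is pointing at.
  def partitionsFor(host: String): Seq[Int] = hostMap.get(host) match {
    case Some(ids) => ids
    case None      => Seq.empty
  }

  def main(args: Array[String]): Unit = {
    println(partitionsFor("host-a")) // List(0, 1)
    println(partitionsFor("host-b")) // List()
  }
}
```

`hostMap.getOrElse(host, Seq.empty)` is an equivalent one-liner when the None branch just supplies a default.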
[WARNING] <https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/src/main/scala/org/apache/spark/sql/optimizer/CarbonDecoderOptimizerHelper.scala>:76:
warning: unreachable code
[INFO]       case u: Union =>
[INFO]                     ^
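[Editor's note] The "unreachable code" warning above fires when an earlier pattern already matches every input a later case could handle, making that case dead. A minimal sketch with invented names (not the CarbonDecoderOptimizerHelper logic):

```scala
object UnreachableDemo {
  def label(node: Any): String = node match {
    case s: String => s"leaf: $s"
    case _         => "unknown"
    // A case placed after `case _` -- e.g. `case u: Union => ...` -- would be
    // unreachable, which is the situation scalac flags above. Moving the
    // specific case before the catch-all restores reachability.
  }

  def main(args: Array[String]): Unit =
    println(label("x")) // leaf: x
}
```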
[WARNING] warning: there were 2 deprecation warning(s); re-run with -deprecation for details
[WARNING] warning: there were 3 feature warning(s); re-run with -feature for details
[WARNING] 11 warnings found
[INFO] prepare-compile in 0 s
[INFO] compile in 42 s
[INFO] 
[INFO] --- maven-compiler-plugin:3.2:compile (default-compile) @ carbondata-spark-common ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 24 source files to <https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/target/classes>
[INFO] <https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/src/main/java/org/apache/carbondata/spark/merger/CarbonDataMergerUtil.java>: <https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/src/main/java/org/apache/carbondata/spark/merger/CarbonDataMergerUtil.java> uses or overrides a deprecated API.
[INFO] <https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/src/main/java/org/apache/carbondata/spark/merger/CarbonDataMergerUtil.java>: Recompile with -Xlint:deprecation for details.
[INFO] <https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/src/main/java/org/apache/carbondata/spark/merger/CarbonCompactionExecutor.java>: Some input files use unchecked or unsafe operations.
[INFO] <https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/src/main/java/org/apache/carbondata/spark/merger/CarbonCompactionExecutor.java>: Recompile with -Xlint:unchecked for details.
[INFO] 
[INFO] --- maven-scala-plugin:2.15.2:compile (compile) @ carbondata-spark-common ---
[INFO] Checking for multiple versions of scala
[INFO] includes = [**/*.java,**/*.scala,]
[INFO] excludes = []
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-resources-plugin:2.7:testResources (default-testResources) @ carbondata-spark-common ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory <https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/src/test/resources>
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.2:testCompile (default-testCompile) @ carbondata-spark-common ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.18.1:test (default-test) @ carbondata-spark-common ---
[JENKINS] Recording test results
[INFO] 
[INFO] --- maven-scala-plugin:2.15.2:testCompile (testCompile) @ carbondata-spark-common ---
[INFO] Checking for multiple versions of scala
[INFO] includes = [**/*.java,**/*.scala,]
[INFO] excludes = []
[WARNING] No source files found.
[INFO] 
[INFO] --- maven-jar-plugin:2.5:jar (default-jar) @ carbondata-spark-common ---
[INFO] Building jar: <https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/target/carbondata-spark-common-1.1.0-incubating-SNAPSHOT.jar>
[INFO] 
[INFO] --- maven-site-plugin:3.4:attach-descriptor (attach-descriptor) @ carbondata-spark-common ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.17:check (default) @ carbondata-spark-common ---
[INFO] Starting audit...
Audit done.
[INFO] 
[INFO] --- scalastyle-maven-plugin:0.8.0:check (default) @ carbondata-spark-common ---
Saving to outputFile=<https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/target/scalastyle-output.xml>
Processed 42 file(s)
Found 0 errors
Found 0 warnings
Found 0 infos
Finished in 5067 ms
[INFO] 
[INFO] --- maven-install-plugin:2.5.2:install (default-install) @ carbondata-spark-common ---
[INFO] Installing <https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/target/carbondata-spark-common-1.1.0-incubating-SNAPSHOT.jar> to /home/jenkins/jenkins-slave/maven-repositories/1/org/apache/carbondata/carbondata-spark-common/1.1.0-incubating-SNAPSHOT/carbondata-spark-common-1.1.0-incubating-SNAPSHOT.jar
[INFO] Installing <https://builds.apache.org/job/CarbonData-master/ws/integration/spark-common/pom.xml> to /home/jenkins/jenkins-slave/maven-repositories/1/org/apache/carbondata/carbondata-spark-common/1.1.0-incubating-SNAPSHOT/carbondata-spark-common-1.1.0-incubating-SNAPSHOT.pom
[INFO] 
[INFO] --- maven-deploy-plugin:2.8.2:deploy (default-deploy) @ carbondata-spark-common ---
[INFO] Downloading: https://repository.apache.org/content/repositories/snapshots/org/apache/carbondata/carbondata-spark-common/1.1.0-incubating-SNAPSHOT/maven-metadata.xml
[INFO] Downloaded: https://repository.apache.org/content/repositories/snapshots/org/apache/carbondata/carbondata-spark-common/1.1.0-incubating-SNAPSHOT/maven-metadata.xml (830 B at 9.8 KB/sec)
[INFO] Uploading: https://repository.apache.org/content/repositories/snapshots/org/apache/carbondata/carbondata-spark-common/1.1.0-incubating-SNAPSHOT/carbondata-spark-common-1.1.0-incubating-20170323.092823-32.jar
[INFO] Uploaded: https://repository.apache.org/content/repositories/snapshots/org/apache/carbondata/carbondata-spark-common/1.1.0-incubating-SNAPSHOT/carbondata-spark-common-1.1.0-incubating-20170323.092823-32.jar (1259 KB at 5773.1 KB/sec)
[INFO] Uploading: https://repository.apache.org/content/repositories/snapshots/org/apache/carbondata/carbondata-spark-common/1.1.0-incubating-SNAPSHOT/carbondata-spark-common-1.1.0-incubating-20170323.092823-32.pom
[INFO] Uploaded: https://repository.apache.org/content/repositories/snapshots/org/apache/carbondata/carbondata-spark-common/1.1.0-incubating-SNAPSHOT/carbondata-spark-common-1.1.0-incubating-20170323.092823-32.pom (5 KB at 34.8 KB/sec)
[INFO] Downloading: https://repository.apache.org/content/repositories/snapshots/org/apache/carbondata/carbondata-spark-common/maven-metadata.xml
[INFO] Downloaded: https://repository.apache.org/content/repositories/snapshots/org/apache/carbondata/carbondata-spark-common/maven-metadata.xml (437 B at 4.8 KB/sec)
[INFO] Uploading: https://repository.apache.org/content/repositories/snapshots/org/apache/carbondata/carbondata-spark-common/1.1.0-incubating-SNAPSHOT/maven-metadata.xml
[INFO] Uploaded: https://repository.apache.org/content/repositories/snapshots/org/apache/carbondata/carbondata-spark-common/1.1.0-incubating-SNAPSHOT/maven-metadata.xml (830 B at 10.3 KB/sec)
[INFO] Uploading: https://repository.apache.org/content/repositories/snapshots/org/apache/carbondata/carbondata-spark-common/maven-metadata.xml
[INFO] Uploaded: https://repository.apache.org/content/repositories/snapshots/org/apache/carbondata/carbondata-spark-common/maven-metadata.xml (437 B at 5.7 KB/sec)
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache CarbonData :: Spark 1.1.0-incubating-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ carbondata-spark ---
[INFO] Deleting <https://builds.apache.org/job/CarbonData-master/ws/integration/spark/target>
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ carbondata-spark ---
[INFO] 
[INFO] --- maven-resources-plugin:2.7:resources (default-resources) @ carbondata-spark ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 2 resources
[INFO] Copying 1 resource
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-scala-plugin:2.15.2:compile (default) @ carbondata-spark ---
[INFO] Checking for multiple versions of scala
[INFO] includes = [**/*.java,**/*.scala,]
[INFO] excludes = []
[INFO] <https://builds.apache.org/job/CarbonData-master/ws/integration/spark/src/main/java>:-1:
info: compiling
[INFO] <https://builds.apache.org/job/CarbonData-master/ws/integration/spark/src/main/scala>:-1:
info: compiling
[INFO] Compiling 34 source files to <https://builds.apache.org/job/CarbonData-master/ws/integration/spark/target/classes>
at 1490261311218
[WARNING] <https://builds.apache.org/job/CarbonData-master/ws/integration/spark/src/main/scala/org/apache/spark/sql/CarbonDictionaryDecoder.scala>:98: warning: match may not be exhaustive.
[INFO] It would fail on the following inputs: FLOAT, MAP, NULL
[INFO]     carbonDimension.getDataType match {
[INFO]                     ^
[WARNING] <https://builds.apache.org/job/CarbonData-master/ws/integration/spark/src/main/scala/org/apache/carbondata/spark/rdd/CarbonDataRDDFactory.scala>:83: warning: org.apache.carbondata.core.statusmanager.SegmentUpdateStatusManager and None.type are unrelated: they will most likely always compare unequal
[INFO]       if (alterTableModel.segmentUpdateStatusManager.get != None) {
[INFO]                                                          ^
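[Editor's note] The "unrelated: they will most likely always compare unequal" warning above is worth heeding: if segmentUpdateStatusManager is an Option, then `.get` unwraps it, so comparing the unwrapped value to None is always true — and `.get` itself throws on an empty Option before the comparison ever runs. A minimal sketch; StatusManager is an invented stand-in for SegmentUpdateStatusManager:

```scala
object OptionCheckDemo {
  final class StatusManager

  // The intended check: is a manager present? isDefined asks exactly that,
  // without unwrapping (and without the always-true `.get != None` pattern).
  def hasManager(manager: Option[StatusManager]): Boolean =
    manager.isDefined

  def main(args: Array[String]): Unit = {
    println(hasManager(Some(new StatusManager))) // true
    println(hasManager(None))                    // false
  }
}
```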
[WARNING] warning: there were 7 deprecation warning(s); re-run with -deprecation for details
[WARNING] warning: there were 1 feature warning(s); re-run with -feature for details
[WARNING] four warnings found
[INFO] prepare-compile in 0 s
[INFO] compile in 56 s
[INFO] 
[INFO] --- maven-compiler-plugin:3.2:compile (default-compile) @ carbondata-spark ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 1 source file to <https://builds.apache.org/job/CarbonData-master/ws/integration/spark/target/classes>
[INFO] 
[INFO] --- maven-scala-plugin:2.15.2:compile (compile) @ carbondata-spark ---
[INFO] Checking for multiple versions of scala
[INFO] includes = [**/*.java,**/*.scala,]
[INFO] excludes = []
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-resources-plugin:2.7:testResources (default-testResources) @ carbondata-spark ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.2:testCompile (default-testCompile) @ carbondata-spark ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-surefire-plugin:2.18:test (default-test) @ carbondata-spark ---
[JENKINS] Recording test results
[INFO] 
[INFO] --- maven-scala-plugin:2.15.2:testCompile (testCompile) @ carbondata-spark ---
[INFO] Checking for multiple versions of scala
[INFO] includes = [**/*.java,**/*.scala,]
[INFO] excludes = []
[INFO] <https://builds.apache.org/job/CarbonData-master/ws/integration/spark/src/test/scala>:-1: info: compiling
[INFO] Compiling 29 source files to <https://builds.apache.org/job/CarbonData-master/ws/integration/spark/target/test-classes> at 1490261371042
[INFO] prepare-compile in 0 s
[INFO] compile in 43 s
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ carbondata-spark ---
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
Discovery starting.
INFO  23-03 09:30:18,622 - ScalaTest-main project path: <https://builds.apache.org/job/CarbonData-master/ws/>
INFO  23-03 09:30:18,759 - ScalaTest-main Property file path: <https://builds.apache.org/job/CarbonData-master/ws/integration/spark/../../../conf/carbon.properties>
INFO  23-03 09:30:18,770 - ScalaTest-main ------Using Carbon.properties --------
INFO  23-03 09:30:18,770 - ScalaTest-main {}
INFO  23-03 09:30:18,770 - ScalaTest-main Carbon Current data file version: V3
INFO  23-03 09:30:18,770 - ScalaTest-main Executor start up wait time: 5
INFO  23-03 09:30:18,771 - ScalaTest-main Blocklet Size Configured value is "64
INFO  23-03 09:30:18,771 - ScalaTest-main use TestQueryExecutorImplV1
INFO  23-03 09:30:18,931 - Running Spark version 1.6.2
WARN  23-03 09:30:19,519 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
INFO  23-03 09:30:19,784 - Changing view acls to: jenkins
INFO  23-03 09:30:19,788 - Changing modify acls to: jenkins
INFO  23-03 09:30:19,789 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jenkins); users with modify permissions: Set(jenkins)
WARN  23-03 09:30:21,890 - Failed to find a usable hardware address from the network interfaces; using random bytes: 45:0d:8f:06:28:81:61:c9
INFO  23-03 09:30:21,983 - Successfully started service 'sparkDriver' on port 44221.
INFO  23-03 09:30:23,554 - Slf4jLogger started
INFO  23-03 09:30:23,793 - Starting remoting
INFO  23-03 09:30:24,593 - Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@10.10.3.184:33033]
INFO  23-03 09:30:24,609 - Successfully started service 'sparkDriverActorSystem' on port 33033.
INFO  23-03 09:30:24,647 - Registering MapOutputTracker
INFO  23-03 09:30:24,689 - Registering BlockManagerMaster
INFO  23-03 09:30:24,770 - Created local directory at /tmp/blockmgr-ae86d176-db00-4fe0-9e2f-479c6a259d45
INFO  23-03 09:30:24,800 - MemoryStore started with capacity 1823.3 MB
INFO  23-03 09:30:24,977 - Registering OutputCommitCoordinator
INFO  23-03 09:30:25,722 - jetty-8.y.z-SNAPSHOT
INFO  23-03 09:30:25,959 - Started SelectChannelConnector@0.0.0.0:4040
INFO  23-03 09:30:25,960 - Successfully started service 'SparkUI' on port 4040.
INFO  23-03 09:30:25,967 - Started SparkUI at http://10.10.3.184:4040
INFO  23-03 09:30:26,272 - Starting executor ID driver on host localhost
INFO  23-03 09:30:26,360 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 46890.
INFO  23-03 09:30:26,361 - Server created on 46890
INFO  23-03 09:30:26,362 - Trying to register BlockManager
INFO  23-03 09:30:26,374 - Registering block manager localhost:46890 with 1823.3 MB RAM, BlockManagerId(driver, localhost, 46890)
INFO  23-03 09:30:26,381 - Registered BlockManager
Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0x000000079ca80000, 94896128, 0) failed; error='Cannot allocate memory' (errno=12)
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Native memory allocation (mmap) failed to map 94896128 bytes for committing reserved memory.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/CarbonData-master/ws/integration/spark/hs_err_pid27276.log>
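[Editor's note] The crash above is the operating system refusing a native mmap of ~90 MB, not a Java-heap OutOfMemoryError: the build agent simply ran out of memory, which then severed the Jenkins channel below. The usual mitigation is to cap the forked JVM so the host keeps native headroom. A hedged sketch — the flag values are illustrative, not taken from this job's actual configuration; note also that MaxPermSize was removed in Java 8 (hence the earlier VM warning), with Metaspace sizing as its replacement:

```shell
# Illustrative values only -- tune to the agent's actual RAM.
# Cap the heap and Metaspace so native allocations still have room.
export MAVEN_OPTS="-Xmx2g -XX:MaxMetaspaceSize=512m"
echo "$MAVEN_OPTS"
```

For tests forked by surefire/scalatest, the equivalent cap goes in the plugin's argLine rather than MAVEN_OPTS.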
ERROR: Failed to parse POMs
java.io.IOException: Backing channel 'ubuntu-us1' is disconnected.
	at hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:191)
	at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:256)
	at com.sun.proxy.$Proxy117.isAlive(Unknown Source)
	at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1043)
	at hudson.maven.ProcessCache$MavenProcess.call(ProcessCache.java:166)
	at hudson.maven.MavenModuleSetBuild$MavenModuleSetBuildExecution.doRun(MavenModuleSetBuild.java:873)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:534)
	at hudson.model.Run.execute(Run.java:1728)
	at hudson.maven.MavenModuleSetBuild.run(MavenModuleSetBuild.java:544)
	at hudson.model.ResourceController.execute(ResourceController.java:98)
	at hudson.model.Executor.run(Executor.java:404)
Caused by: hudson.remoting.Channel$OrderlyShutdown
	at hudson.remoting.Channel$CloseCommand.execute(Channel.java:1121)
	at hudson.remoting.Channel$1.handle(Channel.java:526)
	at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:83)
Caused by: Command close created at
	at hudson.remoting.Command.<init>(Command.java:59)
	at hudson.remoting.Channel$CloseCommand.<init>(Channel.java:1115)
	at hudson.remoting.Channel$CloseCommand.<init>(Channel.java:1113)
	at hudson.remoting.Channel.close(Channel.java:1273)
	at hudson.remoting.Channel.close(Channel.java:1255)
	at hudson.remoting.Channel$CloseCommand.execute(Channel.java:1120)
	... 2 more
