carbondata-commits mailing list archives

From Apache Jenkins Server <jenk...@builds.apache.org>
Subject Build failed in Jenkins: carbondata-master-spark-2.1 #2559
Date Sun, 01 Jul 2018 09:06:04 GMT
See <https://builds.apache.org/job/carbondata-master-spark-2.1/2559/display/redirect>

------------------------------------------
[...truncated 46.97 MB...]
+------------+-----------+
|Society_name|building_no|
+------------+-----------+
|TTTT        |5          |
+------------+-----------+

18/07/01 02:04:45 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleting table [sdkoutputtable] under database [default]
18/07/01 02:04:45 ERROR DataMapStoreManager: ScalaTest-main-running-TestNonTransactionalCarbonTableWithComplexType failed to get carbon table from table Path
18/07/01 02:04:45 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleted table [sdkoutputtable] under database [default]
- test ComplexDataType projection for struct of struct -6 levels
TestComplexTypeWithBigArray:
18/07/01 02:04:45 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Creating Table with Database name [default] and Table name [big_array]
18/07/01 02:04:45 WARN HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.CarbonSource. Persisting data source table `default`.`big_array` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
18/07/01 02:04:45 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Table created with Database name [default] and Table name [big_array]
18/07/01 02:04:45 AUDIT CarbonDataRDDFactory$: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Data load request has been received for table default.big_array
18/07/01 02:04:46 AUDIT CarbonDataRDDFactory$: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Data load is successful for default.big_array
18/07/01 02:04:48 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleting table [big_array] under database [default]
18/07/01 02:04:48 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleted table [big_array] under database [default]
- test with big string array
18/07/01 02:04:48 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Creating Table with Database name [default] and Table name [big_array]
18/07/01 02:04:48 WARN HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.CarbonSource. Persisting data source table `default`.`big_array` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
18/07/01 02:04:48 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Table created with Database name [default] and Table name [big_array]
18/07/01 02:04:48 AUDIT CarbonDataRDDFactory$: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Data load request has been received for table default.big_array
18/07/01 02:04:48 AUDIT CarbonDataRDDFactory$: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Data load is successful for default.big_array
18/07/01 02:04:50 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleting table [big_array] under database [default]
18/07/01 02:04:50 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleted table [big_array] under database [default]
- test with big int array
18/07/01 02:04:50 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Creating Table with Database name [default] and Table name [main]
TestIsNullFilter:
18/07/01 02:04:50 WARN HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.CarbonSource. Persisting data source table `default`.`main` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
18/07/01 02:04:50 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Table created with Database name [default] and Table name [main]
18/07/01 02:04:50 AUDIT CarbonDataRDDFactory$: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Data load request has been received for table default.main
18/07/01 02:04:51 WARN CarbonDataProcessorUtil: WriterForwardPool: main dir already exists, skip dir creation: /tmp/carbon20073134078792540_0/Fact/Part0/Segment_0/0
18/07/01 02:04:51 ERROR DataLoadExecutor: [Executor task launch worker-10][partitionID:main;queryID:20073134065189328] Data Load is partially success for table main
18/07/01 02:04:51 AUDIT CarbonDataRDDFactory$: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Data load is successful for default.main
- select * from main where time is null
18/07/01 02:04:51 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleting table [main] under database [default]
18/07/01 02:04:51 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleted table [main] under database [default]
TestDataMapStatus:
18/07/01 02:04:51 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Creating Table with Database name [default] and Table name [datamapstatustest]
18/07/01 02:04:51 WARN HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.CarbonSource. Persisting data source table `default`.`datamapstatustest` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
18/07/01 02:04:51 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Table created with Database name [default] and Table name [datamapstatustest]
18/07/01 02:04:51 AUDIT CarbonCreateDataMapCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]DataMap statusdatamap successfully added
18/07/01 02:04:51 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleting table [datamapstatustest] under database [default]
18/07/01 02:04:51 WARN CarbonDropDataMapCommand: ScalaTest-main-running-TestDataMapStatus Child table datamapstatustest_statusdatamap not found in metastore
18/07/01 02:04:51 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleted table [datamapstatustest] under database [default]
- datamap status enable for new datamap
18/07/01 02:04:51 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Creating Table with Database name [default] and Table name [datamapstatustest]
18/07/01 02:04:51 WARN HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.CarbonSource. Persisting data source table `default`.`datamapstatustest` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
18/07/01 02:04:51 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Table created with Database name [default] and Table name [datamapstatustest]
18/07/01 02:04:51 AUDIT CarbonCreateDataMapCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]DataMap statusdatamap successfully added
18/07/01 02:04:51 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleting table [datamapstatustest] under database [default]
18/07/01 02:04:51 WARN CarbonDropDataMapCommand: ScalaTest-main-running-TestDataMapStatus Child table datamapstatustest_statusdatamap not found in metastore
18/07/01 02:04:51 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleted table [datamapstatustest] under database [default]
- datamap status disable for new datamap with deferred rebuild
18/07/01 02:04:51 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Creating Table with Database name [default] and Table name [datamapstatustest1]
18/07/01 02:04:51 WARN HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.CarbonSource. Persisting data source table `default`.`datamapstatustest1` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
18/07/01 02:04:51 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Table created with Database name [default] and Table name [datamapstatustest1]
18/07/01 02:04:51 AUDIT CarbonCreateDataMapCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]DataMap statusdatamap1 successfully added
18/07/01 02:04:51 AUDIT CarbonDataRDDFactory$: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Data load request has been received for table default.datamapstatustest1
18/07/01 02:04:51 WARN CarbonDataProcessorUtil: WriterForwardPool: datamapstatustest1 dir already exists, skip dir creation: /tmp/carbon20073135055501571_0/Fact/Part0/Segment_0/0
18/07/01 02:04:52 AUDIT CarbonDataRDDFactory$: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Data load is successful for default.datamapstatustest1
18/07/01 02:04:52 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleting table [datamapstatustest1] under database [default]
18/07/01 02:04:52 WARN CarbonDropDataMapCommand: ScalaTest-main-running-TestDataMapStatus Child table datamapstatustest1_statusdatamap1 not found in metastore
18/07/01 02:04:52 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleted table [datamapstatustest1] under database [default]
- datamap status disable after new load with deferred rebuild
18/07/01 02:04:52 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Creating Table with Database name [default] and Table name [datamapstatustest2]
18/07/01 02:04:52 WARN HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.CarbonSource. Persisting data source table `default`.`datamapstatustest2` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
18/07/01 02:04:52 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Table created with Database name [default] and Table name [datamapstatustest2]
18/07/01 02:04:52 AUDIT CarbonCreateDataMapCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]DataMap statusdatamap2 successfully added
18/07/01 02:04:52 AUDIT CarbonDataRDDFactory$: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Data load request has been received for table default.datamapstatustest2
18/07/01 02:04:52 WARN CarbonDataProcessorUtil: WriterForwardPool: datamapstatustest2 dir already exists, skip dir creation: /tmp/carbon20073135478353484_0/Fact/Part0/Segment_0/0
18/07/01 02:04:52 AUDIT CarbonDataRDDFactory$: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Data load is successful for default.datamapstatustest2
18/07/01 02:04:52 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleting table [datamapstatustest2] under database [default]
18/07/01 02:04:52 WARN CarbonDropDataMapCommand: ScalaTest-main-running-TestDataMapStatus Child table datamapstatustest2_statusdatamap2 not found in metastore
18/07/01 02:04:52 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleted table [datamapstatustest2] under database [default]
- datamap status with REBUILD DATAMAP
18/07/01 02:04:52 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Creating Table with Database name [default] and Table name [datamapstatustest3]
18/07/01 02:04:52 WARN HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.CarbonSource. Persisting data source table `default`.`datamapstatustest3` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
18/07/01 02:04:52 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Table created with Database name [default] and Table name [datamapstatustest3]
18/07/01 02:04:52 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleting table [datamapstatustest3] under database [default]
18/07/01 02:04:52 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleted table [datamapstatustest3] under database [default]
- datamap create without on table test
18/07/01 02:04:52 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Creating Table with Database name [default] and Table name [datamapstatustest3]
18/07/01 02:04:52 WARN HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.CarbonSource. Persisting data source table `default`.`datamapstatustest3` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
18/07/01 02:04:52 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Table created with Database name [default] and Table name [datamapstatustest3]
18/07/01 02:04:52 AUDIT CarbonCreateDataMapCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]DataMap statusdatamap3 successfully added
18/07/01 02:04:53 AUDIT CarbonDataRDDFactory$: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Data load request has been received for table default.datamapstatustest3
18/07/01 02:04:53 WARN CarbonDataProcessorUtil: WriterForwardPool: datamapstatustest3 dir already exists, skip dir creation: /tmp/carbon20073136195799726_0/Fact/Part0/Segment_0/0
18/07/01 02:04:53 AUDIT CarbonDataRDDFactory$: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Data load is successful for default.datamapstatustest3
18/07/01 02:04:53 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleting table [datamapstatustest3] under database [default]
18/07/01 02:04:53 WARN CarbonDropDataMapCommand: ScalaTest-main-running-TestDataMapStatus Child table datamapstatustest3_statusdatamap3 not found in metastore
18/07/01 02:04:53 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleted table [datamapstatustest3] under database [default]
- rebuild datamap status
18/07/01 02:04:53 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Creating Table with Database name [default] and Table name [carbontable]
TestAvgForBigInt:
18/07/01 02:04:53 WARN HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.CarbonSource. Persisting data source table `default`.`carbontable` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
18/07/01 02:04:53 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Table created with Database name [default] and Table name [carbontable]
18/07/01 02:04:53 AUDIT CarbonDataRDDFactory$: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Data load request has been received for table default.carbontable
18/07/01 02:04:53 WARN CarbonDataProcessorUtil: WriterForwardPool: carbontable dir already exists, skip dir creation: /tmp/carbon20073136841620908_0/Fact/Part0/Segment_0/0
18/07/01 02:04:53 ERROR DataLoadExecutor: [Executor task launch worker-10][partitionID:carbontable;queryID:20073136829668309] Data Load is partially success for table carbontable
18/07/01 02:04:53 AUDIT CarbonDataRDDFactory$: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Data load is successful for default.carbontable
- test avg function on big int column
18/07/01 02:04:54 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleting table [carbontable] under database [default]
18/07/01 02:04:54 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleted table [carbontable] under database [default]
BlockPruneQueryTestCase:
18/07/01 02:04:54 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Creating Table with Database name [default] and Table name [blockprune]
18/07/01 02:04:54 WARN HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.CarbonSource. Persisting data source table `default`.`blockprune` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
18/07/01 02:04:55 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Table created with Database name [default] and Table name [blockprune]
18/07/01 02:04:55 AUDIT CarbonDataRDDFactory$: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Data load request has been received for table default.blockprune
18/07/01 02:04:56 WARN CarbonDataProcessorUtil: WriterForwardPool: blockprune dir already exists, skip dir creation: /tmp/carbon20073138421253711_0/Fact/Part0/Segment_0/0
18/07/01 02:04:57 AUDIT CarbonDataRDDFactory$: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Data load is successful for default.blockprune
- test block prune query
18/07/01 02:05:02 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleting table [blockprune] under database [default]
18/07/01 02:05:02 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleted table [blockprune] under database [default]
FGDataMapTestCase:
18/07/01 02:05:02 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Creating Table with Database name [default] and Table name [normal_test]
18/07/01 02:05:02 WARN HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.CarbonSource. Persisting data source table `default`.`normal_test` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
18/07/01 02:05:02 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Table created with Database name [default] and Table name [normal_test]
18/07/01 02:05:02 AUDIT CarbonDataRDDFactory$: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Data load request has been received for table default.normal_test
18/07/01 02:05:05 WARN CarbonDataProcessorUtil: WriterForwardPool: normal_test dir already exists, skip dir creation: /tmp/carbon20073145914187267_0/Fact/Part0/Segment_0/0
18/07/01 02:05:06 AUDIT CarbonDataRDDFactory$: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Data load is successful for default.normal_test
18/07/01 02:05:06 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Creating Table with Database name [default] and Table name [datamap_test]
18/07/01 02:05:06 WARN HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.CarbonSource. Persisting data source table `default`.`datamap_test` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
18/07/01 02:05:06 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Table created with Database name [default] and Table name [datamap_test]
18/07/01 02:05:06 AUDIT CarbonCreateDataMapCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]DataMap ggdatamap successfully added
18/07/01 02:05:06 AUDIT CarbonDataRDDFactory$: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Data load request has been received for table default.datamap_test
18/07/01 02:05:08 WARN CarbonDataProcessorUtil: WriterForwardPool: datamap_test dir already exists, skip dir creation: /tmp/carbon20073149700873058_0/Fact/Part0/Segment_0/0
18/07/01 02:05:18 AUDIT CarbonDataRDDFactory$: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Data load is successful for default.datamap_test
- test fg datamap
18/07/01 02:05:20 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleting table [datamap_test] under database [default]
18/07/01 02:05:21 WARN CarbonDropDataMapCommand: ScalaTest-main-running-FGDataMapTestCase Child table datamap_test_ggdatamap not found in metastore
18/07/01 02:05:21 AUDIT CarbonDropTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Deleted table [datamap_test] under database [default]
18/07/01 02:05:21 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Creating Table with Database name [default] and Table name [datamap_test]
18/07/01 02:05:21 WARN HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.CarbonSource. Persisting data source table `default`.`datamap_test` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
18/07/01 02:05:21 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Table created with Database name [default] and Table name [datamap_test]
18/07/01 02:05:21 AUDIT CarbonCreateDataMapCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]DataMap ggdatamap1 successfully added
18/07/01 02:05:21 AUDIT CarbonCreateDataMapCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]DataMap ggdatamap2 successfully added
18/07/01 02:05:21 AUDIT CarbonDataRDDFactory$: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Data load request has been received for table default.datamap_test
18/07/01 02:05:23 WARN CarbonDataProcessorUtil: WriterForwardPool: datamap_test dir already exists, skip dir creation: /tmp/carbon20073164453534915_0/Fact/Part0/Segment_0/0
18/07/01 02:05:36 AUDIT CarbonDataRDDFactory$: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Data load is successful for default.datamap_test
- test fg datamap with 2 datamaps
18/07/01 02:05:39 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Creating Table with Database name [default] and Table name [datamap_testfg]
18/07/01 02:05:39 WARN HiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.CarbonSource. Persisting data source table `default`.`datamap_testfg` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
18/07/01 02:05:39 AUDIT CarbonCreateTableCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Table created with Database name [default] and Table name [datamap_testfg]
18/07/01 02:05:39 AUDIT CarbonCreateDataMapCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]DataMap datamap1 successfully added
18/07/01 02:05:39 AUDIT CarbonCreateDataMapCommand: [asf931.gq1.ygridcore.net][jenkins][Thread-1]DataMap datamap2 successfully added
18/07/01 02:05:39 AUDIT CarbonDataRDDFactory$: [asf931.gq1.ygridcore.net][jenkins][Thread-1]Data load request has been received for table default.datamap_testfg
18/07/01 02:05:42 WARN CarbonDataProcessorUtil: WriterForwardPool: datamap_testfg dir already exists, skip dir creation: /tmp/carbon20073183059123145_0/Fact/Part0/Segment_0/0
ERROR: H31 is offline; cannot locate JDK 1.8 (latest)
ERROR: H31 is offline; cannot locate Maven 3.3.9
ERROR: H31 is offline; cannot locate JDK 1.8 (latest)
ERROR: H31 is offline; cannot locate Maven 3.3.9
ERROR: H31 is offline; cannot locate JDK 1.8 (latest)
ERROR: H31 is offline; cannot locate Maven 3.3.9
ERROR: H31 is offline; cannot locate JDK 1.8 (latest)
ERROR: H31 is offline; cannot locate Maven 3.3.9
ERROR: H31 is offline; cannot locate JDK 1.8 (latest)
ERROR: H31 is offline; cannot locate Maven 3.3.9
ERROR: H31 is offline; cannot locate JDK 1.8 (latest)
ERROR: H31 is offline; cannot locate Maven 3.3.9
ERROR: H31 is offline; cannot locate JDK 1.8 (latest)
ERROR: H31 is offline; cannot locate Maven 3.3.9
ERROR: H31 is offline; cannot locate JDK 1.8 (latest)
ERROR: H31 is offline; cannot locate Maven 3.3.9
ERROR: H31 is offline; cannot locate JDK 1.8 (latest)
ERROR: H31 is offline; cannot locate Maven 3.3.9
ERROR: H31 is offline; cannot locate JDK 1.8 (latest)
ERROR: H31 is offline; cannot locate Maven 3.3.9
ERROR: H31 is offline; cannot locate JDK 1.8 (latest)
ERROR: H31 is offline; cannot locate Maven 3.3.9
ERROR: H31 is offline; cannot locate JDK 1.8 (latest)
ERROR: H31 is offline; cannot locate Maven 3.3.9
ERROR: H31 is offline; cannot locate JDK 1.8 (latest)
ERROR: H31 is offline; cannot locate Maven 3.3.9
ERROR: H31 is offline; cannot locate JDK 1.8 (latest)
ERROR: H31 is offline; cannot locate Maven 3.3.9
ERROR: H31 is offline; cannot locate JDK 1.8 (latest)
ERROR: H31 is offline; cannot locate Maven 3.3.9
ERROR: H31 is offline; cannot locate JDK 1.8 (latest)
ERROR: H31 is offline; cannot locate Maven 3.3.9
ERROR: H31 is offline; cannot locate JDK 1.8 (latest)
ERROR: H31 is offline; cannot locate Maven 3.3.9
ERROR: H31 is offline; cannot locate JDK 1.8 (latest)
ERROR: H31 is offline; cannot locate Maven 3.3.9
Sending e-mails to: commits@carbondata.apache.org
ERROR: Failed to parse POMs
hudson.remoting.ChannelClosedException: channel is already closed
	at hudson.remoting.Channel.send(Channel.java:671)
	at hudson.remoting.ProxyOutputStream.write(ProxyOutputStream.java:144)
	at hudson.remoting.RemoteOutputStream.write(RemoteOutputStream.java:108)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at hudson.remoting.ChunkedOutputStream.sendBreak(ChunkedOutputStream.java:63)
	at hudson.remoting.ChunkedCommandTransport.writeBlock(ChunkedCommandTransport.java:46)
	at hudson.remoting.AbstractSynchronousByteArrayCommandTransport.write(AbstractSynchronousByteArrayCommandTransport.java:45)
	at hudson.remoting.Channel.send(Channel.java:675)
	at hudson.remoting.Request.call(Request.java:203)
	at hudson.remoting.Channel.call(Channel.java:907)
	at hudson.maven.ProcessCache$MavenProcess.call(ProcessCache.java:161)
	at hudson.maven.MavenModuleSetBuild$MavenModuleSetBuildExecution.doRun(MavenModuleSetBuild.java:879)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:504)
	at hudson.model.Run.execute(Run.java:1724)
	at hudson.maven.MavenModuleSetBuild.run(MavenModuleSetBuild.java:543)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:429)
Caused by: java.io.IOException
	at hudson.remoting.Channel.close(Channel.java:1402)
	at hudson.remoting.Channel.close(Channel.java:1358)
	at hudson.slaves.SlaveComputer.closeChannel(SlaveComputer.java:737)
	at hudson.slaves.SlaveComputer.kill(SlaveComputer.java:704)
	at hudson.model.AbstractCIBase.killComputer(AbstractCIBase.java:88)
	at jenkins.model.Jenkins.access$2000(Jenkins.java:304)
	at jenkins.model.Jenkins$20.run(Jenkins.java:3367)
	at hudson.model.Queue._withLock(Queue.java:1373)
	at hudson.model.Queue.withLock(Queue.java:1250)
	at jenkins.model.Jenkins._cleanUpDisconnectComputers(Jenkins.java:3361)
	at jenkins.model.Jenkins.cleanUp(Jenkins.java:3237)
	at hudson.WebAppMain.contextDestroyed(WebAppMain.java:379)
	at org.apache.catalina.core.StandardContext.listenerStop(StandardContext.java:4690)
	at org.apache.catalina.core.StandardContext.stopInternal(StandardContext.java:5327)
	at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
	at org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1441)
	at org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1430)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
	at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:134)
	at org.apache.catalina.core.ContainerBase.stopInternal(ContainerBase.java:997)
	at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
	at org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1441)
	at org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1430)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
	at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:134)
	at org.apache.catalina.core.ContainerBase.stopInternal(ContainerBase.java:997)
	at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
	at org.apache.catalina.core.StandardService.stopInternal(StandardService.java:471)
	at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
	at org.apache.catalina.core.StandardServer.stopInternal(StandardServer.java:791)
	at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
	at org.apache.catalina.startup.Catalina.stop(Catalina.java:744)
	at org.apache.catalina.startup.Catalina$CatalinaShutdownHook.run(Catalina.java:845)
ERROR: H31 is offline; cannot locate JDK 1.8 (latest)
ERROR: H31 is offline; cannot locate Maven 3.3.9
ERROR: H31 is offline; cannot locate JDK 1.8 (latest)
ERROR: H31 is offline; cannot locate Maven 3.3.9
ERROR: H31 is offline; cannot locate JDK 1.8 (latest)
ERROR: H31 is offline; cannot locate Maven 3.3.9
Not sending mail to unregistered user jacky.likun@qq.com
