kylin-issues mailing list archives

From "JerryShao (JIRA)" <>
Subject [jira] [Commented] (KYLIN-953) when running the cube job at "Convert Cuboid Data to HFile" step, an error is thrown
Date Mon, 24 Aug 2015 09:44:45 GMT


JerryShao commented on KYLIN-953:

@ZhouQianhao I have resolved the problem by adding Hadoop's conf file core-site.xml to webapps/kylin/WEB-INF/classes, and
adding the "hbase.fs.tmp.dir" property, because the problem is caused by checking the value of this
property. The source code looks like this:

static void configurePartitioner(Job job, List<ImmutableBytesWritable> splitPoints)
      throws IOException {
    Configuration conf = job.getConfiguration();
    // create the partitions file
    FileSystem fs = FileSystem.get(conf);
    // if "hbase.fs.tmp.dir" is unset, conf.get(...) returns null and
    // new Path(null, ...) throws IllegalArgumentException
    Path partitionsPath = new Path(conf.get("hbase.fs.tmp.dir"), "partitions_" + UUID.randomUUID());
    writePartitions(conf, partitionsPath, splitPoints);

    // configure job to use it
    TotalOrderPartitioner.setPartitionFile(conf, partitionsPath);
    // ...
}
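To illustrate why the step fails when the property is missing: conf.get("hbase.fs.tmp.dir") returns null, and Path's constructor rejects a null string. Below is a minimal, Hadoop-free sketch of that check; checkPathArg here is a simplified stand-in for the private check inside org.apache.hadoop.fs.Path, not the actual Hadoop code:

```java
// Sketch of the null check that produces the stack trace in this issue.
public class NullPathSketch {

    // Simplified stand-in for org.apache.hadoop.fs.Path.checkPathArg
    static String checkPathArg(String path) {
        if (path == null) {
            throw new IllegalArgumentException("Can not create a Path from a null string");
        }
        return path;
    }

    public static void main(String[] args) {
        // simulates conf.get("hbase.fs.tmp.dir") returning null
        try {
            checkPathArg(null);
        } catch (IllegalArgumentException expected) {
            System.out.println("caught: " + expected.getMessage());
        }
    }
}
```

Once the property is set, conf.get(...) returns a non-null string and the partitions file is created normally.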

At the end, there are two suggestions for Kylin's source:
  1. The web UI project should load the Hadoop, Hive, and HBase configuration automatically; right now
the files need to be copied to Tomcat's classpath.
  2. The "hbase.fs.tmp.dir" property should be seen as an HBase conf; it would be better if it could be added to hbase-site.xml,
but currently Kylin's relevant code doesn't propagate HBase's properties to the CubeHFileJob's configuration.

Above is my workaround and suggestions; they may not be correct. If you have a better way, please let
me know, thanks!
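A sketch of the added property (the directory value here is an assumption, not from the original report; any HDFS path writable by the job user works):

```xml
<!-- Added to webapps/kylin/WEB-INF/classes/core-site.xml -->
<property>
  <name>hbase.fs.tmp.dir</name>
  <!-- example value; matches HBase's usual staging-directory convention -->
  <value>/user/${user.name}/hbase-staging</value>
</property>
```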

> when running the cube job at "Convert Cuboid Data to HFile" step, an error is thrown
> ------------------------------------------------------------------------------------
>                 Key: KYLIN-953
>                 URL:
>             Project: Kylin
>          Issue Type: Bug
>          Components: Job Engine
>    Affects Versions: v0.7.2
>            Reporter: JerryShao
>            Assignee: ZhouQianhao
> when the cube job runs at the "Convert Cuboid Data to HFile" step, it throws an error like below:
> [pool-5-thread-8]:[2015-08-18 09:43:15,854][ERROR][] - error in CubeHFileJob
> java.lang.IllegalArgumentException: Can not create a Path from a null string
>         at org.apache.hadoop.fs.Path.checkPathArg(
>         at org.apache.hadoop.fs.Path.<init>(
>         at org.apache.hadoop.fs.Path.<init>(
>         at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configurePartitioner(
>         at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configureIncrementalLoad(
>         at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat.configureIncrementalLoad(
>         at
>         at
>         at
>         at org.apache.kylin.job.common.MapReduceExecutable.doWork(
>         at org.apache.kylin.job.execution.AbstractExecutable.execute(
>         at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(
>         at org.apache.kylin.job.execution.AbstractExecutable.execute(
>         at org.apache.kylin.job.impl.threadpool.DefaultScheduler$
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(
>         at java.util.concurrent.ThreadPoolExecutor$
>         at

This message was sent by Atlassian JIRA
