hbase-user mailing list archives

From AnilKumar B <akumarb2...@gmail.com>
Subject Re: Facing problem while using MultiTableOutputFormat
Date Tue, 07 Jan 2014 04:05:06 GMT
Hi Ted,

System.out.println(" Running with on tables "+args[1]+"  and "+args[2]+" with zk "+args[3]);

What was the output from the above ?
>> Running with on tables ci_history  and ci_lookup  with zk 10.9.208.71

Both tables exist in HBase:
hbase(main):001:0> list
TABLE
ci_history
ci_lookup
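
A likely culprit (my assumption, based on the exception text rather than anything confirmed in this thread): if the job is launched with a generic option such as `-D mapred.reduce.tasks=100` before the application arguments, and main() reads args *before* ToolRunner/GenericOptionsParser strips those generic options, then args[1] inside main() is the literal string `mapred.reduce.tasks=100` — which then gets stored as a table name, while run() (which receives the stripped args from ToolRunner) still prints the correct table names. A plain-Java simulation of that shift (it only models the two-token `-D key=value` form, not the real GenericOptionsParser):

```java
import java.util.ArrayList;
import java.util.List;

public class ArgShiftDemo {
    // Mimics the relevant part of what GenericOptionsParser does: drop
    // "-D key=value" pairs and return only the application arguments.
    static String[] stripGenericOptions(String[] args) {
        List<String> remaining = new ArrayList<>();
        for (int i = 0; i < args.length; i++) {
            if (args[i].equals("-D") && i + 1 < args.length) {
                i++; // skip the key=value token that follows -D
            } else {
                remaining.add(args[i]);
            }
        }
        return remaining.toArray(new String[0]);
    }

    public static void main(String[] ignored) {
        // Hypothetical command line: -D mapred.reduce.tasks=100 <input> <t1> <t2> <zk>
        String[] args = {"-D", "mapred.reduce.tasks=100",
                         "/input", "ci_history", "ci_lookup", "10.9.208.71"};

        // Reading args[1] BEFORE stripping generic options (as in the main()
        // below) picks up the -D value instead of the table name:
        System.out.println("before strip, args[1] = " + args[1]);
        // → before strip, args[1] = mapred.reduce.tasks=100

        // After stripping, the indices line up with what run() expects:
        String[] stripped = stripGenericOptions(args);
        System.out.println("after strip, args[1] = " + stripped[1]);
        // → after strip, args[1] = ci_history
    }
}
```

If that is what is happening, reading the table names inside run() (after ToolRunner has parsed the command line) instead of in main() would avoid the mismatch.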




Thanks & Regards,
B Anil Kumar.


On Tue, Jan 7, 2014 at 9:07 AM, Ted Yu <yuzhihong@gmail.com> wrote:

> System.out.println(" Running with on tables "+args[1]+"  and "+args[2]+" with zk "+args[3]);
>
> What was the output from the above ?
>
> I would expect a call similar to the following in your run() method - this
> comes from TestTableMapReduce.java:
>
>       TableMapReduceUtil.initTableReducerJob(
>         Bytes.toString(table.getTableName()),
>         IdentityTableReducer.class, job);
>
>
> On Mon, Jan 6, 2014 at 7:12 PM, AnilKumar B <akumarb2010@gmail.com> wrote:
>
> > Hi,
> >
> > In my MR job I need to write output to multiple tables, so I am
> > using MultiTableOutputFormat as below, but I am getting
> > TableNotFoundException.
> >
> > I am attaching a code snippet below. Is this the correct way to use
> > MultiTableOutputFormat?
> >
> >
> >    Job class:
> >
> >    public int run(String[] args) throws Exception {
> >        System.out.println(" Running with on tables "+args[1]+"  and "+args[2]+" with zk "+args[3]);
> >        Configuration hbaseConf = HBaseConfiguration.create(getConf());
> >        // hbaseConf.set(Constants.HBASE_ZOOKEEPER_QUORUM_PROP, Constants.HBASE_OS_CL1_QUORUM);
> >        hbaseConf.set(Constants.HBASE_ZOOKEEPER_QUORUM_PROP, args[3]);
> >        Job job = new Job(hbaseConf);
> >        job.setJarByClass(MultiTableTestJob.class);
> >        job.setInputFormatClass(TextInputFormat.class);
> >        job.setMapperClass(MultiTableTestMapper.class);
> >        job.setMapOutputKeyClass(Text.class);
> >        job.setMapOutputValueClass(Text.class);
> >        job.setReducerClass(MultiTableTestReducer.class);
> >        job.setOutputKeyClass(Text.class);
> >        job.setOutputValueClass(Text.class);
> >        FileInputFormat.setInputPaths(job, new Path(args[0]));
> >        job.setOutputFormatClass(MultiTableOutputFormat.class);
> >        TableMapReduceUtil.addDependencyJars(job);
> >        TableMapReduceUtil.addDependencyJars(job.getConfiguration());
> >        return job.waitForCompletion(true) ? 0 : -1;
> >    }
> >
> >    public static void main(String[] args) throws Exception {
> >        Configuration configuration = new Configuration();
> >        configuration.set("HBASE_DEST_TABLE", args[1]);
> >        configuration.set("HBASE_LOOKUP_TABLE", args[2]);
> >        ToolRunner.run(configuration, new CISuperSessionJob(), args);
> >    }
> >
> > Reducer class:
> >
> >    private ImmutableBytesWritable tbl1;
> >    private ImmutableBytesWritable tbl2;
> >
> >    protected void setup(Context context) throws IOException, InterruptedException {
> >        Configuration c = context.getConfiguration();
> >        tbl1 = new ImmutableBytesWritable(Bytes.toBytes(c.get("HBASE_DEST_TABLE")));
> >        tbl2 = new ImmutableBytesWritable(Bytes.toBytes(c.get("HBASE_LOOKUP_TABLE")));
> >    }
> >
> >    protected void reduce(Text key, java.lang.Iterable<Text> values, Context context)
> >            throws IOException, InterruptedException {
> >        // ...
> >        if (some condition) {
> >            Put put = getSessionPut(key, vc);
> >            if (put != null) {
> >                context.write(tbl1, put);
> >            }
> >        } else {
> >            // ...
> >            Put put = getEventPut(key, vc);
> >            context.write(tbl2, put);
> >        }
> >    }
> > }
> >
> >
> > Exception:
> >
> > org.apache.hadoop.hbase.TableNotFoundException: mapred.reduce.tasks=100
> >         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:999)
> >         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:864)
> >         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:821)
> >         at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:234)
> >         at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:174)
> >         at org.apache.hadoop.hbase.mapreduce.MultiTableOutputFormat$MultiTableRecordWriter.getTable(MultiTableOutputFormat.java:101)
> >         at org.apache.hadoop.hbase.mapreduce.MultiTableOutputFormat$MultiTableRecordWriter.write(MultiTableOutputFormat.java:127)
> >         at org.apache.hadoop.hbase.mapreduce.MultiTableOutputFormat$MultiTableRecordWriter.write(MultiTableOutputFormat.java:68)
> >         at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:586)
> >         at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
> >         ...
> > Thanks & Regards,
> > B Anil Kumar.
> >
>
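
For context on where the exception surfaces: MultiTableOutputFormat's record writer treats the ImmutableBytesWritable key the reducer emits as a table name and lazily opens one writer per table, so whatever string ends up in that key is looked up as a table — and `mapred.reduce.tasks=100` is not one. A plain-Java simulation of that dispatch logic (not the HBase API, just the routing idea):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class MultiTableRoutingDemo {
    // Known tables, standing in for what the HBase catalog would report.
    static final Set<String> EXISTING_TABLES = Set.of("ci_history", "ci_lookup");

    // One buffer per table, created lazily like MultiTableRecordWriter.getTable().
    final Map<String, List<String>> writers = new HashMap<>();

    void write(String tableName, String mutation) {
        if (!EXISTING_TABLES.contains(tableName)) {
            // The real writer fails at this point with TableNotFoundException.
            throw new IllegalArgumentException("Table not found: " + tableName);
        }
        writers.computeIfAbsent(tableName, t -> new ArrayList<>()).add(mutation);
    }

    public static void main(String[] args) {
        MultiTableRoutingDemo demo = new MultiTableRoutingDemo();
        demo.write("ci_history", "put-1");
        demo.write("ci_lookup", "put-2");
        System.out.println(demo.writers.keySet());

        // A bad key, e.g. a table name read from a shifted args[] slot:
        try {
            demo.write("mapred.reduce.tasks=100", "put-3");
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
            // → Table not found: mapred.reduce.tasks=100
        }
    }
}
```

The takeaway of the sketch: the write key, not the job setup, decides the destination table, so verifying what string actually reaches `tbl1`/`tbl2` in setup() is the first thing to check.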
