mahout-user mailing list archives

From Dmitriy Lyubimov <dlie...@gmail.com>
Subject Re: Problems Running Mahout SSVD
Date Tue, 12 Feb 2013 02:33:25 GMT
Ok, so you are using the DRM.

but basically what it means is that the block QR solver cannot factor the
input due to rank deficiency: if any of your splits contains fewer than k+p
rows of input, that split's block is rank-deficient. I suggest you
investigate your splitting along those lines. I agree the message is
internal to the QR solver, and perhaps there's an easy way to intercept that
condition sooner and produce a more meaningful message. I remember it was
difficult for some reason, but perhaps it is worth looking into again.
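Just to illustrate the arithmetic of that condition (this is not Mahout code, and the split row counts are hypothetical): with the CLI values used later in this thread, k = 100 and p = 10, so any split holding fewer than 110 rows will trip the block QR solver.

```java
public class SplitCheck {

  /**
   * SSVD's block QR step can only factor a split whose block has at
   * least k + p rows; fewer rows means a rank-deficient block.
   */
  public static boolean isRankSafe(long rowsInSplit, int k, int p) {
    return rowsInSplit >= (long) k + p;
  }

  public static void main(String[] args) {
    final int k = 100;  // decomposition rank (-k)
    final int p = 10;   // oversampling (-p)
    // Hypothetical per-split row counts; real counts would come from
    // inspecting your input files and how Hadoop splits them.
    final long[] splitRows = {15000, 9890, 110, 80};
    for (long rows : splitRows) {
      System.out.println(rows + " rows -> "
          + (isRankSafe(rows, k, p) ? "ok" : "rank-deficient"));
    }
  }
}
```

The last split above (80 rows < 110) is exactly the kind of "leftover" split that triggers the error even when the matrix as a whole has plenty of rows.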

With older Hadoop distributions I never saw this happen if the input itself
was well-formed (here I mean every file in the input has at least k+p rows),
and in fact I studied the SequenceFileInputFormat code to convince myself
this is generally not going to happen to the "leftover" splits. There are
options to increase splits for extra-dense, extra-wide inputs (although your
case doesn't seem to be even remotely that).

Most cases people complained about involved either multiple small input
files, or an entire matrix that was actually rank-deficient. Yes, there was
also something strange with embedded use and relative paths.

I also noticed you seem to be running MapReduce in local mode (just as the
tests that run with every Mahout build do), but local mode has (or at least
used to have) numerous limitations (e.g. it cannot run multiple reducers, so
the number of reducers is ignored).



On Mon, Feb 11, 2013 at 5:54 PM, Dmitriy Lyubimov <dlieu.7@gmail.com> wrote:

> Also, Mahout's distributed algebra operates on the distributed row matrix
> format (a sequence file of Vectors). I am a little confused about how you
> are able to run that stuff on text input. Most likely this file is simply
> ignored because it is not a sequence file, and your input ends up being
> 0 rows, which is of course rank-deficient for the purposes of this
> computation. You need distributed row matrix input.
>
> -d
>
>
>
> On Mon, Feb 11, 2013 at 5:50 PM, Dmitriy Lyubimov <dlieu.7@gmail.com>wrote:
>
>> Yes, this problem has been pretty much beaten to shreds. In fact, so much
>> so that I wrote it into the troubleshooting notes in section 5 of the manual (
>> https://cwiki.apache.org/confluence/download/attachments/27832158/SSVD-CLI.pdf?version=17&modificationDate=1349999085000
>> ).
>>
>> Are you sure those are not your problems? In short, it means rank
>> deficiency in your input or its splits.
>>
>> Also, it would be great if you posted this to the user list so others
>> could benefit.
>>
>> Thank you.
>>
>> -Dmitriy
>>
>>
>> On Mon, Feb 11, 2013 at 5:40 PM, K.D.P. Ross <kdp@quixey.com> wrote:
>>
>>> Hello–
>>>
>>> I am trying to use the Mahout SSVD to decompose
>>> a modest-sized matrix, but I get an error when I try to run
>>> it … I am primarily interested in using the ‘SVDSolver’
>>> entry point, but I encounter the same problem when I use the
>>> CLI.
>>>
>>> I am generating random data for the moment (code included
>>> below) and invoking the CLI with the following arguments:
>>>
>>>      mahout ssvd -k 100 -p 10 -q 1 --input /tmp/foo.txt --output
>>> /tmp/output --tempDir /tmp/temp --reduceTasks 4
>>>
>>> (where ‘/tmp/foo.txt’ is generated by the code below,
>>> and neither ‘/tmp/output’ nor ‘/tmp/temp’ exists.)
>>>
>>> Specifically, I receive an ‘IllegalArgumentException’
>>> saying that ‘new m can't be less than n’ … which, at
>>> best, is confusing because ‘m’ and ‘n’ appear to be
>>> variables coming from somewhere in the implementation rather
>>> than from any parameters that I've explicitly provided.
>>>
>>> I've searched the Web and found others running into this
>>> problem … but no solution (e.g., switching to absolute
>>> paths for everything) that seems to address it for me …
>>>
>>> I'm running Mahout 0.7 and Hadoop 1.1.1 on Ubuntu 12.10 with
>>> version 1.6.0_24 of the AMD64 OpenJDK, on an i7-3820 @ 3.60GHz x 8
>>> with 64GiB RAM.
>>>
>>> Any help would be much appreciated!
>>>
>>> //sincerely
>>> K.D.P.Ross | Principal Engineer, Data Sciences
>>> (206) 817.0090 | www.quixey.com
>>> 278 Castro Street, Mountain View, CA 94041
>>>
>>> [begin code used to generate matrix-
>>> import org.apache.hadoop.conf.Configuration;
>>> import org.apache.hadoop.fs.FileSystem;
>>> import org.apache.hadoop.fs.Path;
>>> import org.apache.hadoop.io.IntWritable;
>>> import org.apache.hadoop.io.SequenceFile;
>>> import org.apache.mahout.math.DenseVector;
>>> import org.apache.mahout.math.VectorWritable;
>>> import java.util.Random;
>>>
>>> public class GenMatrix {
>>>
>>>   /** Fills an rMax x cMax matrix with uniform random values. */
>>>   public static double[][] fabricateRandomMatrix(final int rMax, final int cMax) {
>>>     final Random rand = new Random();
>>>     final double[][] xs = new double[rMax][cMax];
>>>     for (int r = 0; r < rMax; ++r) {
>>>       for (int c = 0; c < cMax; ++c) {
>>>         xs[r][c] = rand.nextDouble();
>>>       }
>>>     }
>>>     return xs;
>>>   }
>>>
>>>   /**
>>>    * Writes the matrix in distributed row matrix form: a sequence file
>>>    * of IntWritable row index -> VectorWritable row.
>>>    */
>>>   public static void write(final String f, final double[][] matrixValues)
>>>       throws Exception {
>>>     final Configuration cfg = new Configuration();
>>>     final FileSystem fs = FileSystem.get(cfg);
>>>     final SequenceFile.Writer writer = SequenceFile.createWriter(fs, cfg,
>>>         new Path(f), IntWritable.class, VectorWritable.class);
>>>     try {
>>>       for (int i = 0; i < matrixValues.length; ++i) {
>>>         final DenseVector row = new DenseVector(matrixValues[i]);
>>>         writer.append(new IntWritable(i), new VectorWritable(row));
>>>       }
>>>     } finally {
>>>       writer.close(); // always release the underlying stream
>>>     }
>>>   }
>>>
>>>   public static void main(String[] args) {
>>>     final int r = 25000;
>>>     final int c = 10000;
>>>
>>>     try {
>>>       write("/tmp/foo.txt", fabricateRandomMatrix(r, c));
>>>     } catch (Exception e) {
>>>       System.err.println("Unable to write data ... " + e.getMessage());
>>>     }
>>>   }
>>> }
>>> -end code used to generate matrix]
>>>
>>> [begin mahout output-
>>> 13/02/11 17:03:16 INFO common.AbstractJob: Command line arguments:
>>> {--abtBlockHeight=[200000], --blockHeight=[10000], --broadcast=[true],
>>> --computeU=[true], --computeV=[true], --endPhase=[2147483647],
>>> --input=[foo.txt], --minSplitSize=[-1],
>>> --outerProdBlockHeight=[30000], --output=[output],
>>> --oversampling=[10], --pca=[false], --powerIter=[1], --rank=[100],
>>> --reduceTasks=[4], --startPhase=[0], --tempDir=[temp],
>>> --uHalfSigma=[false], --vHalfSigma=[false]}
>>> 13/02/11 17:03:17 INFO util.NativeCodeLoader: Loaded the native-hadoop
>>> library
>>> 13/02/11 17:03:17 INFO zlib.ZlibFactory: Successfully loaded &
>>> initialized native-zlib library
>>> 13/02/11 17:03:17 INFO compress.CodecPool: Got brand-new decompressor
>>> 13/02/11 17:03:17 INFO input.FileInputFormat: Total input paths to
>>> process : 1
>>> 13/02/11 17:03:17 INFO util.ProcessTree: setsid exited with exit code 0
>>> 13/02/11 17:03:17 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@60ded0f0
>>> 13/02/11 17:03:20 INFO compress.CodecPool: Got brand-new compressor
>>> 13/02/11 17:03:20 INFO compress.CodecPool: Got brand-new compressor
>>> 13/02/11 17:03:20 INFO mapred.Task: Task:attempt_local_0001_m_000000_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:03:20 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:20 INFO mapred.Task: Task attempt_local_0001_m_000000_0
>>> is allowed to commit now
>>> 13/02/11 17:03:20 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000000_0' to temp/Q-job
>>> 13/02/11 17:03:20 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:20 INFO mapred.Task: Task 'attempt_local_0001_m_000000_0'
>>> done.
>>> 13/02/11 17:03:20 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@2322bce
>>> 13/02/11 17:03:23 INFO mapred.Task: Task:attempt_local_0001_m_000001_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:03:23 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:23 INFO mapred.Task: Task attempt_local_0001_m_000001_0
>>> is allowed to commit now
>>> 13/02/11 17:03:23 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000001_0' to temp/Q-job
>>> 13/02/11 17:03:23 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:23 INFO mapred.Task: Task 'attempt_local_0001_m_000001_0'
>>> done.
>>> 13/02/11 17:03:23 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@ace16ad
>>> 13/02/11 17:03:26 INFO mapred.Task: Task:attempt_local_0001_m_000002_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:03:26 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:26 INFO mapred.Task: Task attempt_local_0001_m_000002_0
>>> is allowed to commit now
>>> 13/02/11 17:03:26 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000002_0' to temp/Q-job
>>> 13/02/11 17:03:26 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:26 INFO mapred.Task: Task 'attempt_local_0001_m_000002_0'
>>> done.
>>> 13/02/11 17:03:26 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@79f03d7
>>> 13/02/11 17:03:29 INFO mapred.Task: Task:attempt_local_0001_m_000003_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:03:29 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:29 INFO mapred.Task: Task attempt_local_0001_m_000003_0
>>> is allowed to commit now
>>> 13/02/11 17:03:29 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000003_0' to temp/Q-job
>>> 13/02/11 17:03:29 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:29 INFO mapred.Task: Task 'attempt_local_0001_m_000003_0'
>>> done.
>>> 13/02/11 17:03:29 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@2d20dbf3
>>> 13/02/11 17:03:32 INFO mapred.Task: Task:attempt_local_0001_m_000004_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:03:32 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:32 INFO mapred.Task: Task attempt_local_0001_m_000004_0
>>> is allowed to commit now
>>> 13/02/11 17:03:32 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000004_0' to temp/Q-job
>>> 13/02/11 17:03:32 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:32 INFO mapred.Task: Task 'attempt_local_0001_m_000004_0'
>>> done.
>>> 13/02/11 17:03:32 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@5a943dc4
>>> 13/02/11 17:03:35 INFO mapred.Task: Task:attempt_local_0001_m_000005_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:03:35 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:35 INFO mapred.Task: Task attempt_local_0001_m_000005_0
>>> is allowed to commit now
>>> 13/02/11 17:03:35 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000005_0' to temp/Q-job
>>> 13/02/11 17:03:35 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:35 INFO mapred.Task: Task 'attempt_local_0001_m_000005_0'
>>> done.
>>> 13/02/11 17:03:35 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@b0c0f66
>>> 13/02/11 17:03:38 INFO mapred.Task: Task:attempt_local_0001_m_000006_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:03:38 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:38 INFO mapred.Task: Task attempt_local_0001_m_000006_0
>>> is allowed to commit now
>>> 13/02/11 17:03:38 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000006_0' to temp/Q-job
>>> 13/02/11 17:03:38 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:38 INFO mapred.Task: Task 'attempt_local_0001_m_000006_0'
>>> done.
>>> 13/02/11 17:03:38 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@75d252d
>>> 13/02/11 17:03:41 INFO mapred.Task: Task:attempt_local_0001_m_000007_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:03:41 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:41 INFO mapred.Task: Task attempt_local_0001_m_000007_0
>>> is allowed to commit now
>>> 13/02/11 17:03:41 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000007_0' to temp/Q-job
>>> 13/02/11 17:03:41 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:41 INFO mapred.Task: Task 'attempt_local_0001_m_000007_0'
>>> done.
>>> 13/02/11 17:03:41 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@72e5355f
>>> 13/02/11 17:03:44 INFO mapred.Task: Task:attempt_local_0001_m_000008_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:03:44 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:44 INFO mapred.Task: Task attempt_local_0001_m_000008_0
>>> is allowed to commit now
>>> 13/02/11 17:03:44 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000008_0' to temp/Q-job
>>> 13/02/11 17:03:44 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:44 INFO mapred.Task: Task 'attempt_local_0001_m_000008_0'
>>> done.
>>> 13/02/11 17:03:44 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@172fb0af
>>> 13/02/11 17:03:47 INFO mapred.Task: Task:attempt_local_0001_m_000009_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:03:47 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:47 INFO mapred.Task: Task attempt_local_0001_m_000009_0
>>> is allowed to commit now
>>> 13/02/11 17:03:47 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000009_0' to temp/Q-job
>>> 13/02/11 17:03:47 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:47 INFO mapred.Task: Task 'attempt_local_0001_m_000009_0'
>>> done.
>>> 13/02/11 17:03:47 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@29abc69
>>> 13/02/11 17:03:50 INFO mapred.Task: Task:attempt_local_0001_m_000010_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:03:50 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:50 INFO mapred.Task: Task attempt_local_0001_m_000010_0
>>> is allowed to commit now
>>> 13/02/11 17:03:50 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000010_0' to temp/Q-job
>>> 13/02/11 17:03:50 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:50 INFO mapred.Task: Task 'attempt_local_0001_m_000010_0'
>>> done.
>>> 13/02/11 17:03:50 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@53dafbaf
>>> 13/02/11 17:03:53 INFO mapred.Task: Task:attempt_local_0001_m_000011_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:03:53 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:53 INFO mapred.Task: Task attempt_local_0001_m_000011_0
>>> is allowed to commit now
>>> 13/02/11 17:03:53 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000011_0' to temp/Q-job
>>> 13/02/11 17:03:53 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:53 INFO mapred.Task: Task 'attempt_local_0001_m_000011_0'
>>> done.
>>> 13/02/11 17:03:53 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@38ffd135
>>> 13/02/11 17:03:56 INFO mapred.Task: Task:attempt_local_0001_m_000012_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:03:56 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:56 INFO mapred.Task: Task attempt_local_0001_m_000012_0
>>> is allowed to commit now
>>> 13/02/11 17:03:56 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000012_0' to temp/Q-job
>>> 13/02/11 17:03:56 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:56 INFO mapred.Task: Task 'attempt_local_0001_m_000012_0'
>>> done.
>>> 13/02/11 17:03:56 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@4987b287
>>> 13/02/11 17:03:59 INFO mapred.Task: Task:attempt_local_0001_m_000013_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:03:59 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:59 INFO mapred.Task: Task attempt_local_0001_m_000013_0
>>> is allowed to commit now
>>> 13/02/11 17:03:59 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000013_0' to temp/Q-job
>>> 13/02/11 17:03:59 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:03:59 INFO mapred.Task: Task 'attempt_local_0001_m_000013_0'
>>> done.
>>> 13/02/11 17:03:59 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@13cc0a7f
>>> 13/02/11 17:04:02 INFO mapred.Task: Task:attempt_local_0001_m_000014_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:04:02 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:02 INFO mapred.Task: Task attempt_local_0001_m_000014_0
>>> is allowed to commit now
>>> 13/02/11 17:04:02 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000014_0' to temp/Q-job
>>> 13/02/11 17:04:02 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:02 INFO mapred.Task: Task 'attempt_local_0001_m_000014_0'
>>> done.
>>> 13/02/11 17:04:02 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@327124af
>>> 13/02/11 17:04:05 INFO mapred.Task: Task:attempt_local_0001_m_000015_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:04:05 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:05 INFO mapred.Task: Task attempt_local_0001_m_000015_0
>>> is allowed to commit now
>>> 13/02/11 17:04:05 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000015_0' to temp/Q-job
>>> 13/02/11 17:04:05 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:05 INFO mapred.Task: Task 'attempt_local_0001_m_000015_0'
>>> done.
>>> 13/02/11 17:04:05 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@18682406
>>> 13/02/11 17:04:08 INFO mapred.Task: Task:attempt_local_0001_m_000016_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:04:08 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:08 INFO mapred.Task: Task attempt_local_0001_m_000016_0
>>> is allowed to commit now
>>> 13/02/11 17:04:08 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000016_0' to temp/Q-job
>>> 13/02/11 17:04:08 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:08 INFO mapred.Task: Task 'attempt_local_0001_m_000016_0'
>>> done.
>>> 13/02/11 17:04:08 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@1c5aebd9
>>> 13/02/11 17:04:11 INFO mapred.Task: Task:attempt_local_0001_m_000017_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:04:11 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:11 INFO mapred.Task: Task attempt_local_0001_m_000017_0
>>> is allowed to commit now
>>> 13/02/11 17:04:11 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000017_0' to temp/Q-job
>>> 13/02/11 17:04:11 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:11 INFO mapred.Task: Task 'attempt_local_0001_m_000017_0'
>>> done.
>>> 13/02/11 17:04:11 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@32d8ca48
>>> 13/02/11 17:04:14 INFO mapred.Task: Task:attempt_local_0001_m_000018_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:04:14 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:14 INFO mapred.Task: Task attempt_local_0001_m_000018_0
>>> is allowed to commit now
>>> 13/02/11 17:04:14 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000018_0' to temp/Q-job
>>> 13/02/11 17:04:14 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:14 INFO mapred.Task: Task 'attempt_local_0001_m_000018_0'
>>> done.
>>> 13/02/11 17:04:14 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@1c2c9103
>>> 13/02/11 17:04:17 INFO mapred.Task: Task:attempt_local_0001_m_000019_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:04:17 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:17 INFO mapred.Task: Task attempt_local_0001_m_000019_0
>>> is allowed to commit now
>>> 13/02/11 17:04:17 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000019_0' to temp/Q-job
>>> 13/02/11 17:04:17 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:17 INFO mapred.Task: Task 'attempt_local_0001_m_000019_0'
>>> done.
>>> 13/02/11 17:04:17 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@5ad75c47
>>> 13/02/11 17:04:20 INFO mapred.Task: Task:attempt_local_0001_m_000020_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:04:20 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:20 INFO mapred.Task: Task attempt_local_0001_m_000020_0
>>> is allowed to commit now
>>> 13/02/11 17:04:20 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000020_0' to temp/Q-job
>>> 13/02/11 17:04:20 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:20 INFO mapred.Task: Task 'attempt_local_0001_m_000020_0'
>>> done.
>>> 13/02/11 17:04:20 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@d522de2
>>> 13/02/11 17:04:23 INFO mapred.Task: Task:attempt_local_0001_m_000021_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:04:23 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:23 INFO mapred.Task: Task attempt_local_0001_m_000021_0
>>> is allowed to commit now
>>> 13/02/11 17:04:23 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000021_0' to temp/Q-job
>>> 13/02/11 17:04:23 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:23 INFO mapred.Task: Task 'attempt_local_0001_m_000021_0'
>>> done.
>>> 13/02/11 17:04:23 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@44aea710
>>> 13/02/11 17:04:26 INFO mapred.Task: Task:attempt_local_0001_m_000022_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:04:26 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:26 INFO mapred.Task: Task attempt_local_0001_m_000022_0
>>> is allowed to commit now
>>> 13/02/11 17:04:26 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000022_0' to temp/Q-job
>>> 13/02/11 17:04:26 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:26 INFO mapred.Task: Task 'attempt_local_0001_m_000022_0'
>>> done.
>>> 13/02/11 17:04:26 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@7bafb0c7
>>> 13/02/11 17:04:29 INFO mapred.Task: Task:attempt_local_0001_m_000023_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:04:29 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:29 INFO mapred.Task: Task attempt_local_0001_m_000023_0
>>> is allowed to commit now
>>> 13/02/11 17:04:29 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000023_0' to temp/Q-job
>>> 13/02/11 17:04:29 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:29 INFO mapred.Task: Task 'attempt_local_0001_m_000023_0'
>>> done.
>>> 13/02/11 17:04:29 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@63a5ec6c
>>> 13/02/11 17:04:32 INFO mapred.Task: Task:attempt_local_0001_m_000024_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:04:32 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:32 INFO mapred.Task: Task attempt_local_0001_m_000024_0
>>> is allowed to commit now
>>> 13/02/11 17:04:32 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000024_0' to temp/Q-job
>>> 13/02/11 17:04:32 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:32 INFO mapred.Task: Task 'attempt_local_0001_m_000024_0'
>>> done.
>>> 13/02/11 17:04:32 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@1414627a
>>> 13/02/11 17:04:35 INFO mapred.Task: Task:attempt_local_0001_m_000025_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:04:35 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:35 INFO mapred.Task: Task attempt_local_0001_m_000025_0
>>> is allowed to commit now
>>> 13/02/11 17:04:35 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000025_0' to temp/Q-job
>>> 13/02/11 17:04:35 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:35 INFO mapred.Task: Task 'attempt_local_0001_m_000025_0'
>>> done.
>>> 13/02/11 17:04:35 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@729b1670
>>> 13/02/11 17:04:38 INFO mapred.Task: Task:attempt_local_0001_m_000026_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:04:38 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:38 INFO mapred.Task: Task attempt_local_0001_m_000026_0
>>> is allowed to commit now
>>> 13/02/11 17:04:38 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000026_0' to temp/Q-job
>>> 13/02/11 17:04:38 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:38 INFO mapred.Task: Task 'attempt_local_0001_m_000026_0'
>>> done.
>>> 13/02/11 17:04:38 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@1f3a34af
>>> 13/02/11 17:04:41 INFO mapred.Task: Task:attempt_local_0001_m_000027_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:04:41 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:41 INFO mapred.Task: Task attempt_local_0001_m_000027_0
>>> is allowed to commit now
>>> 13/02/11 17:04:41 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000027_0' to temp/Q-job
>>> 13/02/11 17:04:41 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:41 INFO mapred.Task: Task 'attempt_local_0001_m_000027_0'
>>> done.
>>> 13/02/11 17:04:41 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@3823bdd1
>>> 13/02/11 17:04:44 INFO mapred.Task: Task:attempt_local_0001_m_000028_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:04:44 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:44 INFO mapred.Task: Task attempt_local_0001_m_000028_0
>>> is allowed to commit now
>>> 13/02/11 17:04:44 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000028_0' to temp/Q-job
>>> 13/02/11 17:04:44 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:44 INFO mapred.Task: Task 'attempt_local_0001_m_000028_0'
>>> done.
>>> 13/02/11 17:04:44 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@3bd840d9
>>> 13/02/11 17:04:47 INFO mapred.Task: Task:attempt_local_0001_m_000029_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:04:47 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:47 INFO mapred.Task: Task attempt_local_0001_m_000029_0
>>> is allowed to commit now
>>> 13/02/11 17:04:47 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000029_0' to temp/Q-job
>>> 13/02/11 17:04:47 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:47 INFO mapred.Task: Task 'attempt_local_0001_m_000029_0'
>>> done.
>>> 13/02/11 17:04:47 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@6760bf50
>>> 13/02/11 17:04:50 INFO mapred.Task: Task:attempt_local_0001_m_000030_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:04:50 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:50 INFO mapred.Task: Task attempt_local_0001_m_000030_0
>>> is allowed to commit now
>>> 13/02/11 17:04:50 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000030_0' to temp/Q-job
>>> 13/02/11 17:04:50 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:50 INFO mapred.Task: Task 'attempt_local_0001_m_000030_0'
>>> done.
>>> 13/02/11 17:04:50 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@a2bccb2
>>> 13/02/11 17:04:53 INFO mapred.Task: Task:attempt_local_0001_m_000031_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:04:53 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:53 INFO mapred.Task: Task attempt_local_0001_m_000031_0
>>> is allowed to commit now
>>> 13/02/11 17:04:53 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000031_0' to temp/Q-job
>>> 13/02/11 17:04:53 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:53 INFO mapred.Task: Task 'attempt_local_0001_m_000031_0'
>>> done.
>>> 13/02/11 17:04:53 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@4ca68fd8
>>> 13/02/11 17:04:56 INFO mapred.Task: Task:attempt_local_0001_m_000032_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:04:56 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:56 INFO mapred.Task: Task attempt_local_0001_m_000032_0
>>> is allowed to commit now
>>> 13/02/11 17:04:56 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000032_0' to temp/Q-job
>>> 13/02/11 17:04:56 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:56 INFO mapred.Task: Task 'attempt_local_0001_m_000032_0'
>>> done.
>>> 13/02/11 17:04:56 INFO mapred.Task:  Using ResourceCalculatorPlugin :
>>> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@6bab600f
>>> 13/02/11 17:04:59 INFO mapred.Task: Task:attempt_local_0001_m_000033_0
>>> is done. And is in the process of commiting
>>> 13/02/11 17:04:59 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:59 INFO mapred.Task: Task attempt_local_0001_m_000033_0
>>> is allowed to commit now
>>> 13/02/11 17:04:59 INFO output.FileOutputCommitter: Saved output of
>>> task 'attempt_local_0001_m_000033_0' to temp/Q-job
>>> 13/02/11 17:04:59 INFO mapred.LocalJobRunner:
>>> 13/02/11 17:04:59 INFO mapred.Task: Task 'attempt_local_0001_m_000033_0'
>>> done.
>>> [... identical INFO log blocks repeated for map task attempts
>>> attempt_local_0001_m_000034_0 through attempt_local_0001_m_000055_0,
>>> each ending with "Saved output of task '...' to temp/Q-job" ...]
>>> 13/02/11 17:06:05 ERROR common.IOUtils: new m can't be less than n
>>> java.lang.IllegalArgumentException: new m can't be less than n
>>>         at
>>> org.apache.mahout.math.hadoop.stochasticsvd.qr.GivensThinSolver.adjust(GivensThinSolver.java:109)
>>>         at
>>> org.apache.mahout.math.hadoop.stochasticsvd.qr.QRFirstStep.cleanup(QRFirstStep.java:233)
>>>         at
>>> org.apache.mahout.math.hadoop.stochasticsvd.qr.QRFirstStep.close(QRFirstStep.java:89)
>>>         at org.apache.mahout.common.IOUtils.close(IOUtils.java:128)
>>>         at
>>> org.apache.mahout.math.hadoop.stochasticsvd.QJob$QMapper.cleanup(QJob.java:158)
>>>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
>>>         at
>>> org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>>>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>>>         at
>>> org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:212)
>>> 13/02/11 17:06:05 WARN mapred.LocalJobRunner: job_local_0001
>>> java.lang.IllegalArgumentException: new m can't be less than n
>>>         at
>>> org.apache.mahout.math.hadoop.stochasticsvd.qr.GivensThinSolver.adjust(GivensThinSolver.java:109)
>>>         at
>>> org.apache.mahout.math.hadoop.stochasticsvd.qr.QRFirstStep.cleanup(QRFirstStep.java:233)
>>>         at
>>> org.apache.mahout.math.hadoop.stochasticsvd.qr.QRFirstStep.close(QRFirstStep.java:89)
>>>         at org.apache.mahout.common.IOUtils.close(IOUtils.java:128)
>>>         at
>>> org.apache.mahout.math.hadoop.stochasticsvd.QJob$QMapper.cleanup(QJob.java:158)
>>>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
>>>         at
>>> org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>>>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>>>         at
>>> org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:212)
>>> Exception in thread "main" java.io.IOException: Q job unsuccessful.
>>>         at
>>> org.apache.mahout.math.hadoop.stochasticsvd.QJob.run(QJob.java:230)
>>>         at
>>> org.apache.mahout.math.hadoop.stochasticsvd.SSVDSolver.run(SSVDSolver.java:377)
>>>         at
>>> org.apache.mahout.math.hadoop.stochasticsvd.SSVDCli.run(SSVDCli.java:141)
>>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>>>         at
>>> org.apache.mahout.math.hadoop.stochasticsvd.SSVDCli.main(SSVDCli.java:171)
>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>         at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>         at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>>         at
>>> org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>>>         at
>>> org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>>>         at
>>> org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:195)
>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>         at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>         at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>>> -end mahout output]
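For what it's worth, a quick way to see why the Q job dies with "new m can't be less than n": the Q mapper runs a thin QR on each split's block of rows against a (k+p)-column projection, and a thin QR of an m x n block is only defined for m >= n, i.e. each split must carry at least k+p rows. The sketch below is not Mahout code, just an illustration of that precondition (the k, p values and the `block_qr_ok` helper are made up for the example):

```python
import numpy as np

def block_qr_ok(num_rows, k, p):
    """True if a split with num_rows rows can be thin-QR-decomposed
    against a (k+p)-column projection (the m >= n requirement)."""
    return num_rows >= k + p

k, p = 10, 15                       # decomposition rank and oversampling

tall = np.random.rand(40, k + p)    # 40 >= 25 rows: this split is fine
assert block_qr_ok(tall.shape[0], k, p)
q, r = np.linalg.qr(tall)           # reduced ("thin") QR succeeds
assert q.shape == (40, k + p)

short_rows = 20                     # 20 < 25 rows: a split like this would
assert not block_qr_ok(short_rows, k, p)  # trigger "new m can't be less than n"
```

So if any of your splits (or any small input file) ends up with fewer than k+p rows, this is the failure you'd expect to see.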