mahout-user mailing list archives

From Dmitriy Lyubimov <>
Subject Re: The perennial "Error: java.lang.ClassNotFoundException: org.apache.mahout.math.Vector" problem
Date Mon, 09 May 2011 20:44:43 GMT
On Mon, May 9, 2011 at 1:39 PM, Jake Mannix <> wrote:
> On Mon, May 9, 2011 at 1:31 PM, Dmitriy Lyubimov <> wrote:
>> then AbstractJob implements walking the lib tree and adding those
>> paths (based on MAHOUT_HOME
>> or otherwise derived knowledge of lib location) and throws all the
>> jars there into backend path. all mahout projects
>> do something similar. Where's the complexity in that?
> The complexity is right there: "throws all jars there into backend path".
> How do you wish to accomplish this?  Currently we follow the hadoop
> convention of doing this (lib/ inside of the jar passed to hadoop cli).
> It apparently doesn't always work (or never?  or is this PEBKAC?).
> We could alternatively use the Hadoop "-libjars" technique, which
> does what you suggest in another way. We could also, ourselves,
> copy these jars into the DistributedCache and do something that
> way.

Yes, the latter: a two-line loop over $MAHOUT_HOME/lib with one call to
the distributed cache inside.
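A minimal sketch of what that loop might look like, assuming the Hadoop 0.20-era DistributedCache API that was current at the time of this thread. The class name LibJarLoader and the /tmp/mahout-libjars staging path are illustrative, not actual Mahout code:

```java
import java.io.File;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.filecache.DistributedCache;

// Hypothetical sketch: stage every jar under $MAHOUT_HOME/lib into HDFS
// and register it on the task classpath via the DistributedCache.
public class LibJarLoader {

  public static void addLibJars(Configuration conf) throws Exception {
    File libDir = new File(System.getenv("MAHOUT_HOME"), "lib");
    FileSystem fs = FileSystem.get(conf);
    for (File jar : libDir.listFiles((dir, name) -> name.endsWith(".jar"))) {
      // the "two lines": copy the jar to HDFS, then add it to the classpath
      Path dst = new Path("/tmp/mahout-libjars/" + jar.getName());
      fs.copyFromLocalFile(new Path(jar.getAbsolutePath()), dst);
      DistributedCache.addFileToClassPath(dst, conf);
    }
  }
}
```

A driver would call addLibJars(conf) before submitting the job, so the backend tasks see org.apache.mahout.math.Vector and friends without relying on the lib/-inside-the-job-jar convention.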

> I really wish I knew why the lib/ thing doesn't work for vanilla
> calls to classes in our examples job-jar.

Yes. Exactly. That's the problem with "conventions" vs. specs:
conventions are not specs, and so they are subject to change on a whim.
And they are not an API either, so you can't validate the contract
(beyond a certain degree, of course) at compile time.
