mahout-user mailing list archives

From BahaaEddin AlAila <bahaelai...@gmail.com>
Subject Re: Confusion regarding Samsara's configuration
Date Tue, 02 Feb 2016 18:24:28 GMT
yes, 0.11.1

On Tue, Feb 2, 2016 at 12:24 PM, Suneel Marthi <suneel.marthi@gmail.com>
wrote:

> Are you working off of Mahout 0.11.1? 0.11.1 has been certified for Spark
> 1.5 but is compatible with 1.6.
>
>
> On Tue, Feb 2, 2016 at 12:10 PM, BahaaEddin AlAila <bahaelaila7@gmail.com>
> wrote:
>
> > Thank you very much for your reply.
> > As I mentioned earlier, I am using mahoutSparkContext, and MAHOUT_HOME is
> > set to the correct Mahout path.
> > I have also tried setting up the context myself: I looked into the
> > implementation of mahoutSparkContext and supplied the jars path manually.
> > Still the same error.
> > I will try with Spark 1.5 and report back.
> >
> > Thank you very much again,
> >
> > Kind Regards,
> > Bahaa
> >
> >
> > On Tue, Feb 2, 2016 at 12:01 PM, Dmitriy Lyubimov <dlieu.7@gmail.com>
> > wrote:
> >
> > > Bahaa, first off, I don't think we have certified any of our releases to
> > > run with Spark 1.6 (yet). I think Spark 1.5 is the last release known to
> > > run with the 0.11 series.
> > >
> > > Second, if you use the mahoutSparkContext() method to create the context,
> > > it would look for the MAHOUT_HOME setup to add the Mahout binaries to the
> > > job. So the reason you may not be getting it is perhaps that you are not
> > > using mahoutSparkContext()?
> > >
> > > Alternatively, you can create the context yourself, but you need to (1)
> > > make sure it has Kryo serialization enabled and configured properly, and
> > > (2) add all the necessary Mahout jars on your own. A rough sketch is below.
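> > >
> > > Something along these lines (a sketch from memory; the jar paths are
> > > illustrative, and the registrator class name is worth double-checking):
> > >
> > >     import org.apache.spark.{SparkConf, SparkContext}
> > >     import org.apache.mahout.sparkbindings._
> > >
> > >     val conf = new SparkConf()
> > >       .setMaster("local[*]")
> > >       .setAppName("App1")
> > >       // (1) Kryo serialization with Mahout's registrator
> > >       .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
> > >       .set("spark.kryo.registrator",
> > >            "org.apache.mahout.sparkbindings.io.MahoutKryoRegistrator")
> > >       // (2) Mahout jars added explicitly (paths illustrative)
> > >       .setJars(Seq(
> > >         "/home/you/mahout/mahout-math-0.11.1.jar",
> > >         "/home/you/mahout/mahout-math-scala_2.10-0.11.1.jar",
> > >         "/home/you/mahout/mahout-spark_2.10-0.11.1.jar"))
> > >
> > >     // sc2sdc wraps a plain SparkContext into a SparkDistributedContext
> > >     implicit val sdc: SparkDistributedContext = sc2sdc(new SparkContext(conf))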
> > >
> > > -d
> > >
> > > On Tue, Feb 2, 2016 at 8:22 AM, BahaaEddin AlAila <bahaelaila7@gmail.com>
> > > wrote:
> > >
> > > > Greetings Mahout users,
> > > >
> > > > I have been trying to use Mahout Samsara as a library with Scala/Spark,
> > > > but I haven't been successful in doing so.
> > > >
> > > > I am running the Spark 1.6.0 binaries; I didn't build Spark myself.
> > > > For Mahout, however, I tried both the readily available binaries on the
> > > > Apache mirrors and cloning and compiling Mahout's repo, but neither worked.
> > > >
> > > > I keep getting
> > > >
> > > > Exception in thread "main" java.lang.NoClassDefFoundError:
> > > > org/apache/mahout/sparkbindings/SparkDistributedContext
> > > >
> > > > The way I am doing things is:
> > > > I have Spark in ~/spark-1.6 and Mahout in ~/mahout.
> > > > I have set both $SPARK_HOME and $MAHOUT_HOME accordingly, along with
> > > > $MAHOUT_LOCAL=true.
> > > >
> > > > and I have:
> > > >
> > > > ~/app1/build.sbt
> > > > ~/app1/src/main/scala/App1.scala
> > > >
> > > > In build.sbt I have these lines to declare the Mahout dependencies:
> > > >
> > > > libraryDependencies += "org.apache.mahout" %% "mahout-math-scala" % "0.11.1"
> > > >
> > > > libraryDependencies += "org.apache.mahout" % "mahout-math" % "0.11.1"
> > > >
> > > > libraryDependencies += "org.apache.mahout" % "mahout-spark_2.10" % "0.11.1"
> > > >
> > > > along with the other Spark dependencies.
> > > >
> > > > In App1.scala, in the main function, I construct a context object using
> > > > mahoutSparkContext, and of course the sparkbindings are imported.
> > > >
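> > > > In sketch form, the relevant part looks roughly like this (paraphrased,
> > > > not the literal file):
> > > >
> > > >     import org.apache.mahout.sparkbindings._
> > > >
> > > >     object App1 {
> > > >       def main(args: Array[String]): Unit = {
> > > >         // mahoutSparkContext picks up the Mahout jars via $MAHOUT_HOME
> > > >         implicit val sdc = mahoutSparkContext(
> > > >           masterUrl = "local[*]", appName = "App1")
> > > >         // ... Samsara code here ...
> > > >       }
> > > >     }
> > > >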
> > > > Everything compiles successfully.
> > > >
> > > > However, when I submit to Spark, I get the above-mentioned error.
> > > >
> > > > I have a general idea of why this is happening: the compiled app1 jar
> > > > depends on the mahout-spark dependency jar, but cannot find it on the
> > > > class path upon being submitted to Spark.
> > > >
> > > > In the instructions I couldn't find how to explicitly add the
> > > > mahout-spark dependency jar to the class path.
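> > > >
> > > > (From reading the mahoutSparkContext implementation, I would guess its
> > > > customJars parameter is meant for this, e.g. something like the following
> > > > with an illustrative path, but I am not sure that is the intended way:)
> > > >
> > > >     // hypothetical: pass the Mahout jar explicitly when creating the context
> > > >     implicit val sdc = mahoutSparkContext(
> > > >       masterUrl = "local[*]", appName = "App1",
> > > >       customJars = Seq("/home/you/mahout/mahout-spark_2.10-0.11.1.jar"))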
> > > >
> > > > The question is: am I doing the configuration correctly or not?
> > > >
> > > > Sorry for the lengthy email.
> > > >
> > > > Kind Regards,
> > > > Bahaa
> > > >
> > >
> >
>
