mahout-user mailing list archives

From FRANCISCO XAVIER SUMBA TORAL <xavier.sumb...@ucuenca.ec>
Subject Re: Mahout out put to plot graph
Date Fri, 05 Feb 2016 15:04:11 GMT
Hello, 

Could you make your graphs in R? If so, could you explain which steps you followed?
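If you can get the cluster assignments out as a CSV, a minimal sketch in R might look like this (the file name and column names here are just guesses, adjust them to whatever your dump actually produces):

```r
# Sketch only: assumes a CSV with one row per point, numeric columns "x"
# and "y", and the assigned cluster id in a "cluster" column.
points <- read.csv("kmeans_output.csv")
plot(points$x, points$y, col = points$cluster, pch = 19,
     xlab = "x", ylab = "y", main = "Mahout k-means clusters")
```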

Cheers,
XS.

> On Feb 2, 2016, at 11:25, Suet Lam Felix CHUNG <slfchung@gmail.com> wrote:
> 
> I use Mahout to run various algorithms, e.g. k-means, and then I would like
> to use the results to plot a graph. I'm using R as my graph-plotting tool. I
> use seqdumper and export the format as graph_ml. However, the output file
> (GraphML) contains different results from the CSV output, and R also cannot
> plot it.
> 
> My question: is there any way to plot the Mahout results with R?
> On Feb 3, 2016 at 12:22 AM, "BahaaEddin AlAila" <bahaelaila7@gmail.com> wrote:
> 
>> Greetings mahout users,
>> 
>> I have been trying to use Mahout Samsara as a library with Scala/Spark, but
>> I haven't been successful in doing so.
>> 
>> I am running the Spark 1.6.0 binaries; I didn't build Spark myself.
>> I tried both the readily available binaries on the Apache mirrors and
>> cloning and compiling Mahout's repo, but neither worked.
>> 
>> I keep getting
>> 
>> Exception in thread "main" java.lang.NoClassDefFoundError:
>> org/apache/mahout/sparkbindings/SparkDistributedContext
>> 
>> The way I am doing things is:
>> I have spark in ~/spark-1.6
>> and mahout in ~/mahout
>> I have set both $SPARK_HOME and $MAHOUT_HOME accordingly, along with
>> $MAHOUT_LOCAL=true
>> 
>> and I have:
>> 
>> ~/app1/build.sbt
>> ~/app1/src/main/scala/App1.scala
>> 
>> in build.sbt I have these lines to declare the Mahout dependencies:
>> 
>> libraryDependencies += "org.apache.mahout" %% "mahout-math-scala" %
>> "0.11.1"
>> 
>> libraryDependencies += "org.apache.mahout" % "mahout-math" % "0.11.1"
>> 
>> libraryDependencies += "org.apache.mahout" % "mahout-spark_2.10" % "0.11.1"
>> 
>> along with other spark dependencies
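For reference, a self-contained build.sbt along those lines might look like the following. This is a sketch under assumptions: the Mahout coordinates and versions are as quoted above, the Scala and Spark versions are my guesses for a Spark 1.6.0 setup, and `"provided"` for spark-core is the usual choice when submitting via spark-submit:

```scala
// Sketch only: assumes Spark 1.6.0 on Scala 2.10.
scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  "org.apache.mahout" %% "mahout-math-scala" % "0.11.1",
  "org.apache.mahout"  % "mahout-math"       % "0.11.1",
  "org.apache.mahout"  % "mahout-spark_2.10" % "0.11.1",
  "org.apache.spark"  %% "spark-core"        % "1.6.0" % "provided"
)
```

One thing worth checking is that every `_2.10`-suffixed artifact (whether written explicitly or produced by `%%`) targets the same Scala binary version as the Spark build you submit to.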
>> 
>> and in App1.scala, in the main function, I construct a context object using
>> mahoutSparkContext; the sparkbindings are, of course, imported.
>> 
>> Everything compiles successfully.
>> 
>> However, when I submit to Spark, I get the above-mentioned error.
>> 
>> I have a general idea of why this is happening: the compiled app1 jar
>> depends on the mahout-spark jar, but that jar cannot be found on the
>> classpath when the app is submitted to Spark.
>> 
>> In the instructions I couldn't find how to explicitly add the mahout-spark
>> jar to the classpath.
>> 
>> The question is: am I doing the configuration correctly or not?
>> 
>> Sorry for the lengthy email
>> 
>> Kind Regards,
>> Bahaa
>> 
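On the NoClassDefFoundError in the quoted message: that usually means the Mahout jars are not on the classpath at submit time, even though sbt found them at compile time. One common fix, sketched here with assumed jar names and an assumed output path (adjust both to your build), is to pass the Mahout jars to spark-submit explicitly:

```shell
# Sketch only: jar names and paths are assumptions; adjust to your build.
# --jars ships the listed jars to the driver and executors.
spark-submit \
  --class App1 \
  --jars "$MAHOUT_HOME/mahout-math-0.11.1.jar,$MAHOUT_HOME/mahout-math-scala_2.10-0.11.1.jar,$MAHOUT_HOME/mahout-spark_2.10-0.11.1.jar" \
  target/scala-2.10/app1_2.10-0.1.jar
```

An alternative is to build a fat jar (e.g. with the sbt-assembly plugin) so the Mahout classes are bundled into app1's own jar and no extra flags are needed.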

