mahout-user mailing list archives

From Rodolfo Viana <rodolfodelimavi...@gmail.com>
Subject Re: Mahout 0.10 with Spark 1.1.1
Date Tue, 14 Jul 2015 18:55:41 GMT
I just realized that I have to export this variable, deriving it from
HADOOP_HOME.

Like this for example: export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop

After this modification everything works fine.
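Concretely, the fix looks like this (the HADOOP_HOME path below is a hypothetical install location; adjust it to your own setup):

```shell
# Hypothetical Hadoop install location -- adjust to your environment.
export HADOOP_HOME=/usr/local/hadoop
# Point clients at the Hadoop configuration directory so that hdfs:// URIs
# resolve against the configured namenode instead of the local file:// scheme.
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
echo "$HADOOP_CONF_DIR"
# → /usr/local/hadoop/etc/hadoop
```

With HADOOP_CONF_DIR set this way, the spark-itemsimilarity command can read hdfs:// paths directly, without touching the code.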

On Wed, Jul 8, 2015 at 7:10 AM, Kidong Lee <mykidong@gmail.com> wrote:

> I have had success submitting a Mahout Spark job in yarn-client mode like
> this:
>
> bin/mahout spark-itemsimilarity --input /input/part-000 --output \
>   /output --maxSimilaritiesPerItem 20 --master yarn-client \
>   --sparkExecutorMem 8g -D:spark.driver.memory=5g \
>   -D:spark.driver.maxResultSize=3g -D:spark.executor.instances=4 \
>   -D:spark.executor.cores=4 -D:spark.yarn.queue=spark-prod
>
>
> It worked fine for me.
>
>
> - Kidong.
>
>
>
> 2015-07-08 3:34 GMT+09:00 Rodolfo Viana <rodolfodelimaviana@gmail.com>:
>
> > Hi,
> >
> > I’m trying to run Mahout 0.10 with Spark 1.1.1, and so far I haven’t had
> > any success passing a file on HDFS. My actual problem is when I try to
> > run the example:
> >
> > bin/mahout spark-itemsimilarity --input hdfs://localhost:9000/input \
> >   --output hdfs://localhost:9000/output
> >
> > And I’m getting this error:
> > https://drive.google.com/file/d/0BwqKhM_BnSmgcUVzRm1odzhBQk0/view?usp=sharing
> >
> > I was googling and found this solution:
> >
> > Configuration configuration = new Configuration();
> > FileSystem hdfsFileSystem = FileSystem.get(new URI("hdfs://localhost:9000"), configuration);
> >
> > http://techidiocy.com/java-lang-illegalargumentexception-wrong-fs-expected-file/
> >
> > but I don’t want to modify the original code.
> >
> > Is there any way I can resolve this problem without having to modify the
> > code?
> >
> >
> >
> >
> > On Tue, Jul 7, 2015 at 3:28 PM, Dmitriy Lyubimov <dlieu.7@gmail.com>
> > wrote:
> >
> > > attachments are not showing up on apache lists.
> > >
> > >
> > > On Tue, Jul 7, 2015 at 10:30 AM, Rodolfo Viana <rodolfodelimaviana@gmail.com> wrote:
> > >
> > > > Hi,
> > > >
> > > > I’m trying to run Mahout 0.10 with Spark 1.1.1, and so far I haven’t
> > > > had any success passing a file on HDFS. My actual problem is when I
> > > > try to run the example:
> > > >
> > > > bin/mahout spark-itemsimilarity --input hdfs://localhost:9000/input \
> > > >   --output hdfs://localhost:9000/output
> > > >
> > > > And I’m getting this error: (attach)
> > > >
> > > >
> > > >
> > > > I was googling and found this solution:
> > > >
> > > > Configuration configuration = new Configuration();
> > > > FileSystem hdfsFileSystem = FileSystem.get(new URI("hdfs://localhost:9000"), configuration);
> > > >
> > > > http://techidiocy.com/java-lang-illegalargumentexception-wrong-fs-expected-file/
> > > >
> > > > but I don’t want to modify the original code.
> > > >
> > > > Is there any way I can resolve this problem without having to modify
> > > > the code?
> > > >
> > > > --
> > > > Rodolfo de Lima Viana
> > > > Undergraduate in Computer Science at UFCG
> > > >
> > > >
> > >
> >
> >
> >
> > --
> > Rodolfo de Lima Viana
> > Undergraduate in Computer Science at UFCG
> >
>



-- 
Rodolfo de Lima Viana
Undergraduate in Computer Science at UFCG
