spark-user mailing list archives

From 尹绪森 <yinxu...@gmail.com>
Subject Re: Cannot get Hadoop dependencies
Date Mon, 27 Jan 2014 14:20:25 GMT
http://www.scala-sbt.org/release/docs/Getting-Started/Library-Dependencies

This document might be useful. You should make sure that the package you specified has the right coordinates, and that the repo it lives in is added to your resolvers.
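As a concrete sketch of that advice: the resolution log below shows sbt trying only the local Ivy cache and Maven Central, and `org.apache.hadoop:hadoop-client:0.20.2` does not appear to exist in Central (for the 0.20.x line the published artifact was, as far as I can tell, `hadoop-core`; `hadoop-client` was only published for later Hadoop releases). A minimal build.sbt along these lines might work (the repository URL and version numbers are assumptions, worth verifying against the repository itself):

```scala
// build.sbt (sketch)

// Add an extra repository to the resolver chain, in case the artifact
// is not in Maven Central. (URL assumed; verify it hosts the artifact.)
resolvers += "Apache Releases" at
  "https://repository.apache.org/content/repositories/releases/"

// For Hadoop 0.20.x the artifact was named hadoop-core, not hadoop-client:
libraryDependencies += "org.apache.hadoop" % "hadoop-core" % "0.20.2"

// Alternatively, pick a Hadoop version for which hadoop-client
// was actually published, e.g. (version assumed):
// libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "1.0.4"
```

After changing build.sbt, run `reload` in the sbt shell (or restart sbt) before retrying `assembly`, so the new settings take effect.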
On 2014-1-27 9:24 PM, "Kal El" <pinu.datriciu@yahoo.com> wrote:

> I am having some trouble with Hadoop. I cannot build my project with sbt.
>
> According to the documentation, I added a line like this in my build.sbt
> file:
> "libraryDependencies += "org.apache.hadoop" % "hadoop-client" %
> "<your-hdfs-version>""
> my line being:
>
> libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "0.20.2"
>
> When I run assembly under sbt, I get the following error:
>
> > assembly
> [info] Updating {file:/home/spark2013/clusterWorkDirectory/GC/}gc...
> [info] Resolving org.apache.hadoop#hadoop-client;0.20.2 ...
> [warn]  module not found: org.apache.hadoop#hadoop-client;0.20.2
> [warn] ==== local: tried
> [warn]
> /home/spark2013/.ivy2/local/org.apache.hadoop/hadoop-client/0.20.2/ivys/ivy.xml
> [warn] ==== public: tried
> [warn]
> http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-client/0.20.2/hadoop-client-0.20.2.pom
> [info] Resolving org.fusesource.jansi#jansi;1.4 ...
> [warn]  ::::::::::::::::::::::::::::::::::::::::::::::
> [warn]  ::          UNRESOLVED DEPENDENCIES         ::
> [warn]  ::::::::::::::::::::::::::::::::::::::::::::::
> [warn]  :: org.apache.hadoop#hadoop-client;0.20.2: not found
> [warn]  ::::::::::::::::::::::::::::::::::::::::::::::
> [trace] Stack trace suppressed: run last *:update for the full output.
> [error] (*:update) sbt.ResolveException: unresolved dependency:
> org.apache.hadoop#hadoop-client;0.20.2: not found
> [error] Total time: 4 s, completed Jan 27, 2014 3:21:28 PM
>
> How can I fix this?
>
> Thanks
>
