spark-user mailing list archives

From Shrikar archak <shrika...@gmail.com>
Subject Re: Unable to run a Standalone job([NOT FOUND ] org.eclipse.jetty.orbit#javax.mail.glassfish;1.4.1.v201005082020)
Date Thu, 05 Jun 2014 15:05:50 GMT
Hi Prabeesh/ Sean,

I tried both of the steps you mentioned, but sbt still fails to resolve the
artifacts; the warnings are below.
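For reference, the steps suggested in the thread amount to roughly the
following (a sketch; the cache paths are the usual defaults, adjust if yours
differ):

```shell
# Clean the project build output first.
sbt clean
# Heavy-handed fallback: drop the ivy and sbt caches entirely so that
# all dependencies are re-resolved on the next build.
rm -rf ~/.ivy2 ~/.sbt
# Rebuild the project from scratch.
sbt package
```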

[warn] [NOT FOUND  ] org.eclipse.jetty.orbit#javax.transaction;1.1.1.v201105210645!javax.transaction.orbit (131ms)
[warn] ==== public: tried
[warn]   http://repo1.maven.org/maven2/org/eclipse/jetty/orbit/javax.transaction/1.1.1.v201105210645/javax.transaction-1.1.1.v201105210645.orbit
[warn] [NOT FOUND  ] org.eclipse.jetty.orbit#javax.servlet;3.0.0.v201112011016!javax.servlet.orbit (225ms)
[warn] ==== public: tried
[warn]   http://repo1.maven.org/maven2/org/eclipse/jetty/orbit/javax.servlet/3.0.0.v201112011016/javax.servlet-3.0.0.v201112011016.orbit
[warn] [NOT FOUND  ] org.eclipse.jetty.orbit#javax.mail.glassfish;1.4.1.v201005082020!javax.mail.glassfish.orbit (214ms)
[warn] ==== public: tried
[warn]   http://repo1.maven.org/maven2/org/eclipse/jetty/orbit/javax.mail.glassfish/1.4.1.v201005082020/javax.mail.glassfish-1.4.1.v201005082020.orbit
[warn] [NOT FOUND  ] org.eclipse.jetty.orbit#javax.activation;1.1.0.v201105071233!javax.activation.orbit (112ms)
[warn] ==== public: tried

Thanks,
Shrikar
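P.S. In case it helps anyone searching the archives later: the ".orbit"
suffix in these URLs comes from the Eclipse orbit packaging type, which ivy
tries to download as a file that does not exist in Maven Central. One
workaround I have seen reported for this class of failure (a sketch only,
not verified against this build) is to force plain jar artifacts for those
modules in simple.sbt:

```scala
// Workaround sketch (untested here): make ivy fetch plain jar artifacts
// for the Jetty orbit modules instead of the ".orbit" packaging it fails
// to download. Module names and revisions are copied from the [warn]
// output above.
libraryDependencies ++= Seq(
  "org.eclipse.jetty.orbit" % "javax.transaction" % "1.1.1.v201105210645"
    artifacts Artifact("javax.transaction", "jar", "jar"),
  "org.eclipse.jetty.orbit" % "javax.servlet" % "3.0.0.v201112011016"
    artifacts Artifact("javax.servlet", "jar", "jar"),
  "org.eclipse.jetty.orbit" % "javax.mail.glassfish" % "1.4.1.v201005082020"
    artifacts Artifact("javax.mail.glassfish", "jar", "jar"),
  "org.eclipse.jetty.orbit" % "javax.activation" % "1.1.0.v201105071233"
    artifacts Artifact("javax.activation", "jar", "jar")
)
```

If the SPARK-1949 fix Sean mentions gets merged, this should become
unnecessary.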


On Thu, Jun 5, 2014 at 1:27 AM, prabeesh k <prabsmails@gmail.com> wrote:

> Try running the sbt clean command before building the app.
>
> Or delete the .ivy2 and .sbt folders (not a good method), then try to
> rebuild the project.
>
>
> On Thu, Jun 5, 2014 at 11:45 AM, Sean Owen <sowen@cloudera.com> wrote:
>
>> I think this is SPARK-1949 again:
>> https://github.com/apache/spark/pull/906
>> I think this change fixed this issue for a few people using the SBT
>> build, worth committing?
>>
>> On Thu, Jun 5, 2014 at 6:40 AM, Shrikar archak <shrikar84@gmail.com>
>> wrote:
>> > Hi All,
>> > Now that Spark 1.0.0 is released, there should not be any problem with
>> > the local jars.
>> > Shrikars-MacBook-Pro:SimpleJob shrikar$ cat simple.sbt
>> > name := "Simple Project"
>> >
>> > version := "1.0"
>> >
>> > scalaVersion := "2.10.4"
>> >
>> > libraryDependencies ++= Seq(
>> >   "org.apache.spark" %% "spark-core" % "1.0.0",
>> >   "org.apache.spark" %% "spark-streaming" % "1.0.0")
>> >
>> > resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
>> >
>> > I am still having this issue
>> > [error] (run-main) java.lang.NoClassDefFoundError:
>> > javax/servlet/http/HttpServletResponse
>> > java.lang.NoClassDefFoundError: javax/servlet/http/HttpServletResponse
>> > at org.apache.spark.HttpServer.start(HttpServer.scala:54)
>> > at
>> >
>> org.apache.spark.broadcast.HttpBroadcast$.createServer(HttpBroadcast.scala:156)
>> > at
>> >
>> org.apache.spark.broadcast.HttpBroadcast$.initialize(HttpBroadcast.scala:127)
>> > at
>> >
>> org.apache.spark.broadcast.HttpBroadcastFactory.initialize(HttpBroadcastFactory.scala:31)
>> > at
>> >
>> org.apache.spark.broadcast.BroadcastManager.initialize(BroadcastManager.scala:48)
>> > at
>> >
>> org.apache.spark.broadcast.BroadcastManager.<init>(BroadcastManager.scala:35)
>> > at org.apache.spark.SparkEnv$.create(SparkEnv.scala:218)
>> > at org.apache.spark.SparkContext.<init>(SparkContext.scala:202)
>> >
>> > Any help would be greatly appreciated.
>> >
>> > Thanks,
>> > Shrikar
>> >
>> >
>> > On Fri, May 23, 2014 at 3:58 PM, Shrikar archak <shrikar84@gmail.com>
>> wrote:
>> >>
>> >> Still the same error no change
>> >>
>> >> Thanks,
>> >> Shrikar
>> >>
>> >>
>> >> On Fri, May 23, 2014 at 2:38 PM, Jacek Laskowski <jacek@japila.pl>
>> wrote:
>> >>>
>> >>> Hi Shrikar,
>> >>>
>> >>> How did you build Spark 1.0.0-SNAPSHOT on your machine? My
>> >>> understanding is that `sbt publishLocal` is not enough and you really
>> >>> need `sbt assembly` instead. Give it a try and report back.
>> >>>
>> >>> As to your build.sbt, upgrade Scala to 2.10.4 and depend only on
>> >>> "org.apache.spark" %% "spark-streaming" % "1.0.0-SNAPSHOT"; that will
>> >>> pull in spark-core as a transitive dep. The resolver for the Akka
>> >>> Repository is not needed. Your build.sbt should really look as follows:
>> >>>
>> >>> name := "Simple Project"
>> >>>
>> >>> version := "1.0"
>> >>>
>> >>> scalaVersion := "2.10.4"
>> >>>
>> >>> libraryDependencies += "org.apache.spark" %% "spark-streaming" %
>> >>> "1.0.0-SNAPSHOT"
>> >>>
>> >>> Jacek
>> >>>
>> >>> On Thu, May 22, 2014 at 11:27 PM, Shrikar archak <shrikar84@gmail.com>
>> >>> wrote:
>> >>> > Hi All,
>> >>> >
>> >>> > I am trying to run the network word count example as a separate
>> >>> > standalone job and am running into some issues.
>> >>> >
>> >>> > Environment:
>> >>> > 1) Mac Mavericks
>> >>> > 2) Latest spark repo from Github.
>> >>> >
>> >>> >
>> >>> > I have a structure like this
>> >>> >
>> >>> > Shrikars-MacBook-Pro:SimpleJob shrikar$ find .
>> >>> > .
>> >>> > ./simple.sbt
>> >>> > ./src
>> >>> > ./src/main
>> >>> > ./src/main/scala
>> >>> > ./src/main/scala/NetworkWordCount.scala
>> >>> > ./src/main/scala/SimpleApp.scala.bk
>> >>> >
>> >>> >
>> >>> > simple.sbt
>> >>> > name := "Simple Project"
>> >>> >
>> >>> > version := "1.0"
>> >>> >
>> >>> > scalaVersion := "2.10.3"
>> >>> >
>> >>> > libraryDependencies ++= Seq(
>> >>> >   "org.apache.spark" %% "spark-core" % "1.0.0-SNAPSHOT",
>> >>> >   "org.apache.spark" %% "spark-streaming" % "1.0.0-SNAPSHOT")
>> >>> >
>> >>> > resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
>> >>> >
>> >>> >
>> >>> > I am able to run the SimpleApp mentioned in the docs, but when I try
>> >>> > to run the NetworkWordCount app I get an error like the one below. Am
>> >>> > I missing something?
>> >>> >
>> >>> > [info] Running com.shrikar.sparkapps.NetworkWordCount
>> >>> > 14/05/22 14:26:47 INFO spark.SecurityManager: Changing view acls to:
>> >>> > shrikar
>> >>> > 14/05/22 14:26:47 INFO spark.SecurityManager: SecurityManager:
>> >>> > authentication disabled; ui acls disabled; users with view
>> >>> > permissions: Set(shrikar)
>> >>> > 14/05/22 14:26:48 INFO slf4j.Slf4jLogger: Slf4jLogger started
>> >>> > 14/05/22 14:26:48 INFO Remoting: Starting remoting
>> >>> > 14/05/22 14:26:48 INFO Remoting: Remoting started; listening on
>> >>> > addresses
>> >>> > :[akka.tcp://spark@192.168.10.88:49963]
>> >>> > 14/05/22 14:26:48 INFO Remoting: Remoting now listens on addresses:
>> >>> > [akka.tcp://spark@192.168.10.88:49963]
>> >>> > 14/05/22 14:26:48 INFO spark.SparkEnv: Registering MapOutputTracker
>> >>> > 14/05/22 14:26:48 INFO spark.SparkEnv: Registering BlockManagerMaster
>> >>> > 14/05/22 14:26:48 INFO storage.DiskBlockManager: Created local directory at
>> >>> > /var/folders/r2/mbj08pb55n5d_9p8588xk5b00000gn/T/spark-local-20140522142648-0a14
>> >>> > 14/05/22 14:26:48 INFO storage.MemoryStore: MemoryStore started with
>> >>> > capacity 911.6 MB.
>> >>> > 14/05/22 14:26:48 INFO network.ConnectionManager: Bound socket to port
>> >>> > 49964 with id = ConnectionManagerId(192.168.10.88,49964)
>> >>> > 14/05/22 14:26:48 INFO storage.BlockManagerMaster: Trying to register
>> >>> > BlockManager
>> >>> > 14/05/22 14:26:48 INFO storage.BlockManagerInfo: Registering block
>> >>> > manager
>> >>> > 192.168.10.88:49964 with 911.6 MB RAM
>> >>> > 14/05/22 14:26:48 INFO storage.BlockManagerMaster: Registered
>> >>> > BlockManager
>> >>> > 14/05/22 14:26:48 INFO spark.HttpServer: Starting HTTP Server
>> >>> > [error] (run-main) java.lang.NoClassDefFoundError: javax/servlet/http/HttpServletResponse
>> >>> > java.lang.NoClassDefFoundError: javax/servlet/http/HttpServletResponse
>> >>> > at org.apache.spark.HttpServer.start(HttpServer.scala:54)
>> >>> > at org.apache.spark.broadcast.HttpBroadcast$.createServer(HttpBroadcast.scala:156)
>> >>> > at org.apache.spark.broadcast.HttpBroadcast$.initialize(HttpBroadcast.scala:127)
>> >>> > at org.apache.spark.broadcast.HttpBroadcastFactory.initialize(HttpBroadcastFactory.scala:31)
>> >>> > at org.apache.spark.broadcast.BroadcastManager.initialize(BroadcastManager.scala:48)
>> >>> > at org.apache.spark.broadcast.BroadcastManager.<init>(BroadcastManager.scala:35)
>> >>> > at org.apache.spark.SparkEnv$.create(SparkEnv.scala:218)
>> >>> > at org.apache.spark.SparkContext.<init>(SparkContext.scala:202)
>> >>> > at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:549)
>> >>> > at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:561)
>> >>> > at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:91)
>> >>> > at com.shrikar.sparkapps.NetworkWordCount$.main(NetworkWordCount.scala:39)
>> >>> > at com.shrikar.sparkapps.NetworkWordCount.main(NetworkWordCount.scala)
>> >>> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >>> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >>> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >>> > at java.lang.reflect.Method.invoke(Method.java:597)
>> >>> >
>> >>> >
>> >>> > Thanks,
>> >>> > Shrikar
>> >>> >
>> >>>
>> >>>
>> >>>
>> >>> --
>> >>> Jacek Laskowski | http://blog.japila.pl
>> >>> "Never discourage anyone who continually makes progress, no matter how
>> >>> slow." Plato
>> >>
>> >>
>> >
>>
>
>
