spark-user mailing list archives

From Alex Minnaar <aminn...@verticalscope.com>
Subject RE: Example standalone app error!
Date Fri, 01 Aug 2014 18:01:49 GMT
I think this is the problem.  I was working in a project that inherited some other Akka dependencies
(of a different version).  I'm switching to a fresh project, which should solve the problem.
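(For anyone hitting the same thing: one quick way to confirm that a second Akka version is on the classpath, assuming the sbt-dependency-graph plugin has been added to the build, is to grep the dependency tree:)

```shell
# Illustrative only: requires the sbt-dependency-graph plugin in
# project/plugins.sbt. Lists every module that pulls in akka-actor,
# making a version conflict visible at a glance.
sbt dependencyTree | grep akka-actor
```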

Thanks,

Alex
________________________________________
From: Tathagata Das <tathagata.das1565@gmail.com>
Sent: Thursday, July 31, 2014 8:36 PM
To: user@spark.apache.org
Subject: Re: Example standalone app error!

When are you guys getting the error? When the SparkContext is created,
or when it is being shut down?
If this error is thrown when the SparkContext is created, then one
possible reason may be conflicting versions of Akka. Spark depends on a
version of Akka that is different from the one that ships with Scala, so
launching a Spark app with the scala command (instead of java) can
cause issues.
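As a sketch of the workaround (the "com.example" coordinates and version numbers below are placeholders, not real libraries), the conflicting transitive Akka can be excluded in build.sbt so that only the Akka version Spark was built against ends up on the classpath:

```scala
// build.sbt sketch -- assumes sbt 0.13 and Scala 2.10; the
// "com.example" dependency stands in for any library that drags
// in its own (incompatible) Akka version.
scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  // Let Spark supply the Akka version it was built against.
  "org.apache.spark" %% "spark-streaming" % "1.0.1",
  // Strip the transitive Akka from the other dependency so two
  // versions of akka-actor never coexist on the classpath.
  ("com.example" %% "other-lib" % "0.1")
    .exclude("com.typesafe.akka", "akka-actor_2.10")
)
```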

TD

On Thu, Jul 31, 2014 at 6:30 AM, Alex Minnaar
<aminnaar@verticalscope.com> wrote:
> I am eager to solve this problem.  So if anyone has any suggestions, I would
> be glad to hear them.
>
>
> Thanks,
>
>
> Alex
>
> ________________________________
> From: Andrew Or <andrew@databricks.com>
> Sent: Tuesday, July 29, 2014 4:53 PM
> To: user@spark.apache.org
> Subject: Re: Example standalone app error!
>
> Hi Alex,
>
> Very strange. This error occurs when someone tries to call an abstract
> method. I have run into this before and resolved it with an SBT clean
> followed by an assembly, so maybe you could give that a try.
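A minimal sketch of that sequence, assuming an sbt build with the sbt-assembly plugin configured (the jar path below is illustrative):

```shell
# Wipe stale class files from target/, then rebuild the fat jar,
# so no leftovers from a previous dependency set linger.
sbt clean assembly
# Run it with java rather than the scala launcher, so Scala's own
# bundled Akka jars never shadow the ones Spark was built against.
java -cp target/scala-2.10/myapp-assembly-1.0.jar \
  com.verticalscope.nlp.topics.SparkGensimLDA
```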
>
> Let me know if that fixes it,
> Andrew
>
>
> 2014-07-29 13:01 GMT-07:00 Alex Minnaar <aminnaar@verticalscope.com>:
>>
>> I am trying to run an example Spark standalone app with the following code
>>
>> import org.apache.spark.streaming._
>> import org.apache.spark.streaming.StreamingContext._
>>
>> object SparkGensimLDA extends App {
>>
>>   val ssc = new StreamingContext("local", "testApp", Seconds(5))
>>
>>   val lines = ssc.textFileStream("/.../spark_example/")
>>
>>   val words = lines.flatMap(_.split(" "))
>>
>>   val wordCounts = words.map(x => (x, 1)).reduceByKey(_ + _)
>>
>>   wordCounts.print()
>>
>>   ssc.start()
>>   ssc.awaitTermination()
>>
>> }
>>
>>
>> However, I am getting the following error:
>>
>>
>> 15:35:40.170 [spark-akka.actor.default-dispatcher-2] ERROR
>> akka.actor.ActorSystemImpl - Uncaught fatal error from thread
>> [spark-akka.actor.default-dispatcher-3] shutting down ActorSystem [spark]
>> java.lang.AbstractMethodError: null
>> at akka.actor.ActorCell.create(ActorCell.scala:580)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:456)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at akka.dispatch.Mailbox.run(Mailbox.scala:219)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at
>> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
>> [akka-actor_2.10-2.3.2.jar:na]
>> at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>> [scala-library-2.10.4.jar:na]
>> at
>> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>> [scala-library-2.10.4.jar:na]
>> at
>> scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>> [scala-library-2.10.4.jar:na]
>> at
>> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
>> [scala-library-2.10.4.jar:na]
>> 15:35:40.171 [spark-akka.actor.default-dispatcher-2] ERROR
>> akka.actor.ActorSystemImpl - Uncaught fatal error from thread
>> [spark-akka.actor.default-dispatcher-4] shutting down ActorSystem [spark]
>> java.lang.AbstractMethodError: null
>> at akka.actor.ActorCell.create(ActorCell.scala:580)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:456)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at akka.dispatch.Mailbox.run(Mailbox.scala:219)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at
>> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
>> [akka-actor_2.10-2.3.2.jar:na]
>> at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>> [scala-library-2.10.4.jar:na]
>> at
>> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>> [scala-library-2.10.4.jar:na]
>> at
>> scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>> [scala-library-2.10.4.jar:na]
>> at
>> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
>> [scala-library-2.10.4.jar:na]
>> 15:35:40.175 [main] DEBUG o.a.spark.storage.DiskBlockManager - Creating
>> local directories at root dirs
>> '/var/folders/6y/h1f088_j007_d11kpwb1jg6m0000gp/T/'
>> 15:35:40.176 [spark-akka.actor.default-dispatcher-4] ERROR
>> akka.actor.ActorSystemImpl - Uncaught fatal error from thread
>> [spark-akka.actor.default-dispatcher-2] shutting down ActorSystem [spark]
>> java.lang.AbstractMethodError:
>> org.apache.spark.storage.BlockManagerMasterActor.aroundPostStop()V
>> at
>> akka.actor.dungeon.FaultHandling$class.akka$actor$dungeon$FaultHandling$$finishTerminate(FaultHandling.scala:210)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at
>> akka.actor.dungeon.FaultHandling$class.terminate(FaultHandling.scala:172)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at akka.actor.ActorCell.terminate(ActorCell.scala:369)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:462)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at akka.dispatch.Mailbox.run(Mailbox.scala:219)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at
>> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
>> [akka-actor_2.10-2.3.2.jar:na]
>> at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>> [scala-library-2.10.4.jar:na]
>> at
>> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>> [scala-library-2.10.4.jar:na]
>> at
>> scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>> [scala-library-2.10.4.jar:na]
>> at
>> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
>> [scala-library-2.10.4.jar:na]
>> 15:35:40.177 [spark-akka.actor.default-dispatcher-4] ERROR
>> akka.actor.ActorSystemImpl - Uncaught fatal error from thread
>> [spark-akka.actor.default-dispatcher-4] shutting down ActorSystem [spark]
>> java.lang.AbstractMethodError:
>> org.apache.spark.MapOutputTrackerMasterActor.aroundPostStop()V
>> at
>> akka.actor.dungeon.FaultHandling$class.akka$actor$dungeon$FaultHandling$$finishTerminate(FaultHandling.scala:210)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at
>> akka.actor.dungeon.FaultHandling$class.terminate(FaultHandling.scala:172)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at akka.actor.ActorCell.terminate(ActorCell.scala:369)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:462)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at akka.dispatch.Mailbox.run(Mailbox.scala:219)
>> ~[akka-actor_2.10-2.3.2.jar:na]
>> at
>> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
>> [akka-actor_2.10-2.3.2.jar:na]
>> at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>> [scala-library-2.10.4.jar:na]
>> at
>> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>> [scala-library-2.10.4.jar:na]
>> at
>> scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>> [scala-library-2.10.4.jar:na]
>> at
>> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
>> [scala-library-2.10.4.jar:na]
>> 15:35:40.177 [main] INFO  o.a.spark.storage.DiskBlockManager - Created
>> local directory at
>> /var/folders/6y/h1f088_j007_d11kpwb1jg6m0000gp/T/spark-local-20140728153540-2b99
>> 15:35:40.180 [spark-akka.actor.default-dispatcher-4] INFO
>> a.r.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
>> 15:35:40.181 [main] INFO  org.apache.spark.storage.MemoryStore -
>> MemoryStore started with capacity 2.1 GB.
>> 15:35:40.182 [spark-akka.actor.default-dispatcher-4] INFO
>> a.r.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down;
>> proceeding with flushing remote transports.
>> 15:35:40.208 [main] INFO  o.a.spark.network.ConnectionManager - Bound
>> socket to port 62355 with id = ConnectionManagerId(10.10.6.5,62355)
>> 15:35:40.209 [spark-akka.actor.default-dispatcher-4] INFO
>> a.r.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
>> Exception in thread "main" java.lang.IllegalStateException: cannot create
>> children while terminating or terminated
>> at akka.actor.dungeon.Children$class.makeChild(Children.scala:200)
>> at akka.actor.dungeon.Children$class.attachChild(Children.scala:42)
>> at akka.actor.ActorCell.attachChild(ActorCell.scala:369)
>> at akka.actor.ActorSystemImpl.actorOf(ActorSystem.scala:552)
>> at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:104)
>> at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:145)
>> at org.apache.spark.SparkEnv$.create(SparkEnv.scala:213)
>> at org.apache.spark.SparkContext.<init>(SparkContext.scala:202)
>> at
>> org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:549)
>> at
>> org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:561)
>> at
>> org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:91)
>> at
>> com.verticalscope.nlp.topics.SparkGensimLDA$delayedInit$body.apply(SparkGensimLDA.scala:14)
>> at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
>> at
>> scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
>> at scala.App$$anonfun$main$1.apply(App.scala:71)
>> at scala.App$$anonfun$main$1.apply(App.scala:71)
>> at scala.collection.immutable.List.foreach(List.scala:318)
>> at
>> scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
>> at scala.App$class.main(App.scala:71)
>> at
>> com.verticalscope.nlp.topics.SparkGensimLDA$.main(SparkGensimLDA.scala:12)
>> at com.verticalscope.nlp.topics.SparkGensimLDA.main(SparkGensimLDA.scala)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:606)
>> at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
>> 15:35:40.213 [delete Spark local dirs] DEBUG
>> o.a.spark.storage.DiskBlockManager - Shutdown hook called
>>
>> Process finished with exit code 1
>>
>>
>> I did a search for java.lang.AbstractMethodError occurring with Spark, but
>> I could not find a solution.  Could someone identify what the problem is
>> here? I am using Spark version 1.0.1.
>>
>> Thanks,
>>
>> Alex
>>
>>
>
