spark-issues mailing list archives

From "Hyukjin Kwon (Jira)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-32027) EventLoggingListener threw java.util.ConcurrentModificationException
Date Wed, 01 Jul 2020 10:56:00 GMT

    [ https://issues.apache.org/jira/browse/SPARK-32027?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17149321#comment-17149321 ]

Hyukjin Kwon commented on SPARK-32027:
--------------------------------------

[~yumwang] could you add more details if you're not going to submit a fix?

> EventLoggingListener threw java.util.ConcurrentModificationException
> ---------------------------------------------------------------------
>
>                 Key: SPARK-32027
>                 URL: https://issues.apache.org/jira/browse/SPARK-32027
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.1.0
>            Reporter: Yuming Wang
>            Priority: Major
>
> {noformat}
> 20/06/18 20:22:25 ERROR AsyncEventQueue: Listener EventLoggingListener threw an exception
> java.util.ConcurrentModificationException
> 	at java.util.Hashtable$Enumerator.next(Hashtable.java:1387)
> 	at scala.collection.convert.Wrappers$JPropertiesWrapper$$anon$6.next(Wrappers.scala:424)
> 	at scala.collection.convert.Wrappers$JPropertiesWrapper$$anon$6.next(Wrappers.scala:420)
> 	at scala.collection.Iterator.foreach(Iterator.scala:941)
> 	at scala.collection.Iterator.foreach$(Iterator.scala:941)
> 	at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
> 	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
> 	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
> 	at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
> 	at scala.collection.TraversableLike.map(TraversableLike.scala:238)
> 	at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
> 	at scala.collection.AbstractTraversable.map(Traversable.scala:108)
> 	at org.apache.spark.util.JsonProtocol$.mapToJson(JsonProtocol.scala:568)
> 	at org.apache.spark.util.JsonProtocol$.$anonfun$propertiesToJson$1(JsonProtocol.scala:574)
> 	at scala.Option.map(Option.scala:230)
> 	at org.apache.spark.util.JsonProtocol$.propertiesToJson(JsonProtocol.scala:573)
> 	at org.apache.spark.util.JsonProtocol$.jobStartToJson(JsonProtocol.scala:159)
> 	at org.apache.spark.util.JsonProtocol$.sparkEventToJson(JsonProtocol.scala:81)
> 	at org.apache.spark.scheduler.EventLoggingListener.logEvent(EventLoggingListener.scala:97)
> 	at org.apache.spark.scheduler.EventLoggingListener.onJobStart(EventLoggingListener.scala:159)
> 	at org.apache.spark.scheduler.SparkListenerBus.doPostEvent(SparkListenerBus.scala:37)
> 	at org.apache.spark.scheduler.SparkListenerBus.doPostEvent$(SparkListenerBus.scala:28)
> 	at org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37)
> 	at org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37)
> 	at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:115)
> 	at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:99)
> 	at org.apache.spark.scheduler.AsyncEventQueue.super$postToAll(AsyncEventQueue.scala:105)
> 	at org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:105)
> 	at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
> 	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
> 	at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:100)
> 	at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:96)
> 	at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1319)
> 	at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:96)
> 20/06/18 20:22:25 ERROR AsyncEventQueue: Listener EventLoggingListener threw an exception
> java.util.ConcurrentModificationException
> 	[identical stack trace repeated]
> {noformat}
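
On Java 8, which the Hashtable$Enumerator frames above indicate, java.util.Properties is backed directly by java.util.Hashtable, whose entrySet iterator is fail-fast: a structural modification during iteration makes the next next() call throw ConcurrentModificationException. The sketch below reproduces that failure mode deterministically in a single thread; the class and key names are illustrative, and in Spark the mutation would come from a different thread (e.g. the application calling setLocalProperty) while EventLoggingListener serializes the job's properties:

```java
import java.util.ConcurrentModificationException;
import java.util.Hashtable;
import java.util.Iterator;
import java.util.Map;

public class CmeDemo {
    // Mutating a Hashtable while iterating its entrySet() trips the
    // fail-fast modCount check, exactly as in the trace above.
    static boolean triggersCme() {
        Hashtable<String, String> table = new Hashtable<>();
        table.put("spark.jobGroup.id", "g1");
        table.put("spark.scheduler.pool", "default");
        try {
            Iterator<Map.Entry<String, String>> it = table.entrySet().iterator();
            while (it.hasNext()) {
                it.next();
                // Adding a new key is a structural modification mid-iteration.
                table.put("extra.key", "v");
            }
            return false;
        } catch (ConcurrentModificationException e) {
            return true;
        }
    }

    public static void main(String[] args) {
        System.out.println("caught CME: " + triggersCme());
    }
}
```

Note that on JDK 9 and later, Properties itself is backed by an internal ConcurrentHashMap and no longer fails this way, which is why the raw Hashtable is used here to match the Java 8 behavior in the trace.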


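The trace shows JsonProtocol iterating the job's Properties while some other thread mutates it. One possible mitigation (an assumption for illustration, not necessarily the project's actual fix) is to take a defensive snapshot before serialization; Hashtable's methods, including clone(), synchronize on the instance, so the copy observes a consistent state even under concurrent setProperty calls. The helper name and call site below are hypothetical:

```java
import java.util.Properties;

public class PropsSnapshot {
    // Hypothetical helper: copy a live, shared Properties before iterating it.
    // Properties.clone() is synchronized on the source, so the snapshot is
    // internally consistent; iterating the snapshot can no longer race with
    // writers to the original.
    static Properties snapshot(Properties live) {
        return (Properties) live.clone();
    }

    public static void main(String[] args) {
        Properties live = new Properties();
        live.setProperty("spark.jobGroup.id", "g1");
        Properties copy = snapshot(live);
        live.setProperty("spark.scheduler.pool", "fair"); // later mutation
        System.out.println(copy.size()); // prints 1: the snapshot is unaffected
    }
}
```

A listener would then serialize the snapshot instead of the shared instance; the copy cost is small because job properties are typically a handful of entries.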

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

