spark-issues mailing list archives

From "Paul Praet (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-24502) flaky test: UnsafeRowSerializerSuite
Date Thu, 09 Aug 2018 07:33:00 GMT

    [ https://issues.apache.org/jira/browse/SPARK-24502?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16574410#comment-16574410
] 

Paul Praet commented on SPARK-24502:
------------------------------------

I find it hard to believe the pull requests above were accepted rather than tackling the root cause,
which is the apparent resource leak when closing a Spark session.

We are upgrading from Spark 2.2.1 to 2.3.1 and we find that our tests have become flaky as well
(when run consecutively). Even in our production code we create and close
multiple Spark sessions.

I would prefer not to pollute our code with those boilerplate statements.
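To make the failure mode concrete: the stack trace below shows a lazily-built SparkSession trying to register with a LiveListenerBus that an earlier test already stopped. The following is a minimal, self-contained sketch of that use-after-stop pattern — it is illustrative plain Scala, not actual Spark code; the class and method names (`ListenerBus`, `Session`, `sharedState`) merely mirror the ones in the trace.

```scala
// Illustrative sketch only (not Spark source): a bus that rejects work once
// stopped, and a session whose shared state is initialized lazily -- so it
// may bind to a bus that was stopped in the meantime.
class ListenerBus {
  @volatile private var stopped = false

  def addToQueue(queue: String): Unit = {
    // Mirrors LiveListenerBus.addToQueue's check in the stack trace.
    if (stopped) throw new IllegalStateException("LiveListenerBus is stopped.")
  }

  def stop(): Unit = { stopped = true }
}

class Session(bus: ListenerBus) {
  // Mirrors SparkSession.sharedState: nothing touches the bus until the
  // first access, which may happen after the bus has been stopped.
  lazy val sharedState: String = {
    bus.addToQueue("appStatus")
    "initialized"
  }
}

object Repro {
  def main(args: Array[String]): Unit = {
    val bus = new ListenerBus
    val session = new Session(bus)

    bus.stop()          // e.g. a previous test tears down the shared context

    try {
      session.sharedState  // the next test touches the stale session: boom
    } catch {
      case e: IllegalStateException => println(s"Caught: ${e.getMessage}")
    }
  }
}
```

In this sketch, any fix applied at the call site (retrying, or rebuilding the session before each test) treats the symptom; the leak is that the stale session object survives the stop of the bus it depends on.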

> flaky test: UnsafeRowSerializerSuite
> ------------------------------------
>
>                 Key: SPARK-24502
>                 URL: https://issues.apache.org/jira/browse/SPARK-24502
>             Project: Spark
>          Issue Type: Test
>          Components: SQL
>    Affects Versions: 2.4.0
>            Reporter: Wenchen Fan
>            Assignee: Wenchen Fan
>            Priority: Major
>              Labels: flaky-test
>             Fix For: 2.3.2, 2.4.0
>
>
> https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/4193/testReport/org.apache.spark.sql.execution/UnsafeRowSerializerSuite/toUnsafeRow___test_helper_method/
> {code}
> sbt.ForkMain$ForkError: java.lang.IllegalStateException: LiveListenerBus is stopped.
> 	at org.apache.spark.scheduler.LiveListenerBus.addToQueue(LiveListenerBus.scala:97)
> 	at org.apache.spark.scheduler.LiveListenerBus.addToStatusQueue(LiveListenerBus.scala:80)
> 	at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:93)
> 	at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:120)
> 	at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:120)
> 	at scala.Option.getOrElse(Option.scala:121)
> 	at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:120)
> 	at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:119)
> 	at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:286)
> 	at org.apache.spark.sql.test.TestSparkSession.sessionState$lzycompute(TestSQLContext.scala:42)
> 	at org.apache.spark.sql.test.TestSparkSession.sessionState(TestSQLContext.scala:41)
> 	at org.apache.spark.sql.SparkSession$$anonfun$1$$anonfun$apply$1.apply(SparkSession.scala:95)
> 	at org.apache.spark.sql.SparkSession$$anonfun$1$$anonfun$apply$1.apply(SparkSession.scala:95)
> 	at scala.Option.map(Option.scala:146)
> 	at org.apache.spark.sql.SparkSession$$anonfun$1.apply(SparkSession.scala:95)
> 	at org.apache.spark.sql.SparkSession$$anonfun$1.apply(SparkSession.scala:94)
> 	at org.apache.spark.sql.internal.SQLConf$.get(SQLConf.scala:126)
> 	at org.apache.spark.sql.catalyst.expressions.CodeGeneratorWithInterpretedFallback.createObject(CodeGeneratorWithInterpretedFallback.scala:54)
> 	at org.apache.spark.sql.catalyst.expressions.UnsafeProjection$.create(Projection.scala:157)
> 	at org.apache.spark.sql.catalyst.expressions.UnsafeProjection$.create(Projection.scala:150)
> 	at org.apache.spark.sql.execution.UnsafeRowSerializerSuite.org$apache$spark$sql$execution$UnsafeRowSerializerSuite$$unsafeRowConverter(UnsafeRowSerializerSuite.scala:54)
> 	at org.apache.spark.sql.execution.UnsafeRowSerializerSuite.org$apache$spark$sql$execution$UnsafeRowSerializerSuite$$toUnsafeRow(UnsafeRowSerializerSuite.scala:49)
> 	at org.apache.spark.sql.execution.UnsafeRowSerializerSuite$$anonfun$2.apply(UnsafeRowSerializerSuite.scala:63)
> 	at org.apache.spark.sql.execution.UnsafeRowSerializerSuite$$anonfun$2.apply(UnsafeRowSerializerSuite.scala:60)
> ...
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

