spark-dev mailing list archives

From Stephen Coy <>
Subject Re: Unit test failure in spark-core
Date Tue, 13 Oct 2020 01:50:02 GMT
Sorry, I forgot:

[scoy@Steves-Core-i9-2 core]$ java -version
openjdk version "1.8.0_262"
OpenJDK Runtime Environment (AdoptOpenJDK)(build 1.8.0_262-b10)
OpenJDK 64-Bit Server VM (AdoptOpenJDK)(build 25.262-b10, mixed mode)

which is on macOS 10.15.7

On 13 Oct 2020, at 12:47 pm, Stephen Coy wrote:

Hi all,

When trying to build current master with a simple:

mvn clean install

I get a consistent unit test failure in core:

[ERROR] Tests run: 6, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 5.403 s <<<
FAILURE! - in org.apache.spark.launcher.SparkLauncherSuite
[ERROR] testSparkLauncherGetError(org.apache.spark.launcher.SparkLauncherSuite)  Time elapsed:
2.015 s  <<< FAILURE!
at org.apache.spark.launcher.SparkLauncherSuite.testSparkLauncherGetError(

I believe the applicable messages from the unit-tests.log file are:

20/10/13 12:20:35.875 spark-app-1: '<unknown>' WARN InProcessAppHandle: Application
failed with exception.
org.apache.spark.SparkException: Failed to get main class in JAR with error 'File spark-internal
does not exist'.  Please specify one with --class.
        at org.apache.spark.deploy.SparkSubmit.error(SparkSubmit.scala:942)
        at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:457)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.InProcessSparkSubmit$.main(SparkSubmit.scala:954)
        at org.apache.spark.deploy.InProcessSparkSubmit.main(SparkSubmit.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(
        at java.lang.reflect.Method.invoke(
        at org.apache.spark.launcher.InProcessAppHandle.lambda$start$0(

org.apache.spark.launcher.SparkLauncherSuite#testSparkLauncherGetError is the failing test,
so I improved the failing assertion by changing it to:

      assertThat(handle.getError().get().getMessage(), containsString(EXCEPTION_MESSAGE));

This yields:

[ERROR] Tests run: 6, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 7.155 s <<<
FAILURE! - in org.apache.spark.launcher.SparkLauncherSuite
[ERROR] testSparkLauncherGetError(org.apache.spark.launcher.SparkLauncherSuite)  Time elapsed:
2.02 s  <<< FAILURE!

Expected: a string containing "dummy-exception"
     but: was "Error: Failed to load class org.apache.spark.launcher.SparkLauncherSuite$ErrorInProcessTestApp."
at org.apache.spark.launcher.SparkLauncherSuite.testSparkLauncherGetError(

This loosely correlates with the error in unit-tests.log.
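As an aside, the value of that assertion style can be sketched without the real test harness. The assertContains helper below is a hypothetical stand-in for Hamcrest's assertThat(..., containsString(...)); it shows why the matcher form echoes the actual message (which is what exposed the "Failed to load class" error above), where a plain boolean assertTrue would only report true/false:

```java
// Minimal sketch (not the actual Spark test code): a containsString-style
// assertion that reports the actual string on failure, mimicking
// Hamcrest's assertThat(message, containsString(expected)).
public class AssertSketch {

    // Hypothetical stand-in for assertThat(..., containsString(...)).
    static void assertContains(String actual, String expected) {
        if (actual == null || !actual.contains(expected)) {
            throw new AssertionError(
                "Expected: a string containing \"" + expected + "\"\n" +
                "     but: was \"" + actual + "\"");
        }
    }

    public static void main(String[] args) {
        // Simulate the failure seen above: the launcher returned a
        // class-loading error instead of the expected dummy exception.
        String actual = "Error: Failed to load class Foo.";
        try {
            assertContains(actual, "dummy-exception");
        } catch (AssertionError e) {
            // Unlike a bare assertTrue, the failure message carries the
            // actual string, which is the diagnostic we needed.
            System.out.println(e.getMessage());
        }
    }
}
```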

Any ideas?


Steve C

