spark-dev mailing list archives

From Rob Vesse <>
Subject Re: Spark build can't find javac
Date Tue, 30 Apr 2019 08:54:34 GMT
I have seen issues with some versions of the Scala Maven plugin auto-detecting the wrong JAVA_HOME
when both a JRE and a JDK are present on the system. Setting JAVA_HOME explicitly to a JDK
skips the plugin's auto-detect logic and avoids the problem.
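Concretely, that means exporting JAVA_HOME before invoking the build. A minimal sketch; the /usr/lib/jvm/java-8-openjdk path is an assumption based on the Arch/Manjaro jdk8-openjdk package layout, so adjust it for your system:

```shell
# Point JAVA_HOME at the JDK root itself, not the nested jre/ directory
# (the jre/ subdirectory has no javac). Path is an assumed Manjaro layout.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk
# Put the JDK's tools first on PATH so the build picks up the matching javac.
export PATH="$JAVA_HOME/bin:$PATH"
```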


This may be related.



From: Sean Owen <>
Date: Tuesday, 30 April 2019 at 00:18
To: Shmuel Blitz <>
Cc: dev <>
Subject: Re: Spark build can't find javac


Your JAVA_HOME is pointing to a JRE rather than a JDK installation. Or you've actually installed
the JRE. Only the JDK has javac, etc.
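The distinction is easy to check from the shell: a JDK home contains bin/javac, while a JRE home does not. A hedged sketch below; `is_jdk` is a hypothetical helper name, not part of any Spark or Maven tooling:

```shell
# is_jdk: true if the given directory looks like a JDK home, i.e. it
# contains an executable bin/javac. A plain JRE only ships bin/java.
is_jdk() {
  [ -x "$1/bin/javac" ]
}

# Check whatever JAVA_HOME currently points at (falling back to the
# Arch/Manjaro default symlink, which is an assumed path).
if is_jdk "${JAVA_HOME:-/usr/lib/jvm/default}"; then
  echo "JAVA_HOME looks like a JDK"
else
  echo "JAVA_HOME looks like a JRE (no bin/javac)"
fi
```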


On Mon, Apr 29, 2019 at 4:36 PM Shmuel Blitz <> wrote:



I'm trying to build Spark on Manjaro with OpenJDK 1.8.0_212, and I'm getting the following error:


Cannot run program "/usr/lib/jvm/java-8-openjdk/jre/bin/javac": error=2, No such file or directory

> which javac



Only when I set JAVA_HOME as follows do I get it to run:

> export JAVA_HOME=/usr/lib/jvm/default



Any idea what the issue is?


Shmuel Blitz 
Data Analysis Team Leader 
