spark-dev mailing list archives

From Rob Vesse <rve...@dotnetrdf.org>
Subject Re: Spark build can't find javac
Date Tue, 30 Apr 2019 08:54:34 GMT
I have seen issues with some versions of the Scala Maven plugin auto-detecting the wrong JAVA_HOME
when both a JRE and a JDK are present on the system.  Setting JAVA_HOME explicitly to a JDK
skips the plugin's auto-detect logic and avoids the problem.
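As a quick sanity check before starting the build, you can confirm that a candidate JAVA_HOME actually contains the JDK compiler. This is a minimal sketch; the `has_javac` helper and the `/usr/lib/jvm/default` path (taken from the report below) are illustrative, not part of the plugin:

```shell
# Sketch: verify a candidate JAVA_HOME is a full JDK (has javac), not a JRE.
# has_javac is a hypothetical helper; the path is an example, adjust for your system.
has_javac() {
  [ -x "$1/bin/javac" ]
}

candidate=/usr/lib/jvm/default
if has_javac "$candidate"; then
  export JAVA_HOME="$candidate"
  echo "Using JDK at $JAVA_HOME"
else
  echo "$candidate has no javac; point JAVA_HOME at a full JDK" >&2
fi
```

With JAVA_HOME set this way, the plugin's auto-detection never runs, so it cannot pick up the JRE path.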

 

This may be related - https://github.com/davidB/scala-maven-plugin/pull/227 and https://github.com/davidB/scala-maven-plugin/issues/221


Rob

 

From: Sean Owen <srowen@gmail.com>
Date: Tuesday, 30 April 2019 at 00:18
To: Shmuel Blitz <shmuel.blitz@similarweb.com>
Cc: dev <dev@spark.apache.org>
Subject: Re: Spark build can't find javac

 

Your JAVA_HOME is pointing to a JRE rather than a JDK installation. Or you've actually installed
only the JRE. Only the JDK includes javac and the other development tools.
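One way to find a suitable JAVA_HOME from the javac that is already on the PATH is to strip the trailing `/bin/javac` from its resolved path. A minimal sketch; `jdk_home_from_javac` is a hypothetical helper and the path shown is just the one from the report:

```shell
# Sketch: derive a JAVA_HOME candidate from a javac path by stripping the
# trailing /bin/javac component. Pure string handling, no filesystem access.
jdk_home_from_javac() {
  printf '%s\n' "${1%/bin/javac}"
}

# e.g. with the path from the reporter's Manjaro system:
jdk_home_from_javac /usr/lib/jvm/java-8-openjdk/bin/javac
# prints /usr/lib/jvm/java-8-openjdk
```

In practice you would feed it `readlink -f "$(command -v javac)"` so that distro symlinks (such as `/usr/bin/javac`) resolve to the real install directory first.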

 

On Mon, Apr 29, 2019 at 4:36 PM Shmuel Blitz <shmuel.blitz@similarweb.com> wrote:

Hi,

 

Trying to build Spark on Manjaro with OpenJDK version 1.8.0_212, and I'm getting the following
error:

 

Cannot run program "/usr/lib/jvm/java-8-openjdk/jre/bin/javac": error=2, No such file or directory

> which javac

/usr/bin/javac

 

Only when I set JAVA_HOME as follows does the build run:

> export JAVA_HOME=/usr/lib/jvm/default

 

 

Any idea what the issue is?

-- 

Shmuel Blitz 
Data Analysis Team Leader 
Email: shmuel.blitz@similarweb.com 
www.similarweb.com 
 

