spark-user mailing list archives

From "Kelly, Jonathan" <jonat...@amazon.com>
Subject Spark and OpenJDK - jar: No such file or directory
Date Mon, 30 Mar 2015 20:03:29 GMT
I'm trying to use OpenJDK 7 with Spark 1.3.0 and noticed that the compute-classpath.sh script
is not adding the datanucleus jars to the classpath, because it assumes the jar command lives
at $JAVA_HOME/bin/jar, which does not exist in my OpenJDK installation.  Has anybody else run
into this issue?  Would it be possible to use the unzip command instead?
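For the jar-listing case, a minimal sketch of the fallback the question suggests. This is my own hypothetical helper (find_jar_cmd is not part of compute-classpath.sh), assuming the only need is to locate some tool capable of listing a jar's entries:

```shell
# Hedged sketch: pick a command for listing jar contents, falling back to
# unzip when $JAVA_HOME/bin/jar is absent (as on a JRE-only OpenJDK install).
# find_jar_cmd is a hypothetical helper, not Spark's actual logic.
find_jar_cmd() {
  if [ -n "$JAVA_HOME" ] && [ -x "$JAVA_HOME/bin/jar" ]; then
    echo "$JAVA_HOME/bin/jar"
  elif command -v jar >/dev/null 2>&1; then
    command -v jar
  else
    # Caller must then use `unzip -l <jar>` instead of `jar -tf <jar>`.
    echo "unzip"
  fi
}
```

Note that `unzip -l` wraps the entry names in a header and footer, so any caller switching from `jar -tf` would also need to filter that output.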

The missing $JAVA_HOME/bin/jar also breaks the check that ensures Spark was built with a version
of Java compatible with the one used to launch it.  The unzip tool of course wouldn't work for
that check, but there's probably another easy alternative to $JAVA_HOME/bin/jar.
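One possible alternative for the build-compatibility case, offered as a sketch rather than what Spark actually does: every .class file begins with the magic number CAFEBABE followed by a 2-byte minor and a 2-byte big-endian major class-format version (50 = Java 6, 51 = Java 7), so reading two bytes at offset 6 of any class inside the assembly reveals which JDK compiled it:

```shell
# Hedged sketch: read a .class file on stdin and print its class-format
# major version (bytes 6-7, big-endian). 50 = Java 6, 51 = Java 7.
class_major_version() {
  od -An -t u1 -j 6 -N 2 | awk '{ print $1 * 256 + $2 }'
}

# Hypothetical usage (the entry name is a placeholder, not a real path):
#   unzip -p spark-assembly.jar org/apache/spark/SomeClass.class | class_major_version
```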

~ Jonathan Kelly
