spark-user mailing list archives

From Sean Owen <sro...@gmail.com>
Subject Re: issue in Apache Spark install
Date Thu, 09 Sep 2021 13:52:29 GMT
(- other lists) Please don't cross-post to 4 lists (!)

This is a problem you'd see with Java 9 or later - I assume that's what you're
running under the hood. Spark is supposed to handle the case where it can't
access certain JDK internals on Java 9+, so this may be a bug I'll look into.
In the meantime, as a temporary workaround, you could try adding "--add-opens
java.base/java.lang=ALL-UNNAMED" to the Java command line if you can, or try
using Java 8, to see if that helps.
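One way to wire that flag into PySpark is via the PYSPARK_SUBMIT_ARGS
environment variable, which the pyspark launcher reads. This is only a sketch,
not verified on the reporter's setup: it is written for a POSIX shell (on the
Windows CMD prompt shown below, use `set VAR=...` instead of `export`), and
opening java.nio in addition to java.lang is an assumption based on the
"does not \"opens java.nio\"" line in the trace below.

```shell
# Sketch of the suggested workaround. PYSPARK_SUBMIT_ARGS is read by the
# pyspark launch script and must end with "pyspark-shell".
# Opening java.base/java.nio as well is an assumption taken from the
# InaccessibleObjectException in the reported stack trace.
export PYSPARK_SUBMIT_ARGS='--driver-java-options "--add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.nio=ALL-UNNAMED" pyspark-shell'
pyspark
```

Alternatively, the same JVM options can be made permanent by setting
`spark.driver.extraJavaOptions` in `conf/spark-defaults.conf`.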


On Wed, Sep 8, 2021 at 6:50 AM Mukhtar Ali <mukhtarali.jmi@gmail.com> wrote:

> Dear Sir,
>
> I am a learning member of https://learning.oreilly.com and I have a
> problem installing Apache Spark. I tried both CMD and a Jupyter notebook
> and hit the same issue: *Exception: Java gateway process exited before
> sending its port number*
> Please help resolve this issue.
> The Jupyter output is in the attachment.
>
>
> In CMD
> C:\Users\User>pyspark
> Python 3.8.8 (default, Apr 13 2021, 15:08:03) [MSC v.1916 64 bit (AMD64)]
> :: Anaconda, Inc. on win32
>
> Warning:
> This Python interpreter is in a conda environment, but the environment has
> not been activated.  Libraries may fail to load.  To activate this
> environment
> please see https://conda.io/activation
>
> Type "help", "copyright", "credits" or "license" for more information.
> Exception in thread "main" java.lang.ExceptionInInitializerError
>         at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit>(ByteArrayMethods.java:54)
>         at org.apache.spark.internal.config.package$.<init>(package.scala:1095)
>         at org.apache.spark.internal.config.package$.<clinit>(package.scala)
>         at org.apache.spark.deploy.SparkSubmitArguments.$anonfun$loadEnvironmentArguments$3(SparkSubmitArguments.scala:157)
>         at scala.Option.orElse(Option.scala:447)
>         at org.apache.spark.deploy.SparkSubmitArguments.loadEnvironmentArguments(SparkSubmitArguments.scala:157)
>         at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:115)
>         at org.apache.spark.deploy.SparkSubmit$$anon$2$$anon$3.<init>(SparkSubmit.scala:1022)
>         at org.apache.spark.deploy.SparkSubmit$$anon$2.parseArguments(SparkSubmit.scala:1022)
>         at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:85)
>         at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make
> private java.nio.DirectByteBuffer(long,int) accessible: module java.base
> does not "opens java.nio" to unnamed module @71e9ddb4
>         at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:357)
>         at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:297)
>         at java.base/java.lang.reflect.Constructor.checkCanSetAccessible(Constructor.java:188)
>         at java.base/java.lang.reflect.Constructor.setAccessible(Constructor.java:181)
>         at org.apache.spark.unsafe.Platform.<clinit>(Platform.java:56)
>         ... 13 more
> Traceback (most recent call last):
>   File "C:\Spark\spark-3.1.2-bin-hadoop2.7\python\pyspark\shell.py", line 35, in <module>
>     SparkContext._ensure_initialized()  # type: ignore
>   File "C:\Spark\spark-3.1.2-bin-hadoop2.7\python\pyspark\context.py", line 331, in _ensure_initialized
>     SparkContext._gateway = gateway or launch_gateway(conf)
>   File "C:\Spark\spark-3.1.2-bin-hadoop2.7\python\pyspark\java_gateway.py", line 108, in launch_gateway
>     raise Exception("Java gateway process exited before sending its port number")
> Exception: Java gateway process exited before sending its port number
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscribe@spark.apache.org
