spark-user mailing list archives

From Mukhtar Ali <mukhtarali....@gmail.com>
Subject Fwd: issue in Apache Spark install
Date Wed, 08 Sep 2021 11:04:04 GMT
Dear all,

I am a learning member of https://learning.oreilly.com and I am having a problem installing Apache Spark. I have tried both CMD and a Jupyter notebook, and I get the same error in both: *Exception: Java gateway process exited before sending its port number*.
Please help me resolve this issue. The Jupyter output is attached.


In CMD
C:\Users\User>pyspark
Python 3.8.8 (default, Apr 13 2021, 15:08:03) [MSC v.1916 64 bit (AMD64)]
:: Anaconda, Inc. on win32

Warning:
This Python interpreter is in a conda environment, but the environment has
not been activated. Libraries may fail to load. To activate this environment
please see https://conda.io/activation

Type "help", "copyright", "credits" or "license" for more information.
Exception in thread "main" java.lang.ExceptionInInitializerError
        at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit>(ByteArrayMethods.java:54)
        at org.apache.spark.internal.config.package$.<init>(package.scala:1095)
        at org.apache.spark.internal.config.package$.<clinit>(package.scala)
        at org.apache.spark.deploy.SparkSubmitArguments.$anonfun$loadEnvironmentArguments$3(SparkSubmitArguments.scala:157)
        at scala.Option.orElse(Option.scala:447)
        at org.apache.spark.deploy.SparkSubmitArguments.loadEnvironmentArguments(SparkSubmitArguments.scala:157)
        at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:115)
        at org.apache.spark.deploy.SparkSubmit$$anon$2$$anon$3.<init>(SparkSubmit.scala:1022)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.parseArguments(SparkSubmit.scala:1022)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:85)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make private java.nio.DirectByteBuffer(long,int) accessible: module java.base does not "opens java.nio" to unnamed module @71e9ddb4
        at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:357)
        at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:297)
        at java.base/java.lang.reflect.Constructor.checkCanSetAccessible(Constructor.java:188)
        at java.base/java.lang.reflect.Constructor.setAccessible(Constructor.java:181)
        at org.apache.spark.unsafe.Platform.<clinit>(Platform.java:56)
        ... 13 more
Traceback (most recent call last):
  File "C:\Spark\spark-3.1.2-bin-hadoop2.7\python\pyspark\shell.py", line 35, in <module>
    SparkContext._ensure_initialized()  # type: ignore
  File "C:\Spark\spark-3.1.2-bin-hadoop2.7\python\pyspark\context.py", line 331, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
  File "C:\Spark\spark-3.1.2-bin-hadoop2.7\python\pyspark\java_gateway.py", line 108, in launch_gateway
    raise Exception("Java gateway process exited before sending its port number")
Exception: Java gateway process exited before sending its port number
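For anyone hitting the same trace: the `InaccessibleObjectException` from `Platform.<clinit>` is the error this Spark version raises when launched on a JDK newer than it supports (Spark 3.1.x is built for Java 8/11, and JDK 16+ blocks the reflective access it relies on). A minimal diagnostic sketch for the same CMD session shown above; the JDK install path below is an example, not a known location on the reporter's machine:

```shell
:: Check which Java the shell resolves to; a major version of 16 or
:: higher would explain the InaccessibleObjectException above.
java -version

:: Point JAVA_HOME at a Java 8 or 11 install for this session
:: (example path, adjust to the actual JDK location), then retry.
set "JAVA_HOME=C:\Program Files\Java\jdk-11"
set "PATH=%JAVA_HOME%\bin;%PATH%"
pyspark
```

If `pyspark` then prints the Spark banner instead of the gateway exception, the JDK version was the cause.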
