spark-user mailing list archives

From Artemis User <arte...@dtechspace.com>
Subject Re: Issue with Running Spark in Jupyter Notebook
Date Thu, 24 Jun 2021 14:44:35 GMT
Looks like you didn't set up your environment properly.  I assume you 
are running this from a standalone Python program instead of from the 
pyspark shell.  I would first run your code from the pyspark shell, 
then follow the Spark Python installation guide to set up your Python 
environment properly.  Please note these are extra steps in addition 
to the Spark installation.
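
For example, if you want to stay in Jupyter, something along these 
lines usually works (a minimal, untested sketch -- it assumes you have 
pip-installed the findspark package and that your Spark home is 
/usr/local/spark-3.1.2-bin-hadoop3.2, as the jar path in your terminal 
output suggests):

import findspark
# Point this Python process at the Spark installation *before*
# importing pyspark; the path below is taken from your warnings.
findspark.init("/usr/local/spark-3.1.2-bin-hadoop3.2")

from pyspark import SparkContext
sc = SparkContext.getOrCreate()
print(sc.range(10).count())  # quick sanity check; should print 10

If the same two lines also fail inside the pyspark shell, the problem 
is in the Spark/Java installation itself rather than in your notebook 
environment.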

-- ND

On 6/24/21 3:08 AM, Hsu, Philip wrote:
>
> Hi there,
>
> My name is Philip, a master’s student at Imperial College London. I’m 
> trying to use Spark to complete my course work assignment. I ran the 
> following code:
>
> from pyspark import SparkContext
> sc = SparkContext.getOrCreate()
>
> and got the following error message:
>
> Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
> : java.lang.NoClassDefFoundError: Could not initialize class org.sparkproject.jetty.http.MimeTypes
>         at org.sparkproject.jetty.server.handler.gzip.GzipHandler.<init>(GzipHandler.java:190)
>         at org.apache.spark.ui.ServerInfo.addHandler(JettyUtils.scala:485)
>         at org.apache.spark.ui.WebUI.$anonfun$bind$3(WebUI.scala:147)
>         at org.apache.spark.ui.WebUI.$anonfun$bind$3$adapted(WebUI.scala:147)
>         at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
>         at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
>         at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
>         at org.apache.spark.ui.WebUI.bind(WebUI.scala:147)
>         at org.apache.spark.SparkContext.$anonfun$new$11(SparkContext.scala:486)
>         at org.apache.spark.SparkContext.$anonfun$new$11$adapted(SparkContext.scala:486)
>         at scala.Option.foreach(Option.scala:407)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:486)
>         at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
>         at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>         at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
>         at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
>         at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
>         at py4j.Gateway.invoke(Gateway.java:238)
>         at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
>         at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
>         at py4j.GatewayConnection.run(GatewayConnection.java:238)
>         at java.base/java.lang.Thread.run(Thread.java:829)
>
> Meanwhile, my MacBook’s terminal is showing the following error messages:
>
> WARNING: An illegal reflective access operation has occurred
> pyspark_mongodb_nb|WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/usr/local/spark-3.1.2-bin-hadoop3.2/jars/spark-unsafe_2.12-3.1.2.jar) to constructor java.nio.DirectByteBuffer(long,int)
> pyspark_mongodb_nb|WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
> pyspark_mongodb_nb|WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
> pyspark_mongodb_nb|WARNING: All illegal access operations will be denied in a future release
> pyspark_mongodb_nb|21/06/24 06:57:17 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> pyspark_mongodb_nb|Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
> pyspark_mongodb_nb|Setting default log level to "WARN".
> pyspark_mongodb_nb|To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
> pyspark_mongodb_nb|21/06/24 06:57:20 WARN MacAddressUtil: Failed to find a usable hardware address from the network interfaces; using random bytes: bd:af:a7:b4:a2:46:2a:28
>
> I’m wondering if you could help me resolve the issues I have with my 
> laptop. I have a 2020 MacBook Pro with an M1 chip. Thank you so much 
> in advance.
>
> Best,
>
> Philip Hsu
>

