spark-user mailing list archives

From Imran Rajjad <raj...@gmail.com>
Subject unable to import graphframes
Date Tue, 29 Aug 2017 07:13:52 GMT
Dear list,

I am following the GraphFrames documentation and have started the Scala
shell with the following command:


D:\spark-2.1.0-bin-hadoop2.7\bin>spark-shell --master local[2] --packages graphframes:graphframes:0.5.0-spark2.1-s_2.10

Ivy Default Cache set to: C:\Users\user\.ivy2\cache
The jars for the packages stored in: C:\Users\user\.ivy2\jars
:: loading settings :: url =
jar:file:/D:/spark-2.1.0-bin-hadoop2.7/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
graphframes#graphframes added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
        confs: [default]
        found graphframes#graphframes;0.5.0-spark2.1-s_2.10 in spark-packages
        found com.typesafe.scala-logging#scala-logging-api_2.10;2.1.2 in central
        found com.typesafe.scala-logging#scala-logging-slf4j_2.10;2.1.2 in central
        found org.scala-lang#scala-reflect;2.10.4 in central
        found org.slf4j#slf4j-api;1.7.7 in local-m2-cache
:: resolution report :: resolve 288ms :: artifacts dl 7ms
        :: modules in use:
        com.typesafe.scala-logging#scala-logging-api_2.10;2.1.2 from central in [default]
        com.typesafe.scala-logging#scala-logging-slf4j_2.10;2.1.2 from central in [default]
        graphframes#graphframes;0.5.0-spark2.1-s_2.10 from spark-packages in [default]
        org.scala-lang#scala-reflect;2.10.4 from central in [default]
        org.slf4j#slf4j-api;1.7.7 from local-m2-cache in [default]

        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   5   |   0   |   0   |   0   ||   5   |   0   |
        ---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent
        confs: [default]
        0 artifacts copied, 5 already retrieved (0kB/7ms)
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
2017-08-29 12:10:23,089 [main] WARN  NativeCodeLoader  - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2017-08-29 12:10:25,128 [main] WARN  General  - Plugin (Bundle)
"org.datanucleus.store.rdbms" is already registered. Ensure you dont have
multiple JAR versions of the same plugin in the classpath. The URL
"file:/D:/spark-2.1.0-bin-hadoop2.7/jars/datanucleus-rdbms-3.2.9.jar" is
already registered, and you are trying to register an identical plugin
located at URL
"file:/D:/spark-2.1.0-bin-hadoop2.7/bin/../jars/datanucleus-rdbms-3.2.9.jar."
2017-08-29 12:10:25,137 [main] WARN  General  - Plugin (Bundle)
"org.datanucleus" is already registered. Ensure you dont have multiple JAR
versions of the same plugin in the classpath. The URL
"file:/D:/spark-2.1.0-bin-hadoop2.7/jars/datanucleus-core-3.2.10.jar" is
already registered, and you are trying to register an identical plugin
located at URL
"file:/D:/spark-2.1.0-bin-hadoop2.7/bin/../jars/datanucleus-core-3.2.10.jar."
2017-08-29 12:10:25,141 [main] WARN  General  - Plugin (Bundle)
"org.datanucleus.api.jdo" is already registered. Ensure you dont have
multiple JAR versions of the same plugin in the classpath. The URL
"file:/D:/spark-2.1.0-bin-hadoop2.7/bin/../jars/datanucleus-api-jdo-3.2.6.jar"
is already registered, and you are trying to register an identical plugin
located at URL
"file:/D:/spark-2.1.0-bin-hadoop2.7/jars/datanucleus-api-jdo-3.2.6.jar."
2017-08-29 12:10:27,744 [main] WARN  ObjectStore  - Failed to get database
global_temp, returning NoSuchObjectException
Spark context Web UI available at http://192.168.10.60:4040
Spark context available as 'sc' (master = local[2], app id = local-1503990623864).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_112)
Type in expressions to have them evaluated.
Type :help for more information.
scala>
scala> import org.graphframes._
<console>:23: error: object graphframes is not a member of package org
       import org.graphframes._
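One thing I notice in the transcript: the shell banner reports Scala 2.11.8, while the package suffix "s_2.10" targets Scala 2.10, so the resolved jar may never be usable on the REPL classpath. Assuming the matching Scala 2.11 build of graphframes 0.5.0 is published on spark-packages, a command like this might be the fix:

```shell
# Spark 2.1.0 is built against Scala 2.11, so request the s_2.11 artifact
# (artifact name assumed; check the spark-packages listing for graphframes):
spark-shell --master local[2] --packages graphframes:graphframes:0.5.0-spark2.1-s_2.11
```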

Is there something missing from my setup?

regards,
Imran

-- 
I.R
