spark-user mailing list archives

From Michael Jay <mich....@outlook.com>
Subject Re: spark 2.0 in intellij
Date Tue, 09 Aug 2016 23:13:19 GMT
Hi,


The problem has been solved simply by updating the Scala SDK version from the incompatible 2.10.x
to the correct version, 2.11.x.
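For anyone hitting the same symptom: a quick way to confirm which Scala library is actually on the classpath is to print it at runtime. This is a minimal sketch (the object name is mine, not from Spark); Spark 2.0 is built against Scala 2.11, so a 2.10.x result here points to the SDK mismatch described above.

```scala
// Minimal diagnostic sketch: report the Scala library version on the classpath.
// Spark 2.0 expects 2.11.x; seeing 2.10.x here reproduces the mismatch above.
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    println(s"Scala library version: ${scala.util.Properties.versionNumberString}")
  }
}
```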

________________________________
From: Michael Jay <mich.jay@outlook.com>
Sent: Tuesday, August 9, 2016 10:11:12 PM
To: user@spark.apache.org
Subject: spark 2.0 in intellij


Dear all,

I am a newbie to Spark. Currently I am trying to import the source code of Spark 2.0 as a module
into an existing client project.

I have imported spark-core, spark-sql and spark-catalyst as Maven dependencies in this client
project.
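For context, this is a sketch of what those dependency declarations would look like in the client project's pom.xml. Note that the artifactId carries the Scala binary version suffix (_2.11 for Spark 2.0), which is one place a 2.10/2.11 mix-up can creep in; the coordinates below are the standard published ones, but adapt the version to your build.

```xml
<!-- Sketch: Spark 2.0 artifacts are published with a _2.11 suffix,
     so the Scala SDK used by the project must also be 2.11.x. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.0.0</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.11</artifactId>
  <version>2.0.0</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-catalyst_2.11</artifactId>
  <version>2.0.0</version>
</dependency>
```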

During compilation, errors such as a missing SqlBaseParser.java occurred.

After searching online, I found a question on StackOverflow, http://stackoverflow.com/questions/35617277/spark-sql-has-no-sparksqlparser-scala-file-when-compiling-in-intellij-idea,
that addresses this issue.


So I used mvn to build Spark 2.0 first and imported ...catalyst/target/generated-sources/antlr4
as a new source folder in the Maven dependency "Spark-catalyst".
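The build step above can be sketched as follows. Spark ships a Maven wrapper, and a full package build triggers the ANTLR plugin that generates SqlBaseParser.java under sql/catalyst/target/generated-sources/antlr4 (the exact flags here are a common choice, not the only one):

```shell
# Sketch: build Spark once so Maven generates the ANTLR4 sources
# (including SqlBaseParser.java) that IntelliJ cannot produce on its own.
./build/mvn -DskipTests clean package
```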
Now the problem is that I still get the following errors:

Error:scalac: error while loading package, Missing dependency 'bad symbolic reference. A signature
in package.class refers to term annotation
in package org.apache.spark which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling package.class.',
required by /home/weiping/workspace/tools/spark-2.0.0/sql/core/target/scala-2.11/classes/org/apache/spark/sql/package.class
Error:scalac: error while loading SparkSession, Missing dependency 'bad symbolic reference.
A signature in SparkSession.class refers to term annotation
in package org.apache.spark which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling SparkSession.class.',
required by /home/weiping/workspace/tools/spark-2.0.0/sql/core/target/scala-2.11/classes/org/apache/spark/sql/SparkSession.class
Error:scalac: error while loading RDD, Missing dependency 'bad symbolic reference. A signature
in RDD.class refers to term annotation
in package org.apache.spark which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling RDD.class.', required
by /home/weiping/workspace/tools/spark-2.0.0/core/target/scala-2.11/classes/org/apache/spark/rdd/RDD.class
Error:scalac: error while loading JavaRDDLike, Missing dependency 'bad symbolic reference.
A signature in JavaRDDLike.class refers to term annotation
in package org.apache.spark which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling JavaRDDLike.class.',
required by /home/weiping/workspace/tools/spark-2.0.0/core/target/scala-2.11/classes/org/apache/spark/api/java/JavaRDDLike.class
Error:scalac: error while loading Dataset, Missing dependency 'bad symbolic reference. A signature
in Dataset.class refers to term annotation
in package org.apache.spark which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling Dataset.class.',
required by /home/weiping/workspace/tools/spark-2.0.0/sql/core/target/scala-2.11/classes/org/apache/spark/sql/Dataset.class
Error:scalac: error while loading ColumnName, Missing dependency 'bad symbolic reference.
A signature in ColumnName.class refers to term annotation
in package org.apache.spark which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling ColumnName.class.',
required by /home/weiping/workspace/tools/spark-2.0.0/sql/core/target/scala-2.11/classes/org/apache/spark/sql/ColumnName.class
Error:scalac: error while loading Encoder, Missing dependency 'bad symbolic reference. A signature
in Encoder.class refers to term annotation
in package org.apache.spark which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling Encoder.class.',
required by /home/weiping/workspace/tools/spark-2.0.0/sql/catalyst/target/scala-2.11/classes/org/apache/spark/sql/Encoder.class


Can anyone help me?

Thank you,
Mic
