spark-user mailing list archives

From 吴文超 <wuwenc...@miaozhen.com>
Subject IntelliJ IDEA does not work well with Spark
Date Sun, 27 Mar 2016 14:20:34 GMT
I am a newbie to Spark. When I use IntelliJ IDEA to write some Scala code, I found it reports
an error when using Spark's implicit conversions, e.g. when using an RDD as a pair RDD to get the reduceByKey
function. However, the project runs normally on the cluster.
Somebody says it needs import org.apache.spark.SparkContext._ , http://stackoverflow.com/questions/24084335/reducebykey-method-not-being-found-in-intellij
I did that, but it still shows the error.
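For reference, this is the kind of code I mean; a minimal pair-RDD sketch (app name and object name are just placeholders) that should compile against Spark 1.6:

    import org.apache.spark.{SparkConf, SparkContext}
    // On Spark versions before 1.3 this import was required to bring the
    // RDD -> PairRDDFunctions implicit conversion into scope; since 1.3
    // the implicit lives in the RDD companion object and is found
    // automatically, so on 1.6 it should not be needed at all.
    import org.apache.spark.SparkContext._

    object ReduceByKeyExample {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("example").setMaster("local[*]"))

        // An RDD of (key, value) tuples; reduceByKey comes from
        // PairRDDFunctions via the implicit conversion.
        val pairs = sc.parallelize(Seq(("a", 1), ("b", 2), ("a", 3)))
        val sums = pairs.reduceByKey(_ + _)

        println(sums.collect().toMap) // Map(a -> 4, b -> 2)
        sc.stop()
      }
    }

This compiles and runs fine with sbt/maven on the command line; only the IDEA editor flags reduceByKey as unresolved.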
Has anybody encountered the problem and how do you solve it ?
BTW, I have tried both sbt and Maven; the IDEA version is 14.0.3 and the Spark version is 1.6.0.