Hi,

I was able to build a Spark app in IntelliJ using sbt.
Now I am trying to build it using Maven, and the build is failing.
I created a Maven project using the following archetype, because the simple archetype was using Scala 2.8 and I am running 2.10.3:

Now I added the Spark dependency as follows:


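(The dependency block was lost from this message; a typical spark-core entry for Scala 2.10 in a pom.xml, with the version assumed from the Spark 1.1.0 setup mentioned later in this thread, would look like this sketch:)

```xml
<!-- spark-core built for Scala 2.10; version 1.1.0 is an assumption
     based on the versions discussed in this thread -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.1.0</version>
</dependency>
```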
The build is failing. The error message is:
[INFO] Building SampleMVN 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] --- maven-resources-plugin:2.3:resources (default-resources) @ SampleMVN ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/meethu/Intellij/SampleMVN/src/main/resources
[INFO] --- maven-compiler-plugin:2.0.2:compile (default-compile) @ SampleMVN ---
[INFO] Nothing to compile - all classes are up to date
[INFO] --- scala-maven-plugin:3.1.3:compile (default) @ SampleMVN ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.3
[WARNING]  mvn.SampleMVN:SampleMVN:1.0-SNAPSHOT requires scala version: 2.10.3
[WARNING]  com.twitter:chill_2.10:0.3.6 requires scala version: 2.10.3
[WARNING]  org.spark-project.akka:akka-actor_2.10:2.2.3-shaded-protobuf requires scala version: 2.10.2
[WARNING] Multiple versions of scala libraries detected!
[INFO] /home/meethu/Intellij/SampleMVN/src/main/scala:-1: info: compiling
[INFO] Compiling 2 source files to /home/meethu/Intellij/SampleMVN/target/classes at 1415700093124
[ERROR] error: error while loading <root>, error in opening zip file
[ERROR] error: scala.reflect.internal.MissingRequirementError: object scala.runtime in compiler mirror not found.
[ERROR] at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
[ERROR] at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
[ERROR] at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
[ERROR] at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
[ERROR] at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
[ERROR] at scala.reflect.internal.Mirrors$RootsBase.getPackage(Mirrors.scala:172)
[ERROR] at scala.reflect.internal.Mirrors$RootsBase.getRequiredPackage(Mirrors.scala:175)
[ERROR] at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage$lzycompute(Definitions.scala:183)
[ERROR] at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage(Definitions.scala:183)
[ERROR] at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass$lzycompute(Definitions.scala:184)
[ERROR] at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass(Definitions.scala:184)
[ERROR] at scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr$lzycompute(Definitions.scala:1024)
[ERROR] at scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr(Definitions.scala:1023)
[ERROR] at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses$lzycompute(Definitions.scala:1153)
[ERROR] at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses(Definitions.scala:1152)
[ERROR] at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode$lzycompute(Definitions.scala:1196)
[ERROR] at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode(Definitions.scala:1196)
[ERROR] at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1261)
[ERROR] at scala.tools.nsc.Global$Run.<init>(Global.scala:1290)
[ERROR] at scala.tools.nsc.Driver.doCompile(Driver.scala:32)
[ERROR] at scala.tools.nsc.Main$.doCompile(Main.scala:79)
[ERROR] at scala.tools.nsc.Driver.process(Driver.scala:54)
[ERROR] at scala.tools.nsc.Driver.main(Driver.scala:67)
[ERROR] at scala.tools.nsc.Main.main(Main.scala)
[ERROR] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[ERROR] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
[ERROR] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[ERROR] at java.lang.reflect.Method.invoke(Method.java:606)
[ERROR] at scala_maven_executions.MainHelper.runMain(MainHelper.java:164)
[ERROR] at scala_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26)
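(For anyone hitting the same trace: the "error in opening zip file" message usually points at a corrupted jar in the local ~/.m2 repository, and the warnings above show mixed Scala versions on the compiler classpath. One common remedy, sketched here rather than taken from the thread, is to pin scala-library explicitly in the pom so every _2.10 artifact resolves against the same version, then delete the suspect jar directory from ~/.m2 and rebuild so Maven re-downloads it:)

```xml
<!-- Pin the Scala runtime explicitly; a sketch of a common fix for the
     "Multiple versions of scala libraries detected!" warning above -->
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>2.10.3</version>
</dependency>
```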

Can anyone help resolve this issue?

Thanks & Regards,
Meethu M

On Tuesday, 11 November 2014 1:50 PM, MEETHU MATHEW <meethu2006@yahoo.co.in> wrote:

Hi Akhil,

It worked. "You might want to restart IntelliJ sometime to get the dependencies pulled from the build.sbt file." This really helped.
Thanks & Regards,
Meethu M

On Monday, 10 November 2014 4:58 PM, Akhil Das <akhil@sigmoidanalytics.com> wrote:

Hi Meethu,

You can install the IntelliJ IDE rather than Eclipse; it has a lot more community support and features. Once you install it, follow this simple post to get started with Scala. Now that you have an IDE which works with Scala, follow the steps below for Spark:

1. Install the sbt plugin: go to File -> Settings -> Plugins -> Install IntelliJ Plugins -> search for sbt and install it.

2. After the sbt plugin is installed, restart IntelliJ and start a new Scala sbt project (File -> New Project -> Scala -> SBT).

3. Now open up the build.sbt file and add all the dependencies (here I'm adding the Spark 1.1.0 with Hadoop 2.4.0 dependency).
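(The screenshot with the dependencies didn't survive; a build.sbt matching the versions named in this step might look like the following. Project name and versions are assumptions taken from this thread.)

```scala
// build.sbt — Spark 1.1.0 against Hadoop 2.4.0, as described in step 3;
// the project name is a placeholder
name := "SparkSample"

version := "1.0"

scalaVersion := "2.10.3"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.4.0"
```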

4. Now create a new Scala class in src -> main -> scala and type in your code.
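(The code screenshot is also missing; a minimal class of the kind this step describes, assuming spark-core is on the classpath, a local master, and a placeholder input file, could be:)

```scala
// Minimal Spark app sketch: counts words in a local text file.
// The object name, master, and input path are illustrative placeholders.
import org.apache.spark.{SparkConf, SparkContext}

object SimpleApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("SimpleApp").setMaster("local[2]")
    val sc = new SparkContext(conf)
    val counts = sc.textFile("input.txt")      // one line per record
      .flatMap(_.split("\\s+"))                // split into words
      .map(word => (word, 1))
      .reduceByKey(_ + _)                      // sum counts per word
    counts.take(10).foreach(println)
    sc.stop()
  }
}
```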
5. Right click and hit Run :)

Let me know how it goes. You might want to restart IntelliJ sometime to get the dependencies pulled from the build.sbt file.

Best Regards

On Mon, Nov 10, 2014 at 4:08 PM, MEETHU MATHEW <meethu2006@yahoo.co.in> wrote:

This question was asked earlier and I did it in the way specified. I am getting java.lang.ClassNotFoundException.

Can somebody explain all the steps required to build a Spark app using IntelliJ (latest version), starting from creating the project to running it? I searched a lot but couldn't find appropriate documentation.

Re: Is there a step-by-step instruction on how to build Spark App with IntelliJ IDEA?
Don’t try to use spark-core as an archetype. Instead just create a plain Scala project (no archetype) and add a Maven dependency on spark-core. That should be all you need.
Thanks & Regards,
Meethu M