spark-user mailing list archives

From Matei Zaharia <>
Subject Re: Spark Import Issue
Date Sun, 08 Dec 2013 08:25:23 GMT
I'm not sure you can have a star inside that quoted classpath argument (the double quotes prevent the shell from expanding the *). Try referring to the JAR by its full name, or link to Spark through Maven
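A quick way to see the quoting effect (the jar name and directory below are only illustrative, standing in for the assembly jar):

```shell
# Create a dummy jar so the glob has something to match (illustrative path)
mkdir -p /tmp/globdemo
touch /tmp/globdemo/spark-assembly_2.9.3-0.8.0-incubating.jar

# Unquoted: the shell expands the * to the real file name before javac sees it
echo /tmp/globdemo/spark-assembly*.jar

# Quoted: the literal '*' is passed through, so javac would receive a
# classpath entry that names no actual file
echo "/tmp/globdemo/spark-assembly*.jar"
```

Note that javac's own classpath wildcard support only covers a bare `*` meaning "every JAR in this directory" (e.g. `-cp "assembly/target/scala-2.9.3/*"`); a partial pattern like `spark-assembly*.jar` is not expanded by javac, so it must either be left unquoted for the shell or spelled out in full.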


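For the Maven route, the dependency for that release would look something like this (coordinates as published for the 0.8.0-incubating release; verify against the repository before relying on them):

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.9.3</artifactId>
  <version>0.8.0-incubating</version>
</dependency>
```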
On Dec 6, 2013, at 9:50 AM, Garrett Hamers <> wrote:

> Hello,
> I am new to the Spark system, and I am trying to write a simple program to get myself
familiar with how Spark works. I am currently having a problem importing the Spark package.
I am getting the following compiler error: package does not exist.

> I have spark-0.8.0-incubating installed. I ran the commands sbt/sbt compile, sbt/sbt assembly,
and sbt/sbt publish-local without any errors. My file is located in the spark-0.8.0-incubating
root directory. I tried to compile the code using “javac” and “javac -cp "assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.0-incubating*.jar"”.
> Here is the code:
> package shark;
> import java.io.Serializable;
> import java.util.List;
> import org.apache.spark.api.java.*; // Issue is here
> public class sql implements Serializable {
>   public static void main(String[] args) {
>     System.out.println("Hello World");
>   }
> }
> What do I need to do in order for Java to import the Spark code properly? Any advice
would be greatly appreciated.
> Thank you,
> Garrett Hamers
