spark-user mailing list archives

From Garrett Hamers <gcham...@umich.edu>
Subject Spark Import Issue
Date Fri, 06 Dec 2013 17:50:59 GMT
Hello,

I am new to the Spark system, and I am trying to write a simple program to
familiarize myself with how Spark works. I am currently having a problem
importing the Spark package. I am getting the following compiler
error: package org.apache.spark.api.java does not exist.

I have spark-0.8.0-incubating installed. I ran the commands sbt/sbt compile,
sbt/sbt assembly, and sbt/sbt publish-local without any errors. My sql.java
file is located in the spark-0.8.0-incubating root directory. I tried to
compile the code using "javac sql.java" and "javac -cp
"assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.0-incubating*.jar"
sql.java".
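An aside on why that second command may still fail: the outer quotes stop the shell from expanding the `*` in the jar name, and javac itself only understands whole-directory classpath wildcards (`dir/*`), not partial patterns like `spark-assembly*.jar`. A minimal sketch of letting the shell expand the glob first and capturing the result, using a stand-in jar and hypothetical paths (substitute the real assembly jar):

```shell
# Create a stand-in jar just for this demonstration (hypothetical layout
# mirroring the Spark build output; use your real directory instead):
mkdir -p demo/assembly/target/scala-2.9.3
touch demo/assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.0-incubating.jar

# Let the shell expand the glob UNQUOTED, then capture the concrete path:
SPARK_JAR=$(echo demo/assembly/target/scala-2.9.3/spark-assembly*.jar)
echo "$SPARK_JAR"

# The captured path can then be passed to javac, e.g.:
#   javac -cp "$SPARK_JAR" sql.java
```

Note that the first command, plain `javac sql.java`, has no classpath at all, so the "package does not exist" error is expected there regardless.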

Here is the code for sql.java:

package shark;

import java.io.Serializable;
import java.io.*;
import java.util.List;

import org.apache.spark.api.java.*; // Issue is here

public class sql implements Serializable {

  public static void main(String[] args) {
    System.out.println("Hello World");
  }

}


What do I need to do for Java to import the Spark code properly?
Any advice would be greatly appreciated.

Thank you,
Garrett Hamers
