spark-user mailing list archives

From Aureliano Buendia <buendia...@gmail.com>
Subject Spark context jar confusions
Date Thu, 02 Jan 2014 10:40:46 GMT
Hi,

I do not understand why spark context has an option for loading jars at
runtime.

As an example, consider BroadcastTest from the Spark examples:
<https://github.com/apache/incubator-spark/blob/50fd8d98c00f7db6aa34183705c9269098c62486/examples/src/main/scala/org/apache/spark/examples/BroadcastTest.scala#L36>

object BroadcastTest {
  def main(args: Array[String]) {
    val sc = new SparkContext(args(0), "Broadcast Test",
      System.getenv("SPARK_HOME"), Seq(System.getenv("SPARK_EXAMPLES_JAR")))
  }
}


This is *the* example, i.e. *the* application that we want to run, so what
is SPARK_EXAMPLES_JAR supposed to be?
In this particular case the BroadcastTest example is self-contained, so
why would it need to load other, unrelated example jars?

Finally, how does this help a real-world Spark application?
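
To make my confusion concrete: as far as I can tell, a real-world standalone
application is expected to build itself into a jar and then pass that same
jar's path to its own SparkContext, so the jar can be shipped to the workers.
A sketch of what I mean (the app name and jar path below are made up):

```scala
import org.apache.spark.SparkContext

object MyApp {
  def main(args: Array[String]) {
    val sc = new SparkContext(
      args(0),                               // master URL, e.g. spark://host:7077
      "My App",                              // application name (hypothetical)
      System.getenv("SPARK_HOME"),
      Seq("/path/to/my-app-assembly.jar"))   // this application's own jar (hypothetical path)

    // ... job logic ...

    sc.stop()
  }
}
```

So the application has to know, at runtime, where its own packaged jar lives
on disk, which is the part that seems circular to me.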
