spark-user mailing list archives

From Imran Rashid <iras...@cloudera.com>
Subject Re: Running Spark from Scala source files other than main file
Date Thu, 12 Mar 2015 00:58:30 GMT
Did you forget to specify the main class with "--class Main"?  Though if that
was it, you should at least see *some* error message, so I'm confused
myself...
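A minimal sketch of the layout the reply implies (object names taken from the thread; `parameters` and the commented-out Spark calls are placeholders): the `main` entry point lives in the object named by `--class`, and it can freely delegate to another object that owns the SparkContext.

```scala
// Sketch only: the Spark-specific lines are commented out so the structure
// itself compiles without a Spark dependency; names follow the thread.
object Component1 {
  def start(parameters: Seq[String]): String = {
    // val sc = new SparkContext(conf)   // real app: create the context here
    // ... spark operations ...
    // sc.stop()                          // stop (not close) the context
    s"Component1 started with: ${parameters.mkString(", ")}"
  }
}

object Main {
  def main(args: Array[String]): Unit = {
    // check parameters, then delegate to the object that drives Spark
    println(Component1.start(args.toSeq))
  }
}
```

This would be submitted with the entry point named explicitly, e.g. `spark-submit --class Main --master spark://host:7077 Main.jar` (host and port are placeholders); without `--class`, spark-submit has no way to find `Main.main` unless the jar's manifest declares a main class.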

On Wed, Mar 11, 2015 at 6:53 AM, Aung Kyaw Htet <akhtet@gmail.com> wrote:

> Hi Everyone,
>
> I am developing a Scala app in which the main object does not create the
> SparkContext; another object defined in the same package creates it,
> runs Spark operations, and closes it. The jar file builds successfully in
> Maven, but when I call spark-submit with this jar, Spark does not
> seem to execute any code.
>
> So my code looks like
>
> [Main.scala]
>
> object Main {
>   def main(args: Array[String]): Unit = {
>     /*check parameters */
>     Component1.start(parameters)
>   }
> }
>
> [Component1.scala]
>
> object Component1 {
>   def start(parameters: Seq[String]): Unit = {
>     val sc = new SparkContext(conf)
>     /* do spark operations */
>     sc.stop()
>   }
> }
>
> The above code compiles into Main.jar, but spark-submit does not execute
> anything and does not show me any error (nothing in the logs either.)
>
> spark-submit master= spark://.... Main.jar
>
> I had all of this code working before when I wrote a single Scala
> file, but now that I am separating it into multiple Scala source files,
> something isn't running right.
>
> Any advice on this would be greatly appreciated!
>
> Regards,
> Aung
>
