spark-user mailing list archives

From 尹绪森 <yinxu...@gmail.com>
Subject Re: Non-interactive job fails to copy local variable to remote machines
Date Wed, 29 Jan 2014 22:54:37 GMT
Could you give some more details? E.g., the context of your code and the
exception stack trace.

Your code looks odd. Have you already created a SparkContext with new? The
REPL sets up some necessary components that a standalone application does not.
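For illustration (this is a guess, not taken from your post), here is a minimal Scala sketch of one way a driver-side val can turn up null only outside the REPL: in Scala 2.x, `App` uses DelayedInit, so the body of an `object ... extends App`, including its val assignments, runs only when that object's main is invoked. Code that reaches the object any other way, such as a closure deserialized on a remote executor, can see the field before it is assigned. The names below are hypothetical stand-ins for your `Runner` and `end`:

```scala
// Sketch of the DelayedInit pitfall in Scala 2.x, no Spark required.
object Runner extends App {
  // Assigned only when Runner.main runs; stands in for `val end = new DateTime()`.
  val end: String = "2014-01-29"
}

object Demo {
  def main(args: Array[String]): Unit = {
    // Runner.main has never been invoked here, so the delayed body has not
    // executed and `end` is still null, roughly what a remote executor sees.
    println(s"Runner.end = ${Runner.end}")
  }
}
```

A common workaround is to define an ordinary `object Runner { def main(args: Array[String]): Unit = ... }` instead of extending App, so vals are initialized as locals (or before any RDD operation) rather than through delayed init.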
On 2014-1-30 4:35 AM, "Michael Diamant" <diamant.michael@gmail.com> wrote:

> My team recently began writing Spark jobs to be deployed to a Spark
> cluster in the form of a jar.  Previously, my team interacted with Spark
> via the REPL.  The job in question works within the REPL, but fails when
> executed non-interactively (i.e. packaged as a jar).
>
> The job code looks similar to:
> // imports
> object Runner extends App {
>   val end = new DateTime()
>   // additional setup
>   someRdd.filter(f => end.isAfter(f.date))
> }
>
> The point of this example is that a value, end, is defined local to the
> driver.  Later in the program's execution, the locally defined value, end,
> is referenced in the filter predicate of an RDD.  When running
> non-interactively, a NPE occurs when 'end' is referenced in the filter
> predicate.  However, running the exact same code via the REPL executes
> successfully.
>
> Spark environment details are:
> Spark version:  v0.9 using commit SHA
> e2ebc3a9d8bca83bf842b134f2f056c1af0ad2be
> Scala version: v2.9.3
>
> I appreciate any help in identifying bugs/mistakes made.
>
> Thank you,
> Michael
>
