spark-user mailing list archives

From Soren Macbeth <>
Subject Re: Exception in thread "DAGScheduler" scala.MatchError: None (of class scala.None$)
Date Thu, 16 Jan 2014 19:18:23 GMT
Yeah, sorry, I understand match errors, just not what was causing it in this case. 

On January 16, 2014 at 11:12:39 AM, Mark Hamstra wrote:

I'm glad you found a fix.  I strongly suspected that the problem resided in your
Clojure function and how it was passed to Spark, but I didn't have time to nail that down.

When you say that you don't understand the exception, do you mean that you don't understand
match errors generally, or just that you don't understand what caused this one in particular?
 In general, match errors occur when there aren't cases to handle all the possible pattern
matches and one of the unhandled cases actually does match.  So, when pattern matching on an
Option, that would look something like this:

anOption match {
  case Some(x) => doSomethingWith(x)
}

...and then trying to evaluate this when anOption is None.  To see why this kind of match
error is occurring in your particular case, we'd have to look more closely at what your Clojure
code was doing. 
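[For illustration, not part of the original thread: a minimal self-contained Scala sketch of this failure mode and its fix. The names `unsafe`, `safe`, and `MatchErrorDemo` are made up for the example.]

```scala
object MatchErrorDemo {
  // Non-exhaustive match: compiles (with a warning), but throws
  // scala.MatchError: None (of class scala.None$) at runtime when given None.
  def unsafe(opt: Option[Int]): Int = opt match {
    case Some(x) => x * 2
  }

  // Exhaustive match: every case of Option is handled, so no MatchError.
  def safe(opt: Option[Int]): Int = opt match {
    case Some(x) => x * 2
    case None    => 0
  }

  def main(args: Array[String]): Unit = {
    println(safe(Some(21)))
    println(safe(None))
    try unsafe(None)
    catch { case e: MatchError => println("caught: " + e) }
  }
}
```

Note that the runtime message quotes the unmatched value, which is why the exception in the subject line names `None (of class scala.None$)`: somewhere a match (or a partial function) was handed a `None` it had no case for.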

On Thu, Jan 16, 2014 at 10:52 AM, Soren Macbeth <> wrote:
FWIW, I fixed this. It was related to how I was serializing my clojure functions, although
I'm still not sure exactly what the cause of the exception was. 

On Wed, Jan 15, 2014 at 3:05 PM, Soren Macbeth <> wrote:
I'm working on a Clojure DSL, so my map and reduce functions are in Clojure, but I updated
the gist to include the code.

(map-reduce-1) works as expected; however, (map-reduce) throws that exception. I've traced
the types and outputs along the way and everything is identical from what I can tell. (defsparkfn)
uses (sparkop) under the hood as well, so that code is essentially identical, which has me
scratching my head.

On Wed, Jan 15, 2014 at 2:56 PM, Mark Hamstra <> wrote:
Okay, that fits with what I was expecting.

What does your reduce function look like?

On Wed, Jan 15, 2014 at 2:33 PM, Soren Macbeth <> wrote:
0.8.1-incubating running locally.

On January 15, 2014 at 2:28:00 PM, Mark Hamstra wrote:

Spark version?

On Wed, Jan 15, 2014 at 2:19 PM, Soren Macbeth <> wrote:

I'm having some trouble understanding what this exception means, i.e., what problem it's
complaining about. The full stack trace is here:

I'm doing a simple map and then a reduce.

