spark-user mailing list archives

From Marcelo Vanzin <van...@cloudera.com>
Subject Re: Using Spark Context as an attribute of a class cannot be used
Date Mon, 24 Nov 2014 21:30:36 GMT
Hello,

On Mon, Nov 24, 2014 at 12:07 PM, aecc <alessandroaecc@gmail.com> wrote:
> This is the stacktrace:
>
> org.apache.spark.SparkException: Job aborted due to stage failure: Task not
> serializable: java.io.NotSerializableException: $iwC$$iwC$$iwC$$iwC$AAA
>         - field (class "$iwC$$iwC$$iwC$$iwC", name: "aaa", type: "class
> $iwC$$iwC$$iwC$$iwC$AAA")

Ah. Looks to me that you're trying to run this in spark-shell, right?

I'm not 100% sure of how it works internally, but the Scala REPL
handles this a little differently than regular compiled Scala code.
When you declare a "val" in the shell, it behaves like an instance
variable of a wrapper class the REPL generates (those "$iwC" names in
your stack trace), whereas a "val" inside a method in a compiled class
behaves like a local variable. A closure that references the former
captures the whole wrapper object, which isn't serializable - so
that's probably why you're running into this.

Try compiling your code and running it outside the shell to see how it
goes. I'm not sure whether there's a workaround for this when trying
things out in the shell - maybe declare an `object` to hold your
constants? Never really tried, so YMMV.
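
To illustrate the difference (outside Spark, and with made-up class
names - the "ReplWrapper" below just mimics what the REPL's generated
"$iwC" wrappers do), here's a minimal sketch of why a closure over a
shell "val" fails to serialize, and how copying the val into a local
avoids capturing the wrapper:

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// Mimics the REPL: each line becomes a field of a generated wrapper,
// so a closure reading `threshold` captures the whole wrapper object.
class ReplWrapper { // not Serializable, like the REPL's $iwC classes
  val threshold = 10
  val filterFn: Int => Boolean = x => x > threshold // captures `this`
}

// Workaround: copy the val into a local before building the closure,
// so the closure captures only the Int, not the wrapper.
class ReplWrapperFixed {
  val threshold = 10
  val filterFn: Int => Boolean = {
    val t = threshold
    x => x > t
  }
}

object ClosureDemo extends App {
  def serializable(obj: AnyRef): Boolean =
    try {
      new ObjectOutputStream(new ByteArrayOutputStream).writeObject(obj)
      true
    } catch { case _: NotSerializableException => false }

  println(serializable(new ReplWrapper().filterFn))      // false
  println(serializable(new ReplWrapperFixed().filterFn)) // true
}
```

The same "copy to a local val first" trick works inside the shell
itself, as does moving the constants into a standalone `object`.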

-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

