spark-user mailing list archives

From Aris <arisofala...@gmail.com>
Subject Possible Code Generation Bug: Can Spark 2.0 Datasets handle Scala Value Classes?
Date Thu, 01 Sep 2016 20:58:26 GMT
Hello Spark community -

Do Spark 2.0 Datasets *not support* Scala value classes (classes that
extend AnyVal, which come with a number of restrictions)?

I am trying to do something like this:

// value class wrapping an Int -- this is the part that seems to break
case class FeatureId(value: Int) extends AnyVal

val seq = Seq(FeatureId(1), FeatureId(2), FeatureId(3))

import spark.implicits._           // brings in the implicit encoders
val ds = spark.createDataset(seq)  // compiles fine
ds.count                           // fails at runtime


This compiles, but breaks at runtime with a cryptic error along the lines
of "cannot find int at value". If I remove the "extends AnyVal" part,
everything works.
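For what it's worth, the failure looks consistent with how value classes behave outside Spark: at runtime Scala generally erases FeatureId down to a bare Int, so anything that reflects on the element type sees an Int where it expected a wrapper. A minimal plain-Scala sketch (no Spark needed; FeatureIdBoxed is a hypothetical name for the non-AnyVal workaround described above):

```scala
// Value class: at runtime Scala generally erases this to a bare Int,
// which is plausibly what trips up Spark's encoder code generation.
case class FeatureId(value: Int) extends AnyVal

// The workaround from above: the same wrapper without AnyVal is an
// ordinary case class (hypothetical name for illustration).
case class FeatureIdBoxed(value: Int)

object ValueClassDemo {
  def main(args: Array[String]): Unit = {
    val ids = Seq(FeatureId(1), FeatureId(2), FeatureId(3))
    // Inside a generic Seq the values get boxed, but field access and
    // mapping work exactly like an ordinary case class.
    println(ids.map(_.value).sum) // prints 6
  }
}
```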

Value classes are a great Scala feature for both performance and static
type checking -- are they simply prohibited in Spark Datasets?

Thanks!
