spark-user mailing list archives

From: Kefah Issa <ke...@freesoft.jo>
Subject: RDD registerAsTable gives error on regular scala class records
Date: Thu, 10 Jul 2014 12:39:04 GMT
Hi,

SQL on Spark 1.0 is an interesting feature. It works fine when the "record"
is made of a case class.

The issue I have is that each of my records has around 50 attributes. A Scala
case class cannot handle that (the hard-coded limit is 22 fields, for some
reason), so I created a regular class and defined the attributes there.


// When running with the case-class version I remove the "new"
val rdd = sc.textFile("myrecords.csv").map(line => new Record(line.split(",")))

rdd.registerAsTable("records")


// This works
// case class Record(first: String, second: String, third: String)

// This causes the registerAsTable call to fail
class Record(list: Array[String]) {
  val first = list(0)
  val second = list(1)
  val third = list(2)
}
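
For reference, both variants assume the standard Spark 1.0 SQL setup, where
registerAsTable is a method on SchemaRDD, reached through the implicit
createSchemaRDD conversion (a sketch; sc is the usual SparkContext):

import org.apache.spark.sql.SQLContext

val sqlContext = new SQLContext(sc)

// createSchemaRDD implicitly converts an RDD of Products (case classes
// are Products) into a SchemaRDD, which is what carries registerAsTable.
import sqlContext.createSchemaRDD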


When compiling, I get the following error:

value registerAsTable is not a member of org.apache.spark.rdd.RDD[Record]

What am I missing here? Or is Spark SQL 1.0 only able to deal with data sets
of at most 22 columns?
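
A possible workaround (a sketch only, untested; it assumes the implicit
conversion just requires A <: Product and infers the schema from the
constructor parameters) would be to write the wide record as a plain class
that implements Product by hand, since the 22-field ceiling applies only to
case classes:

// Hypothetical wide Record with explicit constructor parameters; the
// pattern extends to all ~50 attributes, as plain classes have no
// 22-parameter limit.
class Record(val first: String, val second: String, val third: String)
  extends Product with Serializable {

  def canEqual(that: Any): Boolean = that.isInstanceOf[Record]
  def productArity: Int = 3
  def productElement(n: Int): Any = n match {
    case 0 => first
    case 1 => second
    case 2 => third
    case _ => throw new IndexOutOfBoundsException(n.toString)
  }
}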

Regards,
- Kefah.
