spark-user mailing list archives

From Haoming Zhang <haoming.zh...@outlook.com>
Subject RE: SparkSQL with sequence file RDDs
Date Tue, 08 Jul 2014 01:39:08 GMT
Hi Michael,

Thanks for the reply.

Actually, last week I tried to play with the Product interface, but I'm not really sure whether I did it correctly. Here is what I did:

1. Created an abstract class A that implements the Product interface and has 20 parameters,
2. Created a case class B that extends A; B has 20 parameters.

I can get all of A's parameters, and also B's parameters, through the productElement function. I'm just
curious whether it's possible to convert this kind of case class to a schema, because I need to use
the .registerAsTable function to insert the case classes into a table.
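Trimmed down, the layout looks roughly like this (only two fields per class instead of 20, and the names are just placeholders):

  // Abstract parent declaring the fields; the case class supplies them,
  // and the compiler-generated Product methods on B expose them.
  abstract class A extends Product {
    def a1: String
    def a2: Int
  }

  case class B(a1: String, a2: Int) extends A

  val b = B("x", 1)
  b.productArity      // 2
  b.productElement(0) // "x"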

Best,
Haoming

From: michael@databricks.com
Date: Mon, 7 Jul 2014 17:52:34 -0700
Subject: Re: SparkSQL with sequence file RDDs
To: user@spark.apache.org



We know Scala 2.11 has removed the limit on the number of parameters, but Spark 1.0 is not compatible
with it. So now we are considering using Java beans instead of Scala case classes.
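For the JavaBean route, Spark 1.0's Java API exposes JavaSQLContext.applySchema, which builds a SchemaRDD from a bean class. A rough sketch, with a hypothetical bean written in Scala via @BeanProperty (the bean and field names are made up):

  import scala.beans.BeanProperty
  import org.apache.spark.sql.api.java.JavaSQLContext

  // Hypothetical bean; @BeanProperty generates the getters/setters that
  // bean-based schema inference needs. No 22-field limit applies here.
  class WideBean extends Serializable {
    @BeanProperty var f1: String = _
    @BeanProperty var f2: Int = _
  }

  // Usage sketch, given a JavaSparkContext jsc and a JavaRDD[WideBean] rows:
  // val javaSql = new JavaSQLContext(jsc)
  // javaSql.applySchema(rows, classOf[WideBean]).registerAsTable("wide")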



You can also manually create a class that implements scala's Product interface.  Finally,
SPARK-2179 will give you a programmatic, non-class-based way to describe the schema.  Someone
is working on this now.
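A hand-written Product class might look roughly like the following sketch (field names are hypothetical, only three fields are shown where a real record could have far more, and it assumes Spark 1.0's reflection-based schema inference reads the primary constructor parameters):

  import org.apache.spark.sql.SQLContext

  // A plain (non-case) class implementing Product by hand, so it is not
  // subject to the 22-parameter case class limit of Scala 2.10.
  class WideRecord(val f1: String, val f2: Int, val f3: Double)
      extends Product with Serializable {

    def productArity: Int = 3

    def productElement(n: Int): Any = n match {
      case 0 => f1
      case 1 => f2
      case 2 => f3
      case _ => throw new IndexOutOfBoundsException(n.toString)
    }

    def canEqual(that: Any): Boolean = that.isInstanceOf[WideRecord]
  }

  // Usage sketch, given a SparkContext sc:
  // val sqlContext = new SQLContext(sc)
  // import sqlContext.createSchemaRDD  // implicit RDD[Product] -> SchemaRDD
  // sc.parallelize(Seq(new WideRecord("x", 1, 2.0))).registerAsTable("wide")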
