spark-dev mailing list archives

From Alessandro Baretta <alexbare...@gmail.com>
Subject Re: Where are the docs for the SparkSQL DataTypes?
Date Fri, 12 Dec 2014 02:45:39 GMT
Thanks. This is useful.

Alex

On Thu, Dec 11, 2014 at 4:35 PM, Cheng, Hao <hao.cheng@intel.com> wrote:
>
> Part of it can be found at:
>
> https://github.com/apache/spark/pull/3429/files#diff-f88c3e731fcb17b1323b778807c35b38R34
>
> Sorry, the PR is still under review, but it should be informative nonetheless.
>
> Cheng Hao
>
> -----Original Message-----
> From: Alessandro Baretta [mailto:alexbaretta@gmail.com]
> Sent: Friday, December 12, 2014 6:37 AM
> To: Michael Armbrust; dev@spark.apache.org
> Subject: Where are the docs for the SparkSQL DataTypes?
>
> Michael & other Spark SQL junkies,
>
> As I read through the Spark API docs, in particular those for the
> org.apache.spark.sql package, I can't find details about the Scala
> classes representing the various SparkSQL DataTypes, for instance
> DecimalType. I do find DataType classes in org.apache.spark.sql.api.java,
> but they don't seem to match the similarly named Scala classes. For
> instance, DecimalType is documented as having a nullary constructor, but
> if I try to construct an instance of org.apache.spark.sql.DecimalType
> without any parameters, the compiler complains about a missing
> precisionInfo argument, which I have discovered can be passed in as None.
> Where is all this documented?
>
> Alex
>
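[Editor's note: the DecimalType construction discussed above can be sketched as follows. This is a sketch assuming the Spark 1.2-era Scala API, in which DecimalType takes a single precisionInfo: Option[PrecisionInfo] parameter rather than a nullary constructor; package paths and signatures differ in later Spark versions, so check the API docs for your release.]

```scala
import org.apache.spark.sql.{DecimalType, PrecisionInfo}

// An unlimited-precision decimal: pass None for precisionInfo,
// as discovered in the message above.
val unlimited = DecimalType(None)

// A fixed-precision decimal, e.g. the equivalent of SQL DECIMAL(10, 2):
// PrecisionInfo bundles the precision and scale (assumed field names).
val fixed = DecimalType(Some(PrecisionInfo(precision = 10, scale = 2)))
```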
