spark-dev mailing list archives

From Aaron Davidson <ilike...@gmail.com>
Subject Re: enum-like types in Spark
Date Thu, 05 Mar 2015 07:45:01 GMT
That's kinda annoying, but it's just a little extra boilerplate. Can you
call it as StorageLevel.DiskOnly() from Java? Would it also work if they
were case classes with empty constructors, without the field?
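For reference, a sketch of the case-class variant being asked about (hedged — the Java-interop behavior is exactly the open question, so this only shows the Scala side, with hypothetical value names):

```scala
// Sketch of the variant in question: empty-parameter case classes
// instead of private case objects hidden behind vals.
sealed abstract class StorageLevel

object StorageLevel {
  case class MemoryOnly() extends StorageLevel
  case class DiskOnly() extends StorageLevel
}

// Case classes give structural equality, so repeated calls compare equal:
//   StorageLevel.MemoryOnly() == StorageLevel.MemoryOnly()
```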

On Wed, Mar 4, 2015 at 11:35 PM, Xiangrui Meng <mengxr@gmail.com> wrote:

> `case object` inside an `object` doesn't show up in Java. This is the
> minimal code I found to make everything show up correctly in both
> Scala and Java:
>
> sealed abstract class StorageLevel // cannot be a trait
>
> object StorageLevel {
>   private[this] case object _MemoryOnly extends StorageLevel
>   final val MemoryOnly: StorageLevel = _MemoryOnly
>
>   private[this] case object _DiskOnly extends StorageLevel
>   final val DiskOnly: StorageLevel = _DiskOnly
> }
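> A sanity-check sketch of how the public vals behave from Scala (hypothetical usage, not from the original message) — stable-identifier patterns work, but the compiler cannot prove exhaustiveness because the concrete case objects stay private:

```scala
// The minimal pattern from above, plus hypothetical usage.
sealed abstract class StorageLevel

object StorageLevel {
  private[this] case object _MemoryOnly extends StorageLevel
  final val MemoryOnly: StorageLevel = _MemoryOnly

  private[this] case object _DiskOnly extends StorageLevel
  final val DiskOnly: StorageLevel = _DiskOnly
}

// Matching on the vals uses stable-identifier patterns (equality checks);
// the wildcard is needed because exhaustiveness can't be verified.
def describe(level: StorageLevel): String = level match {
  case StorageLevel.MemoryOnly => "memory"
  case StorageLevel.DiskOnly   => "disk"
  case _                       => "unknown"
}
```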
>
> On Wed, Mar 4, 2015 at 8:10 PM, Patrick Wendell <pwendell@gmail.com>
> wrote:
> > I like #4 as well and agree with Aaron's suggestion.
> >
> > - Patrick
> >
> > On Wed, Mar 4, 2015 at 6:07 PM, Aaron Davidson <ilikerps@gmail.com>
> > wrote:
> >> I'm cool with #4 as well, but make sure we dictate that the values
> >> should be defined within an object with the same name as the
> >> enumeration (like we do for StorageLevel). Otherwise we may pollute a
> >> higher namespace.
> >>
> >> e.g. we SHOULD do:
> >>
> >> trait StorageLevel
> >> object StorageLevel {
> >>   case object MemoryOnly extends StorageLevel
> >>   case object DiskOnly extends StorageLevel
> >> }
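> >> e.g. a sketch of what we should NOT do — defining the values beside the trait leaks them straight into the enclosing scope:

```scala
// Anti-pattern sketch: without the wrapping object, MemoryOnly and
// DiskOnly land directly in the enclosing package/scope rather than
// being namespaced under StorageLevel.
sealed trait StorageLevel
case object MemoryOnly extends StorageLevel
case object DiskOnly extends StorageLevel
```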
> >>
> >> On Wed, Mar 4, 2015 at 5:37 PM, Michael Armbrust <michael@databricks.com>
> >> wrote:
> >>
> >>> #4 with a preference for CamelCaseEnums
> >>>
> >>> On Wed, Mar 4, 2015 at 5:29 PM, Joseph Bradley <joseph@databricks.com>
> >>> wrote:
> >>>
> >>> > another vote for #4
> >>> > People are already used to adding "()" in Java.
> >>> >
> >>> >
> >>> > On Wed, Mar 4, 2015 at 5:14 PM, Stephen Boesch <javadba@gmail.com>
> >>> > wrote:
> >>> >
> >>> > > #4 but with MemoryOnly (more scala-like)
> >>> > >
> >>> > > http://docs.scala-lang.org/style/naming-conventions.html
> >>> > >
> >>> > > Constants, Values, Variable and Methods
> >>> > >
> >>> > > Constant names should be in upper camel case. That is, if the
> >>> > > member is final, immutable and it belongs to a package object or
> >>> > > an object, it may be considered a constant (similar to Java's
> >>> > > static final members):
> >>> > >
> >>> > >     object Container {
> >>> > >       val MyConstant = ...
> >>> > >     }
> >>> > >
> >>> > >
> >>> > > 2015-03-04 17:11 GMT-08:00 Xiangrui Meng <mengxr@gmail.com>:
> >>> > >
> >>> > > > Hi all,
> >>> > > >
> >>> > > > There are many places where we use enum-like types in Spark, but
> >>> > > > in different ways. Every approach has both pros and cons. I
> >>> > > > wonder whether there should be an "official" approach for
> >>> > > > enum-like types in Spark.
> >>> > > >
> >>> > > > 1. Scala's Enumeration (e.g., SchedulingMode, WorkerState, etc.)
> >>> > > >
> >>> > > > * All types show up as Enumeration.Value in Java.
> >>> > > >
> >>> > > > http://spark.apache.org/docs/latest/api/java/org/apache/spark/scheduler/SchedulingMode.html
> >>> > > >
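> >>> > > > (For reference, a sketch of what the Enumeration pattern looks like — value names illustrative:)

```scala
// Sketch of approach 1: Scala's built-in Enumeration. Every value has
// the type SchedulingMode.Value, which is all that Java callers see.
object SchedulingMode extends Enumeration {
  type SchedulingMode = Value
  val FAIR, FIFO, NONE = Value
}
```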
> >>> > > > 2. Java's Enum (e.g., SaveMode, IOMode)
> >>> > > >
> >>> > > > * Implementation must be in a Java file.
> >>> > > > * Values don't show up in the ScalaDoc:
> >>> > > >
> >>> > > > http://spark.apache.org/docs/latest/api/scala/#org.apache.spark.network.util.IOMode
> >>> > > >
> >>> > > > 3. Static fields in Java (e.g., TripletFields)
> >>> > > >
> >>> > > > * Implementation must be in a Java file.
> >>> > > > * Doesn't need "()" in Java code.
> >>> > > > * Values don't show up in the ScalaDoc:
> >>> > > >
> >>> > > > http://spark.apache.org/docs/latest/api/scala/#org.apache.spark.graphx.TripletFields
> >>> > > >
> >>> > > > 4. Objects in Scala. (e.g., StorageLevel)
> >>> > > >
> >>> > > > * Needs "()" in Java code.
> >>> > > > * Values show up in both ScalaDoc and JavaDoc:
> >>> > > >
> >>> > > > http://spark.apache.org/docs/latest/api/scala/#org.apache.spark.storage.StorageLevel$
> >>> > > >
> >>> > > > http://spark.apache.org/docs/latest/api/java/org/apache/spark/storage/StorageLevel.html
> >>> > > >
> >>> > > > It would be great if we had an "official" approach for this as
> >>> > > > well as a naming convention for enum-like values ("MEMORY_ONLY"
> >>> > > > or "MemoryOnly"). Personally, I like 4) with "MEMORY_ONLY". Any
> >>> > > > thoughts?
> >>> > > >
> >>> > > > Best,
> >>> > > > Xiangrui
> >>> > > >
> >>> > > > ---------------------------------------------------------------------
> >>> > > > To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> >>> > > > For additional commands, e-mail: dev-help@spark.apache.org
