spark-dev mailing list archives

From Xiangrui Meng <>
Subject enum-like types in Spark
Date Thu, 05 Mar 2015 01:11:01 GMT
Hi all,

There are many places where we use enum-like types in Spark, but in
different ways. Every approach has both pros and cons. I wonder
whether there should be an “official” approach for enum-like types in
Spark.
1. Scala’s Enumeration (e.g., SchedulingMode, WorkerState, etc)

* All types show up as Enumeration.Value in Java.
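For reference, a minimal sketch of approach 1; the member names are illustrative, not copied from the Spark source:

```scala
// Sketch of the Scala Enumeration approach (names are illustrative).
object SchedulingMode extends Enumeration {
  type SchedulingMode = Value
  val FAIR, FIFO, NONE = Value
}
```

Lookup by name comes for free (`SchedulingMode.withName("FAIR")`), but from Java every value is typed as `Enumeration.Value`, which is the drawback noted above.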

2. Java’s Enum (e.g., SaveMode, IOMode)

* Implementation must be in a Java file.
* Values don’t show up in the ScalaDoc.
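A sketch of approach 2 in the style of SaveMode (the constant names here are illustrative):

```java
// Sketch of the Java enum approach; a public enum must live in its
// own SaveMode.java file, which is the restriction noted above.
public enum SaveMode {
    APPEND,
    OVERWRITE,
    ERROR_IF_EXISTS,
    IGNORE
}
```

Java enums give exhaustive `switch` support and `valueOf`/`values()` for free, at the cost of ScalaDoc visibility.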

3. Static fields in Java (e.g., TripletFields)

* Implementation must be in a Java file.
* Doesn’t need “()” in Java code.
* Values don't show up in the ScalaDoc.
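A sketch of approach 3 in the style of TripletFields; the fields here are a simplified subset for illustration, not the actual Spark definition:

```java
// Sketch of the static-fields approach: each constant is a pre-built
// immutable instance, so callers write TripletFields.ALL with no
// trailing "()". Must also live in its own Java file.
public class TripletFields {
    public final boolean useSrc;
    public final boolean useDst;

    private TripletFields(boolean useSrc, boolean useDst) {
        this.useSrc = useSrc;
        this.useDst = useDst;
    }

    public static final TripletFields NONE = new TripletFields(false, false);
    public static final TripletFields ALL  = new TripletFields(true, true);
}
```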

4. Objects in Scala (e.g., StorageLevel)

* Needs “()” in Java code.
* Values show up in both ScalaDoc and JavaDoc.
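A sketch of approach 4 in the style of StorageLevel; the fields are a simplified subset for illustration:

```scala
// Sketch of the Scala-object approach: a class with a private
// constructor and singleton values on the companion object.
class StorageLevel private (val useDisk: Boolean, val useMemory: Boolean)

object StorageLevel {
  val DISK_ONLY   = new StorageLevel(useDisk = true,  useMemory = false)
  val MEMORY_ONLY = new StorageLevel(useDisk = false, useMemory = true)
}
```

Because the vals compile to methods, Java callers write `StorageLevel.MEMORY_ONLY()` with the trailing “()” noted above, but both ScalaDoc and JavaDoc document the values.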

It would be great if we have an “official” approach for this as well
as the naming convention for enum-like values (“MEMORY_ONLY” or
“MemoryOnly”). Personally, I like 4) with “MEMORY_ONLY”. Any thoughts?


