spark-user mailing list archives

From Gary Malouf <>
Subject Re: Timezone Conversion Utilities
Date Wed, 04 Sep 2013 16:35:09 GMT
To be clear, I was not referring to the Spark team rolling their own
Date/Time utilities.  In general, though, I strongly disagree with just
adding one's own personal project dependencies to SparkBuild.scala - this
quickly creates confusion and maintainability issues when it comes time to
upgrade.  It appears we will just have to deal with the ADD_JARS property
for the foreseeable future.
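[Archive note: the ADD_JARS workflow mentioned above looks roughly like the
following sketch. The jar path is a placeholder, not a value from the thread;
ADD_JARS was the mechanism spark-shell used at the time to put extra jars on
the shell's classpath without modifying SparkBuild.scala.]

```shell
# Make an external date/time library (e.g. nscala-time) visible to the
# Spark shell without adding it as a build dependency.
# The jar path below is a placeholder - substitute your own.
ADD_JARS=/path/to/nscala-time.jar ./spark-shell
```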

On Wed, Sep 4, 2013 at 12:17 PM, Mark Hamstra <>wrote:

> A couple of lines to include a build dependency and import a library, vs.
> all of the time to develop and maintain our own time-and-date code, or all
> of the user headache of having to work around our choice to link in a
> library that doesn't fit their particular needs.
> Until there is an obvious, stable, and expected-in-almost-all-cases
> third-party time-and-date library to choose, I strongly urge that we not
> bind Spark to a particular time-and-date library.  (And there are a lot of
> better things we could be doing with our time than developing yet another
> time-and-date implementation of our own.)
> On Wed, Sep 4, 2013 at 7:45 AM, Gary Malouf <> wrote:
>> More setup that a user needs to do to reach his functional goals.
>> On Wed, Sep 4, 2013 at 9:40 AM, Mark Hamstra <>wrote:
>>> Why?  What is wrong with using the extant libraries?
>>> On Wed, Sep 4, 2013 at 6:37 AM, Gary Malouf <>wrote:
>>>> Are there any built-in functions for timezone conversions?  I can
>>>> obviously bring in NScalaTime and other external libraries. However, since
>>>> this is probably a common need across companies, I feel it would make more
>>>> sense to provide this out of the box.
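[Archive note: for reference, the kind of timezone conversion being discussed
can be done with the JDK alone, with no Spark support or external library.
A minimal sketch using java.text/java.util - the object name and timestamp
are illustrative, not from the thread:]

```scala
import java.text.SimpleDateFormat
import java.util.{Date, TimeZone}

object TimezoneDemo {
  // Render the same instant in a given timezone. The instant itself is
  // unchanged; only its textual representation depends on the zone.
  def format(instant: Date, zoneId: String): String = {
    val fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm")
    fmt.setTimeZone(TimeZone.getTimeZone(zoneId))
    fmt.format(instant)
  }

  def main(args: Array[String]): Unit = {
    // 1378312509000 ms since the epoch = 2013-09-04 16:35:09 GMT
    val instant = new Date(1378312509000L)
    println(format(instant, "GMT"))              // prints "2013-09-04 16:35"
    println(format(instant, "America/New_York")) // prints "2013-09-04 12:35"
  }
}
```

Libraries like nscala-time (a Scala wrapper around Joda-Time) make this more
pleasant, but they must be brought in by the user, e.g. via ADD_JARS.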
