spark-user mailing list archives

From Mark Hamstra <m...@clearstorydata.com>
Subject Re: Timezone Conversion Utilities
Date Wed, 04 Sep 2013 16:48:07 GMT
I'm not saying that users should add their own dependencies to
SparkBuild.scala and build their own custom version of Spark.  What I am
saying is that I don't see any reason for us to bind users to a particular
library by linking it directly into Spark when they can very easily include,
in their own project's build configuration and imports, whichever library
best fits their particular needs.
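
For illustration, here's a minimal sketch of what that looks like from the
user's side; the library (nscala-time, a Scala wrapper around Joda-Time) and
the version number are just examples, not an endorsement:

    // In the user's own build.sbt -- not in SparkBuild.scala
    libraryDependencies += "com.github.nscala-time" %% "nscala-time" % "0.4.2"

    // In the user's Spark code (e.g. in the spark-shell, where `sc` exists):
    import com.github.nscala_time.time.Imports._
    import org.joda.time.DateTimeZone

    // Convert epoch-millisecond timestamps to wall-clock strings in a zone.
    val stamps = sc.parallelize(Seq(1378313287000L, 1378316887000L))
    val eastern = stamps.map { ms =>
      new DateTime(ms, DateTimeZone.forID("America/New_York")).toString
    }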

We're trying to build a framework that users will then build upon.  That
means we deliberately choose not to make every design decision for them in
the name of maximum user convenience: when there is no clear and
uncontroversial choice of third-party library, doing so inevitably ends up
inconveniencing some users whose needs we did not fully anticipate, and it
often ends up inconveniencing ourselves by committing to particular,
immature libraries, only to have those commitments look unwise later as
libraries evolve, change names and transitive dependencies, and so on.


On Wed, Sep 4, 2013 at 9:35 AM, Gary Malouf <malouf.gary@gmail.com> wrote:

> To be clear, I was not referring to the Spark team rolling their own
> Date/Time utilities.  In general, though, I strongly disagree with just
> adding your own personal project dependencies to SparkBuild.scala; this
> quickly creates confusion and maintainability issues when it comes time to
> upgrade.  It appears we will just have to deal with the ADD_JARS property
> for the foreseeable future.
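
For reference, the ADD_JARS route amounts to roughly this when launching the
spark-shell (the jar path here is illustrative):

    ADD_JARS=/path/to/nscala-time.jar ./spark-shell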
>
>
> On Wed, Sep 4, 2013 at 12:17 PM, Mark Hamstra <mark@clearstorydata.com> wrote:
>
>> A couple of lines to include a build dependency and import a library vs.
>> all of the time to develop and maintain our own time-and-date code, or all
>> of the user headache of having to work around our choice to link in a
>> library that doesn't fit their particular needs.
>>
>> Until there is an obvious, stable, and expected-in-almost-all-cases
>> third-party time-and-date library to choose, I strongly urge that we do not
>> bind Spark to a particular time-and-date library.  (And there are far
>> better things that we could be doing with our time than developing yet
>> another time-and-date implementation of our own.)
>>
>>
>> On Wed, Sep 4, 2013 at 7:45 AM, Gary Malouf <malouf.gary@gmail.com> wrote:
>>
>>> It's more setup that a user needs to do to reach his functional goals.
>>>
>>>
>>> On Wed, Sep 4, 2013 at 9:40 AM, Mark Hamstra <mark@clearstorydata.com> wrote:
>>>
>>>> Why?  What is wrong with using the extant libraries?
>>>>
>>>>
>>>>
>>>> On Wed, Sep 4, 2013 at 6:37 AM, Gary Malouf <malouf.gary@gmail.com> wrote:
>>>>
>>>>> Are there any built-in functions for timezone conversions?  I can
>>>>> obviously bring in NScalaTime and other external libraries.  However,
>>>>> being that this is probably a common need across companies I feel like it would
>>>>> make more sense to provide this out of the box.
>>>>>
>>>>
>>>>
>>>
>>
>
