spark-user mailing list archives

From Gary Malouf <malouf.g...@gmail.com>
Subject Re: Timezone Conversion Utilities
Date Thu, 05 Sep 2013 16:51:00 GMT
Yup, we are making that work.


On Wed, Sep 4, 2013 at 9:19 PM, Matei Zaharia <matei.zaharia@gmail.com> wrote:

> Ah, got it. You can do one other thing to shorten the list: package your
> application into a single "assembly JAR". For SBT you can use this plugin:
> https://github.com/sbt/sbt-assembly or for Maven use this:
> http://maven.apache.org/plugins/maven-shade-plugin/.
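The sbt-assembly setup Matei suggests can be sketched as follows. The plugin version and Spark coordinates below are illustrative assumptions, not taken from the thread; check the sbt-assembly README for current values.

```scala
// project/plugins.sbt -- plugin version is an assumption; see the sbt-assembly README
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.9.1")

// build.sbt -- mark Spark itself as "provided" so it is not bundled into the assembly
import AssemblyKeys._

assemblySettings

libraryDependencies += "org.apache.spark" %% "spark-core" % "0.8.0-incubating" % "provided"
```

With this in place, `sbt assembly` produces a single JAR under `target/` containing the application and its third-party dependencies, so only one path needs to be shipped.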
>
> Matei
>
> On Sep 4, 2013, at 10:21 AM, Gary Malouf <malouf.gary@gmail.com> wrote:
>
> That's how I do it now; the list is getting lengthy, but we are automating
> the retrieval of the JARs and the building of that list in Ansible.
>
>
> On Wed, Sep 4, 2013 at 12:55 PM, Matei Zaharia <matei.zaharia@gmail.com> wrote:
>
>> Hi Gary,
>>
>> Just to be clear, if you want to use third-party libraries in Spark (or
>> even your own code), you *don't* need to modify SparkBuild.scala. Just pass
>> a list of JARs containing your dependencies when you create your
>> SparkContext. See
>> http://spark.incubator.apache.org/docs/latest/quick-start.html for
>> details. Spark will automatically ship those JARs to worker nodes and put
>> them on the classpath for just this job.
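As a sketch of what Matei describes, the SparkContext constructor of that era accepted a list of JARs directly; the master URL and JAR path below are placeholders for illustration, not values from the thread:

```scala
import org.apache.spark.SparkContext

// Master URL, Scala version, and JAR path are placeholders.
val sc = new SparkContext(
  "spark://master:7077",                          // cluster master
  "MyApp",                                        // application name
  System.getenv("SPARK_HOME"),                    // Spark installation on workers
  Seq("target/scala-2.9.3/my-app-assembly.jar")   // JARs shipped to worker nodes
)
```

Spark copies each listed JAR to the workers and prepends it to the task classpath for this job only, so nothing in SparkBuild.scala needs to change.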
>>
>> Matei
>>
>> On Sep 4, 2013, at 9:35 AM, Gary Malouf <malouf.gary@gmail.com> wrote:
>>
>> To be clear, I was not referring to the Spark team rolling their own
>> Date/Time utilities.  In general though, I strongly disagree with just
>> adding your own personal project dependencies to SparkBuild.scala - this
>> quickly creates confusion and maintainability issues when one looks at
>> upgrading.  It appears we will just have to deal with the ADD_JARS property
>> for the foreseeable future.
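For reference, the ADD_JARS property Gary mentions is an environment variable read by spark-shell at startup; the JAR names here are hypothetical:

```shell
# Hypothetical JAR paths; a comma-separated list picked up by spark-shell.
ADD_JARS=lib/nscala-time_2.9.3-0.4.2.jar,lib/joda-time-2.2.jar ./spark-shell
```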
>>
>>
>> On Wed, Sep 4, 2013 at 12:17 PM, Mark Hamstra <mark@clearstorydata.com> wrote:
>>
>>> A couple of lines to include a build dependency and import a library, vs.
>>> all of the time to develop and maintain our own time-and-date code, or all
>>> of the user headache of having to work around our choice to link in a
>>> library that doesn't fit their particular needs.
>>>
>>> Until there is an obvious, stable, and expected-in-almost-all-cases
>>> third-party time-and-date library to choose, I strongly urge that we do not
>>> bind Spark to a particular time-and-date library.  (And there are a lot of
>>> better things that we could be doing with our time than developing yet
>>> another time-and-date implementation of our own.)
>>>
>>>
>>> On Wed, Sep 4, 2013 at 7:45 AM, Gary Malouf <malouf.gary@gmail.com> wrote:
>>>
>>>> It's more setup that a user needs to do to reach his functional goals.
>>>>
>>>>
>>>> On Wed, Sep 4, 2013 at 9:40 AM, Mark Hamstra <mark@clearstorydata.com> wrote:
>>>>
>>>>> Why?  What is wrong with using the extant libraries?
>>>>>
>>>>>
>>>>>
>>>>> On Wed, Sep 4, 2013 at 6:37 AM, Gary Malouf <malouf.gary@gmail.com> wrote:
>>>>>
>>>>>> Are there any built-in functions for timezone conversions?  I can
>>>>>> obviously bring in NScalaTime and other external libraries. However,
>>>>>> being that this is probably a common need across companies, I feel like
>>>>>> it would make more sense to provide this out of the box.
>>>>>>
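As a sketch of the approach the thread converges on -- linking a third-party library rather than a Spark built-in -- a conversion with Joda-Time (which NScalaTime wraps) looks like this, assuming the artifact is on the application's classpath:

```scala
import org.joda.time.{DateTime, DateTimeZone}

// Interpret the same instant in two zones; the epoch value is arbitrary.
val instant = new DateTime(1378399860000L, DateTimeZone.UTC)
val eastern = instant.withZone(DateTimeZone.forID("America/New_York"))
// Both values represent the same moment; only the printed offset differs.
```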
>>>>>
>>>>>
>>>>
>>>
>>
>>
>
>
