spark-dev mailing list archives

From shane knapp <skn...@berkeley.edu>
Subject Re: Ivy support in Spark vs. sbt
Date Thu, 04 Jun 2015 17:23:55 GMT
interesting...  i definitely haven't seen it happen that often in our build
system, and when it has happened, i wasn't able to determine the cause.

On Thu, Jun 4, 2015 at 10:16 AM, Marcelo Vanzin <vanzin@cloudera.com> wrote:

> On Thu, Jun 4, 2015 at 10:04 AM, shane knapp <sknapp@berkeley.edu> wrote:
>
>> this has occasionally happened on our jenkins as well (twice since last
>> august), and deleting the cache fixes it right up.
>>
>
> Yes, deleting the cache fixes things, but it's kinda annoying to have
> to do that. And yesterday, when I was testing a patch that actually
> used the Ivy feature, I had to do that multiple times... that slows
> things down a lot.
>
>
>>
>> On Thu, Jun 4, 2015 at 4:29 AM, Sean Owen <sowen@cloudera.com> wrote:
>>
>>> I've definitely seen the "dependency path must be relative" problem,
>>> and fixed it by deleting the ivy cache, but I don't know more than
>>> this.
>>>
>>> On Thu, Jun 4, 2015 at 1:33 AM, Marcelo Vanzin <vanzin@cloudera.com>
>>> wrote:
>>> > Hey all,
>>> >
>>> > I've been bitten by something really weird lately, and I'm
>>> > starting to think it's related to the Ivy support we have in
>>> > Spark, and to running unit tests that use that code.
>>> >
>>> > The first thing that happens is that, after running unit tests, my
>>> > sbt builds sometimes start failing with an error saying something
>>> > about "dependency path must be relative" (sorry, I don't have the
>>> > exact error around). The dependency path it prints is a "file:"
>>> > URL.
>>> >
>>> > I have a feeling that this is because Spark uses Ivy 2.4 while sbt
>>> > uses Ivy 2.3, and those might be incompatible. So if they get mixed
>>> > up, things can break.
>>> >
>>> > The second is that sometimes unit tests fail with some weird error
>>> > downloading dependencies. When checking the Ivy metadata in
>>> > ~/.ivy2/cache, the offending dependencies are pointing to my local
>>> > Maven repo (I have "maven-local" as one of the entries in my
>>> > ~/.sbt/repositories).
>>> >
>>> > My feeling in this case is that Spark's version of Ivy somehow
>>> > doesn't handle that case.
>>> >
>>> > So, long story short:
>>> >
>>> > - Has anyone run into either of these problems?
>>> > - Is it possible to set some env variable or something during
>>> >   tests to force them to use their own directory instead of
>>> >   messing up and breaking my ~/.ivy2?
>>> >
>>> >
>>> > --
>>> > Marcelo
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>> For additional commands, e-mail: dev-help@spark.apache.org
>>>
>>>
>>
>
>
> --
> Marcelo
>
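
[Editor's note] The workaround everyone in the thread converges on —
deleting the Ivy cache so the next build re-resolves everything from
scratch — amounts to a one-liner. The path below assumes Ivy's default
layout; if ivy.home is overridden, adjust accordingly.

```shell
# Wipe Ivy's cache; the next sbt or Spark build will re-download and
# re-resolve all dependencies from scratch.
# ~/.ivy2/cache is Ivy's default cache location.
rm -rf "$HOME/.ivy2/cache"
```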
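Marcelo's isolation question might be approached as sketched below.
This is a guess at a setup, not a confirmed fix: it assumes sbt honors
the sbt.ivy.home system property, and that Spark's spark.jars.ivy
setting (which SparkConf can pick up from spark.* system properties in
tests) controls where Spark's own Ivy support resolves packages.

```shell
# Hedged sketch: point both sbt's and Spark's Ivy resolution at a
# throwaway directory so unit tests cannot touch ~/.ivy2.
IVY_SANDBOX="$(mktemp -d)"

# Build the sbt invocation; run it from the Spark source tree.
SBT_CMD="./build/sbt -Dsbt.ivy.home=$IVY_SANDBOX -Dspark.jars.ivy=$IVY_SANDBOX test"
echo "$SBT_CMD"
```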
