spark-user mailing list archives

From Khaja Mohideen <kha...@gmail.com>
Subject Re: Setting up Spark 1.1 on Windows 7
Date Sun, 21 Sep 2014 15:40:26 GMT
Setting JAVA_OPTS helped me fix the problem.
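For anyone landing here from the archives, a minimal sketch of what that fix looks like. The exact flag values are an assumption taken from the memory options quoted later in this thread, not from the original poster's environment:

```shell
# Give the JVM that sbt launches more heap and PermGen space
# (values are an assumption; tune -Xmx to your machine's RAM).
export JAVA_OPTS="-Xmx2g -XX:MaxPermSize=512m -XX:ReservedCodeCacheSize=512m"
echo "$JAVA_OPTS"
# On Windows cmd the equivalent is:
#   set JAVA_OPTS=-Xmx2g -XX:MaxPermSize=512m -XX:ReservedCodeCacheSize=512m
# Then re-run the build: sbt assembly
```

Note that sbt reads JAVA_OPTS (and SBT_OPTS) when launching its JVM, whereas Maven-specific variables are ignored by sbt, which is likely why the Maven-style setting quoted below had no effect.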

Thanks,
-Khaja

On Sun, Sep 21, 2014 at 9:25 AM, Khaja Mohideen <khajam@gmail.com> wrote:

> I was able to move past this error by deleting the .ivy2/cache folder.
>
> However, I am running into an out of memory error
> [error] java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java heap space
> [error] Use 'last' for the full log.
>
> This is despite the fact that I have set m2_opts like this:
> -Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m
>
> Am I missing something?
>
> thanks,
> -Khaja
>
> On Sun, Sep 21, 2014 at 7:39 AM, Khaja M <khajam@gmail.com> wrote:
>
>> Hi:
>>
>> I am trying to set up Spark 1.1 on a Windows 7 box. When I run the sbt
>> assembly command, this is the error I am seeing.
>>
>> [error] (streaming-flume-sink/*:update) sbt.ResolveException: unresolved dependency: commons-lang#commons-lang;2.6: configuration not found in commons-lang#commons-lang;2.6: 'compile'. It was required from org.apache.avro#avro-compiler;1.7.3 compile
>> [error] (core/*:update) sbt.ResolveException: unresolved dependency: org.slf4j#slf4j-api;1.7.5: configuration not found in org.slf4j#slf4j-api;1.7.5: 'master'. It was required from org.apache.spark#spark-core_2.10;1.1.0 compile
>> [error] Total time: 10 s, completed Sep 21, 2014 7:13:16 AM
>>
>> I looked at my local Maven repo and found that both dependencies are
>> there:
>> 1. commons-lang 2.6
>> 2. org.slf4j:slf4j-api 1.7.5
>>
>> Any ideas on what I am missing?
>> Thanks,
>> -Khaja
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/Setting-up-Spark-1-1-on-Windows-7-tp14759.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>> For additional commands, e-mail: user-help@spark.apache.org
>>
>>
>

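For reference, the cache reset that resolved the "configuration not found" errors above can be sketched as follows. The `~/.ivy2/cache` path is sbt's default Ivy cache location; it is an assumption that a non-default setup does not relocate it:

```shell
# Remove sbt's Ivy dependency cache so corrupted module descriptors
# are re-downloaded on the next build (path is sbt's default).
IVY_CACHE="$HOME/.ivy2/cache"
rm -rf "$IVY_CACHE"
# On Windows cmd the equivalent is:
#   rmdir /s /q %USERPROFILE%\.ivy2\cache
# Then re-run the build: sbt assembly
```

Deleting only the cache (not the whole `~/.ivy2` directory) preserves any locally published artifacts while still forcing fresh resolution.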