spark-user mailing list archives

From Marcelo Vanzin <van...@cloudera.com>
Subject Re: Adding external jar to spark-shell classpath in spark 1.0
Date Wed, 11 Jun 2014 17:35:52 GMT
Ah, not that it should matter, but I'm on Linux and you seem to be on
Windows... maybe there is something weird going on with the Windows
launcher?

On Wed, Jun 11, 2014 at 10:34 AM, Marcelo Vanzin <vanzin@cloudera.com> wrote:
> Just tried this and it worked fine for me:
>
> ./bin/spark-shell --jars jar1,jar2,etc,etc
>
> On Wed, Jun 11, 2014 at 10:25 AM, Ulanov, Alexander
> <alexander.ulanov@hp.com> wrote:
>> Hi,
>>
>>
>>
>> I am currently using Spark 1.0 locally on Windows 7. I would like to use
>> classes from an external jar in the spark-shell. I followed the instructions in:
>> http://mail-archives.apache.org/mod_mbox/spark-user/201402.mbox/%3CCALrNVjWWF6k=c_JrhoE9W_qaACJLD4+kbdUhfV0Pitr8H1fXnw@mail.gmail.com%3E
>>
>>
>>
>> I have set ADD_JARS="my.jar" SPARK_CLASSPATH="my.jar" in spark-shell.cmd but
>> this didn't work.
>>
>>
>>
>> I also tried running "spark-shell.cmd --jars my.jar --driver-class-path
>> my.jar --driver-library-path my.jar" and it didn't work either.
>>
>>
>>
>> I cannot load any class from my jar into the spark-shell. Btw my.jar contains a
>> simple Scala class.
>>
>>
>>
>> Best regards, Alexander
>
>
>
> --
> Marcelo
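
For reference, the comma-separated form of --jars that worked in the reply above can be sketched like this; the jar paths are hypothetical placeholders, so substitute the real locations of your jars:

```shell
# Hypothetical jar paths -- replace with your own.
JAR1="/path/to/my.jar"
JAR2="/path/to/helper.jar"

# --jars expects a single comma-separated list (no spaces, not a colon-separated classpath):
JARS="$JAR1,$JAR2"
echo "$JARS"    # prints /path/to/my.jar,/path/to/helper.jar

# Then launch the shell with the jars:
#   ./bin/spark-shell --jars "$JARS"        (Linux/macOS)
#   bin\spark-shell.cmd --jars my.jar       (Windows)
```

Note that --jars is a different mechanism from SPARK_CLASSPATH / --driver-class-path: the comma-separated list is distributed to the cluster, whereas the classpath options only affect the driver JVM.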



-- 
Marcelo
