spark-user mailing list archives

From Shivani Rao <raoshiv...@gmail.com>
Subject Re: Adding external jar to spark-shell classpath in spark 1.0
Date Thu, 12 Jun 2014 17:22:56 GMT
@Marcelo: The command ./bin/spark-shell --jars jar1,jar2,etc,etc did not
work for me on a Linux machine.

What I did instead was append the jar to the classpath in the
bin/compute-classpath.sh file, run the script, and then start the spark
shell; that worked.
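For anyone hitting the same issue, here is a minimal sketch of that
workaround (the jar path is illustrative). Rather than editing
bin/compute-classpath.sh directly, exporting SPARK_CLASSPATH before
launching should have the same effect in Spark 1.0, since the script
prepends that variable to the classpath it emits:

```shell
# Sketch of the workaround above; /path/to/my.jar is a placeholder.
# compute-classpath.sh in Spark 1.0 prepends SPARK_CLASSPATH to the
# classpath it prints, so exporting it is equivalent to appending the
# jar inside the script itself.
export SPARK_CLASSPATH="/path/to/my.jar:$SPARK_CLASSPATH"
echo "$SPARK_CLASSPATH"   # verify the jar is on the path
# ./bin/spark-shell       # then launch the shell as usual
```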


Thanks
Shivani


On Wed, Jun 11, 2014 at 10:52 AM, Andrew Or <andrew@databricks.com> wrote:

> Ah, of course, there are no application jars in spark-shell, so it seems
> that there are no workarounds for this at the moment. We will look into a
> fix shortly, but for now you will have to create an application and use
> spark-submit (or use spark-shell on Linux).
>
>
> 2014-06-11 10:42 GMT-07:00 Ulanov, Alexander <alexander.ulanov@hp.com>:
>
>> Could you elaborate on this? I don’t have an application, I just use the
>> spark shell.
>>
>>
>>
>> *From:* Andrew Or [mailto:andrew@databricks.com]
>> *Sent:* Wednesday, June 11, 2014 9:40 PM
>>
>> *To:* user@spark.apache.org
>> *Subject:* Re: Adding external jar to spark-shell classpath in spark 1.0
>>
>>
>>
>> This is a known issue: https://issues.apache.org/jira/browse/SPARK-1919.
>> We haven't found a fix yet, but for now, you can work around this by
>> including your simple class in your application jar.
>>
>>
>>
>> 2014-06-11 10:25 GMT-07:00 Ulanov, Alexander <alexander.ulanov@hp.com>:
>>
>>  Hi,
>>
>>
>>
>> I am currently using spark 1.0 locally on Windows 7. I would like to use
>> classes from an external jar in the spark-shell. I followed the
>> instructions in:
>> http://mail-archives.apache.org/mod_mbox/spark-user/201402.mbox/%3CCALrNVjWWF6k=c_JrhoE9W_qaACJLD4+kbdUhfV0Pitr8H1fXnw@mail.gmail.com%3E
>>
>>
>>
>> I have set ADD_JARS="my.jar" SPARK_CLASSPATH="my.jar" in spark-shell.cmd
>> but this didn't work.
>>
>>
>>
>> I also tried running "spark-shell.cmd --jars my.jar --driver-class-path
>> my.jar --driver-library-path my.jar" and it didn't work either.
>>
>>
>>
>> I cannot load any class from my jar into spark shell. Btw my.jar contains
>> a simple Scala class.
>>
>>
>>
>> Best regards, Alexander
>>
>>
>>
>
>
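For reference, Andrew's spark-submit suggestion upthread might look
something like the sketch below. The class name, jar paths, and master
setting are all illustrative placeholders, not taken from this thread:

```shell
# Hedged sketch of the spark-submit route: package the simple class
# into an application jar and submit it, passing any extra dependency
# jars via --jars. All names and paths below are placeholders.
./bin/spark-submit \
  --class com.example.MyApp \
  --master local[2] \
  --jars /path/to/dependency.jar \
  /path/to/my-app.jar
```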


-- 
Software Engineer
Analytics Engineering Team@ Box
Mountain View, CA
