I'm using spark-shell. The perplexing thing is that if I load the HBase
jar via spark-shell --jars, it seems to work. However, if I load it via
spark.driver.extraClassPath in the config file, it seems to fail.
What is the difference between --jars (command line) and
spark.driver.extraClassPath (config)?
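For context, here is a sketch of the two approaches being compared (the jar
path below is a placeholder, not my actual path):

```shell
# Option 1: --jars on the command line. Spark copies the jar to the driver
# AND ships it to the executors, adding it to both classpaths.
spark-shell --jars /opt/hbase/lib/hbase-common.jar

# Option 2: spark.driver.extraClassPath, e.g. in conf/spark-defaults.conf.
# This only prepends an entry to the DRIVER's JVM classpath; the jar is not
# shipped anywhere, so the path must already exist on the driver machine,
# and executors would separately need spark.executor.extraClassPath:
#
#   spark.driver.extraClassPath    /opt/hbase/lib/hbase-common.jar
#   spark.executor.extraClassPath  /opt/hbase/lib/hbase-common.jar
```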
On 7/5/16, Dima Spivak <dspivak@cloudera.com> wrote:
> Hey Robert,
>
> HBaseConfiguration is part of the hbase-common module of the HBase project.
> Are you using Maven to provide dependencies or just running java -cp?
>
> -Dima
>
> On Monday, July 4, 2016, Robert James <srobertjames@gmail.com> wrote:
>
>> When trying to load HBase via Spark, I get NoClassDefFoundError
>> org/apache/hadoop/hbase/HBaseConfiguration errors.
>>
>> How do I provide that class to Spark?
>>
>