Hey Robert,
Probably a better question to ask over at user@spark.apache.org.
hbase-common.jar would be the artifact you’d wanna put on the class path,
though.
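
For example, something like this should work (the jar path here is just a placeholder; point it at wherever your HBase install or Maven repo puts hbase-common):

```shell
# Option 1: let spark-shell distribute the jar to driver and executors
spark-shell --jars /path/to/hbase-common-1.2.0.jar

# Option 2: prepend it to the driver classpath in conf/spark-defaults.conf
# spark.driver.extraClassPath  /path/to/hbase-common-1.2.0.jar
# (note extraClassPath only affects the driver; executors need
#  spark.executor.extraClassPath or --jars as well)
```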
-Dima
On Tue, Jul 5, 2016 at 3:39 PM, Robert James <srobertjames@gmail.com> wrote:
> I'm using spark-shell. The perplexing thing is that if I load it via
> spark-shell --jars, it seems to work. However, if I load it via
> spark.driver.extraClassPath in the config file, it seems to fail.
> What is the difference between --jars (command line) and
> spark.driver.extraClassPath (config)?
>
> On 7/5/16, Dima Spivak <dspivak@cloudera.com> wrote:
> > Hey Robert,
> >
> > HBaseConfiguration is part of the hbase-common module of the HBase
> project.
> > Are you using Maven to provide dependencies or just running java -cp?
> >
> > -Dima
> >
> > On Monday, July 4, 2016, Robert James <srobertjames@gmail.com> wrote:
> >
> >> When trying to load HBase via Spark, I get NoClassDefFoundError
> >> org/apache/hadoop/hbase/HBaseConfiguration errors.
> >>
> >> How do I provide that class to Spark?
> >>
> >
>