spark-user mailing list archives

From Sujit Pal <sujitatgt...@gmail.com>
Subject Re: Lemmatization using StanfordNLP in ML 2.0
Date Sun, 18 Sep 2016 21:21:19 GMT
Hi Janardhan,

Maybe try removing the string "test" from this line in your build.sbt?
IIRC, the "test" scope restricts the models JAR to the test classpath, so
it isn't available when your main code runs.

    "edu.stanford.nlp" % "stanford-corenlp" % "3.6.0" % "test" classifier
"models",

-sujit


On Sun, Sep 18, 2016 at 11:01 AM, janardhan shetty <janardhanp22@gmail.com>
wrote:

> Hi,
>
> I am trying to use lemmatization as a transformer and added the below to
> the build.sbt:
>
>  "edu.stanford.nlp" % "stanford-corenlp" % "3.6.0",
>     "com.google.protobuf" % "protobuf-java" % "2.6.1",
>     "edu.stanford.nlp" % "stanford-corenlp" % "3.6.0" % "test" classifier
> "models",
>     "org.scalatest" %% "scalatest" % "2.2.6" % "test"
>
>
> Error:
> *Exception in thread "main" java.lang.NoClassDefFoundError:
> edu/stanford/nlp/pipeline/StanfordCoreNLP*
>
> I have tried other versions of this spark package.
>
> Any help is appreciated.
>
