spark-user mailing list archives

From: Michael Armbrust <mich...@databricks.com>
Subject: Re: shutdown local hivecontext?
Date: Mon, 03 Aug 2015 22:56:19 GMT
TestHive takes care of creating a temporary directory for each invocation
so that multiple test runs won't conflict.
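
For example, a suite can use the shared TestHive context directly and reset
it when it is done (a minimal sketch, assuming ScalaTest and that the
spark-hive test classes are on the test classpath; the suite name is
borrowed from the thread below):

import org.apache.spark.sql.hive.test.TestHive
import org.scalatest.{BeforeAndAfterAll, FunSuite}

class ScalerSuite extends FunSuite with BeforeAndAfterAll {

  test("queries run against the shared test context") {
    // TestHive is a singleton HiveContext backed by its own temporary
    // warehouse and metastore directories, so suites sharing one JVM
    // don't collide on local Hive state.
    val result = TestHive.sql("SELECT 1 + 1").collect()
    assert(result.head.getInt(0) === 2)
  }

  override def afterAll(): Unit = {
    // Drop temporary tables and cached data so the next suite starts clean.
    TestHive.reset()
  }
}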

On Mon, Aug 3, 2015 at 3:09 PM, Cesar Flores <cesar7@gmail.com> wrote:

>
> We are using a local hive context in order to run unit tests. Our unit
> tests run perfectly fine if we run them one by one using sbt, as in the
> next example:
>
> >sbt test-only com.company.pipeline.scalers.ScalerSuite.scala
> >sbt test-only com.company.pipeline.labels.ActiveUsersLabelsSuite.scala
>
> However, if we try to run them as:
>
> >sbt test-only com.company.pipeline.*
>
> we start to run into issues. It appears that the hive context is not
> properly shut down after the first test finishes. Does anyone know how to
> attack this problem? The test part of my build.sbt file looks like:
>
> libraryDependencies += "org.scalatest" % "scalatest_2.10" % "2.0" % "test",
> parallelExecution in Test := false,
> fork := true,
> javaOptions ++= Seq("-Xms512M", "-Xmx2048M", "-XX:MaxPermSize=2048M",
> "-XX:+CMSClassUnloadingEnabled")
>
> We are working with Spark 1.3.0.
>
>
> Thanks
> --
> Cesar Flores
>
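
If per-suite isolation is preferred over a shared TestHive, another option
is to have sbt fork one JVM per suite so that each test creates and tears
down its own context. A sketch in sbt 0.13 syntax (testGrouping, Tests.Group
and Tests.SubProcess are standard sbt API; the JVM options mirror the ones
in the build.sbt above):

testGrouping in Test := (definedTests in Test).value.map { suite =>
  // One forked JVM per suite: each gets a fresh HiveContext and exits
  // before the next suite starts, so no state leaks between them.
  Tests.Group(suite.name, Seq(suite), Tests.SubProcess(
    ForkOptions(runJVMOptions = Seq("-Xmx2048M", "-XX:MaxPermSize=2048M"))))
}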
