[ https://issues.apache.org/jira/browse/SPARK-15289?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Andrew Or resolved SPARK-15289.
-------------------------------
Resolution: Fixed
Fix Version/s: 2.0.0
> SQL test compilation error from merge conflict
> ----------------------------------------------
>
> Key: SPARK-15289
> URL: https://issues.apache.org/jira/browse/SPARK-15289
> Project: Spark
> Issue Type: Bug
> Components: Build, SQL
> Affects Versions: 2.0.0
> Reporter: Piotr Milanowski
> Assignee: Andrew Or
> Priority: Blocker
> Fix For: 2.0.0
>
>
> The Spark build fails during the SQL build. The failure occurs at commit 6b69b8c0c778f4cba2b281fe3ad225dc922f82d6 and at earlier commits as well; the build still succeeds at, e.g., commit c6d23b6604e85bcddbd1fb6a2c1c3edbfd2be2c1.
> Run with the command:
> ./dev/make-distribution.sh -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver -Dhadoop.version=2.6.0 -DskipTests
> Result:
> {code}
> [error] /home/bpol0421/various/spark/sql/core/src/test/scala/org/apache/spark/sql/internal/CatalogSuite.scala:282: not found: value sparkSession
> [error] val dbString = CatalogImpl.makeDataset(Seq(db), sparkSession).showString(10)
> [error] ^
> [error] /home/bpol0421/various/spark/sql/core/src/test/scala/org/apache/spark/sql/internal/CatalogSuite.scala:283: not found: value sparkSession
> [error] val tableString = CatalogImpl.makeDataset(Seq(table), sparkSession).showString(10)
> [error] ^
> [error] /home/bpol0421/various/spark/sql/core/src/test/scala/org/apache/spark/sql/internal/CatalogSuite.scala:284: not found: value sparkSession
> [error] val functionString = CatalogImpl.makeDataset(Seq(function), sparkSession).showString(10)
> [error] ^
> [error] /home/bpol0421/various/spark/sql/core/src/test/scala/org/apache/spark/sql/internal/CatalogSuite.scala:285: not found: value sparkSession
> [error] val columnString = CatalogImpl.makeDataset(Seq(column), sparkSession).showString(10)
> [error] ^
> {code}
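>
> The error itself is a plain scoping problem: after the merge, CatalogSuite still refers to a value named sparkSession that is no longer defined in the suite. Below is a minimal sketch of that failure mode and the kind of one-line repair it implies, assuming (this is not stated in the ticket) that the shared test fixture of the era exposes the active session under the name spark rather than sparkSession; the actual fix commit may differ.
> {code}
> import org.apache.spark.sql.SparkSession
>
> // Hypothetical stand-in for the shared test fixture (SharedSQLContext in
> // Spark 2.0-era suites), which provides the session as `spark`.
> trait SharedFixtureSketch {
>   protected lazy val spark: SparkSession =
>     SparkSession.builder().master("local[2]").appName("CatalogSuiteSketch").getOrCreate()
> }
>
> class CatalogSuiteSketch extends SharedFixtureSketch {
>   // Restoring the name the merged test code expects resolves
>   // "not found: value sparkSession"; renaming each call site to `spark`
>   // would work equally well.
>   private def sparkSession: SparkSession = spark
>
>   def check(): Unit = {
>     // Stands in for CatalogImpl.makeDataset(Seq(db), sparkSession).showString(10),
>     // which is package-private in Spark's SQL module.
>     sparkSession.catalog.listDatabases().show(10)
>   }
> }
> {code}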