spark-user mailing list archives

From Todd <bit1...@163.com>
Subject About Databricks's spark-sql-perf
Date Thu, 13 Aug 2015 13:49:07 GMT
Hi,
I have a question about Databricks's spark-sql-perf project at https://github.com/databricks/spark-sql-perf/

The files Tables.scala (https://github.com/databricks/spark-sql-perf/blob/master/src/main/scala/com/databricks/spark/sql/perf/bigdata/Tables.scala)
and BigData.scala (https://github.com/databricks/spark-sql-perf/blob/master/src/main/scala/com/databricks/spark/sql/perf/bigdata/BigData.scala)
are empty.
Is this intentional, or is it a bug?
Also, the following code snippet from the README.md won't compile, as there is no Tables class
defined in the org.apache.spark.sql.parquet package.
(I am using Spark 1.4.1; is the code compatible with Spark 1.4.1?)

import org.apache.spark.sql.parquet.Tables
// Tables in TPC-DS benchmark used by experiments.
val tables = Tables(sqlContext)
// Setup TPC-DS experiment
val tpcds = new TPCDS(sqlContext = sqlContext)


