From Justin Pihony <justin.pih...@gmail.com>
Subject Did DataFrames break basic SQLContext?
Date Wed, 18 Mar 2015 15:20:48 GMT
I started to play with 1.3.0 and found that there are a lot of breaking
changes. Previously, I could do the following:

    case class Foo(x: Int)
    val rdd = sc.parallelize(List(Foo(1)))
    import sqlContext._  // in 1.2.0 this import provided the implicit RDD-to-SchemaRDD conversion
    rdd.registerTempTable("foo")

Now I am no longer able to use my RDD directly and have it implicitly
become a DataFrame. The implicit conversion only gets me as far as a
DataFrameHolder, so I have to write:

    rdd.toDF.registerTempTable("foo")
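
For reference, the whole sequence I have ended up with in the 1.3.0
spark-shell looks roughly like the sketch below (the explicit implicits
import is just my reading of the migration notes; the shell may already do
it for you):

    case class Foo(x: Int)
    val rdd = sc.parallelize(List(Foo(1)))

    // the implicit conversion now only produces a DataFrameHolder,
    // and it appears to live in sqlContext.implicits._
    import sqlContext.implicits._

    rdd.toDF().registerTempTable("foo")
    sqlContext.sql("SELECT x FROM foo").collect()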

But that is kind of a pain compared to before. The other problem for me is
that I keep getting a SQLException:

    java.sql.SQLException: Failed to start database 'metastore_db' with
    class loader sun.misc.Launcher$AppClassLoader@10393e97, see the next
    exception for details.

This seems to come from a Hive dependency, whereas in 1.2.0 there was no
such dependency. I can open tickets for these, but I wanted to ask here
first... maybe I am doing something wrong?
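
Or is the expectation in 1.3 that we build a plain SQLContext ourselves
when we do not want the Hive metastore? Something like the sketch below is
what I have in mind (just my guess at the intended usage, not something I
pulled from the docs):

    import org.apache.spark.sql.SQLContext

    // a plain SQLContext built directly on the SparkContext, which
    // (I assume) should not try to start the Derby-backed metastore
    val plainSqlContext = new SQLContext(sc)
    import plainSqlContext.implicits._

    val df = sc.parallelize(List(Foo(1))).toDF()
    df.registerTempTable("foo")
    plainSqlContext.sql("SELECT x FROM foo").show()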

Thanks,
Justin





