spark-user mailing list archives

From Justin Pihony <justin.pih...@gmail.com>
Subject Re: Did DataFrames break basic SQLContext?
Date Wed, 18 Mar 2015 15:43:26 GMT
It appears that the metastore_db problem is related to
https://issues.apache.org/jira/browse/SPARK-4758. I had another shell open
that was stuck. This is probably a bug, though?

    import sqlContext.implicits._
    case class Foo(x: Int)
    val rdd = sc.parallelize(List(Foo(1)))
    rdd.toDF

results in a frozen shell after this line:

    INFO MetaStoreDirectSql: MySQL check failed, assuming we are not on
    mysql: Lexical error at line 1, column 5.  Encountered: "@" (64), after: "".

which locks the internally created metastore_db.
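For reference, here is the full pattern that works for me in a fresh 1.3 shell (assuming the `sc` and `sqlContext` values that spark-shell provides); note the trailing `._` on the import, which is what brings the `toDF` conversion into scope:

```scala
// Spark 1.3 shell session; sc and sqlContext are provided by spark-shell.
// The trailing ._ imports the implicit conversions (e.g. the one that
// adds .toDF to an RDD of case classes) -- importing the object alone
// is not enough.
import sqlContext.implicits._

case class Foo(x: Int)
val rdd = sc.parallelize(List(Foo(1)))

// Convert the RDD of case classes to a DataFrame, then register it
// as a temp table so it can be queried with SQL as in 1.2.
val df = rdd.toDF
df.registerTempTable("foo")
sqlContext.sql("SELECT x FROM foo").collect()
```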


On Wed, Mar 18, 2015 at 11:20 AM, Justin Pihony <justin.pihony@gmail.com>
wrote:

> I started to play with 1.3.0 and found that there are a lot of breaking
> changes. Previously, I could do the following:
>
>     case class Foo(x: Int)
>     val rdd = sc.parallelize(List(Foo(1)))
>     import sqlContext._
>     rdd.registerTempTable("foo")
>
> Now, I am not able to directly use my RDD object and have it implicitly
> become a DataFrame. Instead, the implicit conversion goes through a
> DataFrameHolder, so I have to write:
>
>     rdd.toDF.registerTempTable("foo")
>
> But that is a bit of a pain in comparison. The other problem for me is that
> I keep getting a SQLException:
>
>     java.sql.SQLException: Failed to start database 'metastore_db' with
> class loader  sun.misc.Launcher$AppClassLoader@10393e97, see the next
> exception for details.
>
> This seems to introduce a dependency on Hive, whereas previously (1.2.0)
> there was no such dependency. I can open tickets for these, but wanted to
> ask here first... maybe I am doing something wrong?
>
> Thanks,
> Justin
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Did-DataFrames-break-basic-SQLContext-tp22120.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>
>
