spark-user mailing list archives

From Yin Huai <yh...@databricks.com>
Subject Re: Spark 1.4.0-rc4 HiveContext.table("db.tbl") NoSuchTableException
Date Thu, 04 Jun 2015 19:04:08 GMT
Hi Doug,

sqlContext.table does not officially support database names; it only
accepts a table name as its parameter. We will add a method that
supports a database name in a future release.
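In the meantime, two common workarounds are to qualify the table inside a SQL statement, or to switch the session's current database first. A minimal sketch, assuming an existing HiveContext named sqlContext and a Hive table tbl in database db:

```scala
// Workaround sketch: read a table outside the default database
// without sqlContext.table("db.tbl") support.
// Assumes `sqlContext` is a HiveContext and `db.tbl` exists in the metastore.

// Option 1: fully qualify the table in a SQL statement.
val df1 = sqlContext.sql("SELECT * FROM db.tbl")

// Option 2: switch the current database, then use table() with a bare name.
sqlContext.sql("USE db")
val df2 = sqlContext.table("tbl")
```

Note that option 2 changes the current database for the whole session, which may affect later unqualified table references.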

Thanks,

Yin

On Thu, Jun 4, 2015 at 8:10 AM, Doug Balog <doug.sparkuser@dugos.com> wrote:

> Hi Yin,
>  I’m very surprised to hear that it’s not supported in 1.3, because I’ve
> been using it since 1.3.0.
> It worked great up until SPARK-6908 was merged into master.
>
> What is the supported way to get a DataFrame for a table that is not in the
> default database?
>
> IMHO, if you are not going to support “databaseName.tableName”,
> sqlContext.table() should have an overload that takes a database and a table,
> i.e.
>
> def table(databaseName: String, tableName: String): DataFrame =
>   DataFrame(this, catalog.lookupRelation(Seq(databaseName, tableName)))
>
> The handling of databases in Spark (sqlContext, hiveContext, Catalog) could
> be better.
>
> Thanks,
>
> Doug
>
> > On Jun 3, 2015, at 8:21 PM, Yin Huai <yhuai@databricks.com> wrote:
> >
> > Hi Doug,
> >
> > Actually, sqlContext.table does not support a database name in either
> Spark 1.3 or Spark 1.4. We will support it in a future version.
> >
> > Thanks,
> >
> > Yin
> >
> >
> >
> > On Wed, Jun 3, 2015 at 10:45 AM, Doug Balog <doug.sparkuser@dugos.com>
> wrote:
> > Hi,
> >
> > sqlContext.table("db.tbl") isn’t working for me; I get a
> NoSuchTableException.
> >
> > But I can access the table via
> >
> > sqlContext.sql("select * from db.tbl")
> >
> > So I know it has the table info from the metastore.
> >
> > Anyone else seeing this?
> >
> > I’ll keep digging.
> > I compiled via make-distribution.sh -Pyarn -Phadoop-2.4 -Phive
> -Phive-thriftserver
> > It worked for me in 1.3.1.
> >
> > Cheers,
> >
> > Doug
> >
> >
> > ---------------------------------------------------------------------
> > To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> > For additional commands, e-mail: user-help@spark.apache.org
> >
> >
>
>
