spark-dev mailing list archives

From Bobby Chowdary <bobby.chowdar...@gmail.com>
Subject Re: [VOTE] Release Apache Spark 1.4.0 (RC4)
Date Fri, 05 Jun 2015 21:41:23 GMT
Thanks, Yin!

Everything else works great!

+1 (non-binding)

On Fri, Jun 5, 2015 at 2:11 PM, Yin Huai <yhuai@databricks.com> wrote:

> Hi Bobby,
>
> sqlContext.table("test.test1") is not officially supported in 1.3. For
> now, please use "use database" as a workaround. We will add support for it.
>
> Thanks,
>
> Yin
>
> On Fri, Jun 5, 2015 at 12:18 PM, Bobby Chowdary <
> bobby.chowdary03@gmail.com> wrote:
>
>> Not sure if it's a blocker, but there might be a minor issue with the
>> Hive context; there is also a workaround.
>>
>> *Works:*
>>
>> from pyspark.sql import HiveContext
>>
>> sqlContext = HiveContext(sc)
>> df = sqlContext.sql("select * from test.test1")
>>
>> *Does not Work:*
>>
>>  df = sqlContext.table("test.test1")
>>
>> Py4JJavaError: An error occurred while calling o260.table. :
>> org.apache.spark.sql.catalyst.analysis.NoSuchTableException
>>     at org.apache.spark.sql.hive.client.ClientInterface$anonfun$getTable$1.apply(ClientInterface.scala:112)
>>     at org.apache.spark.sql.hive.client.ClientInterface$anonfun$getTable$1.apply(ClientInterface.scala:112)
>>     at scala.Option.getOrElse(Option.scala:120)
>>     at org.apache.spark.sql.hive.client.ClientInterface$class.getTable(ClientInterface.scala:112)
>>     at org.apache.spark.sql.hive.client.ClientWrapper.getTable(ClientWrapper.scala:58)
>>     at org.apache.spark.sql.hive.HiveMetastoreCatalog.lookupRelation(HiveMetastoreCatalog.scala:227)
>>     at org.apache.spark.sql.hive.HiveContext$anon$2.org$apache$spark$sql$catalyst$analysis$OverrideCatalog$super$lookupRelation(HiveContext.scala:370)
>>     at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$anonfun$lookupRelation$3.apply(Catalog.scala:165)
>>     at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$anonfun$lookupRelation$3.apply(Catalog.scala:165)
>>     at scala.Option.getOrElse(Option.scala:120)
>>     at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$class.lookupRelation(Catalog.scala:165)
>>     at org.apache.spark.sql.hive.HiveContext$anon$2.lookupRelation(HiveContext.scala:370)
>>     at org.apache.spark.sql.SQLContext.table(SQLContext.scala:754)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke(Method.java:497)
>>     at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
>>     at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
>>     at py4j.Gateway.invoke(Gateway.java:259)
>>     at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
>>     at py4j.commands.CallCommand.execute(CallCommand.java:79)
>>     at py4j.GatewayConnection.run(GatewayConnection.java:207)
>>     at java.lang.Thread.run(Thread.java:745)
>>  (<class 'py4j.protocol.Py4JJavaError'>, Py4JJavaError(u'An error occurred while calling o260.table.\n', JavaObject id=o262), <traceback object at 0x2e248c0>)
>>
>> However, when I switch the db context it works:
>>
>> *Works:*
>>
>>  sqlContext.sql("use test")
>>  df = sqlContext.table("test1")
>>
>> Built on Mac OS X (JDK 6) for the MapR distribution, running on CentOS 7.0 (JDK 8):
>>
>> make-distribution.sh --tgz -Pmapr4  -Phive -Pnetlib-lgpl -Phive-thriftserver
>>
>> I didn't have this issue in RC3, and I tried it in Scala as well.
>>
>> Thanks
>> Bobby
>>
>
>
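For readers following the thread: the failure mode can be sketched with a toy catalog model in plain Python. This is an illustration of the observed behavior, not Spark's actual implementation — the class and method names below are invented for the sketch. A lookup that resolves names only against the current database never finds a qualified `db.table` string, which is why `sqlContext.sql("use test")` followed by a bare table name works as a workaround.

```python
# Toy model of a metastore catalog that resolves table names only
# against the "current" database. Illustrates the reported
# sqlContext.table("test.test1") failure; NOT Spark's real code.

class NoSuchTableException(Exception):
    pass

class ToyCatalog:
    def __init__(self):
        # Two databases; only "test" contains the table "test1".
        self.databases = {"default": {}, "test": {"test1": "rows..."}}
        self.current_db = "default"

    def sql_use(self, db):
        # Analogue of sqlContext.sql("use test").
        self.current_db = db

    def table(self, name):
        # Naive lookup: the whole string is treated as a table name
        # in the current database, so "test.test1" is never found.
        tables = self.databases[self.current_db]
        if name not in tables:
            raise NoSuchTableException(name)
        return tables[name]

catalog = ToyCatalog()

# Qualified name fails, mirroring the Py4JJavaError in the thread:
try:
    catalog.table("test.test1")
except NoSuchTableException:
    print("NoSuchTableException: test.test1")

# Workaround: switch the database first, then use the bare name.
catalog.sql_use("test")
print(catalog.table("test1"))
```

The workaround succeeds because after `sql_use("test")` the bare name `test1` resolves in the now-current database, exactly as in the `use test` sequence shown above.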
