spark-user mailing list archives

From Jeetendra Gangele <>
Subject Loading already existing tables in spark shell
Date Mon, 24 Aug 2015 12:17:11 GMT
Hi All, I have a few tables in Hive and I want to run queries against them
with Spark as the execution engine.

Can I directly load these tables in the spark shell and run queries against them?

I tried:
1. val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
2. sqlContext.sql("FROM event_impressions select count(*)")
where event_impressions is the table name.

It gives me an error: "org.apache.spark.sql.AnalysisException: no such
table event_impressions; line 1 pos 5"
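For reference, a minimal spark-shell session along these lines. This is a sketch, assuming a Spark build with Hive support and a hive-site.xml on the classpath pointing at the real Hive metastore; without that file, HiveContext silently creates an empty local metastore, which is a common cause of "no such table". The `default` database qualifier is an assumption about where the table lives.

```scala
// Run inside spark-shell (Spark 1.x), where sc is the pre-created SparkContext.
// Requires Spark built with Hive support and hive-site.xml in conf/;
// otherwise HiveContext falls back to a local metastore with no Hive tables.
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

// Quick sanity check: list the tables the metastore actually exposes.
// If the Hive tables are missing here, the metastore connection is the problem.
sqlContext.sql("SHOW TABLES").collect().foreach(println)

// Query the existing Hive table, qualifying with the database if needed.
sqlContext.sql("SELECT count(*) FROM default.event_impressions").show()
```

If SHOW TABLES comes back empty, checking that hive-site.xml is in Spark's conf directory (or on the driver classpath) is the first thing to try.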

Has anybody hit a similar issue?

