spark-user mailing list archives

From Marco Colombo <ing.marco.colo...@gmail.com>
Subject Re: HiveThriftServer2.startWithContext no more showing tables in 1.6.2
Date Thu, 21 Jul 2016 14:30:00 GMT
Thanks.

That is just a typo. I'm using 'spark://10.0.2.15:7077' (standalone).
It's the same URL I pass to --master in spark-submit.
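For reference, a minimal sketch of the conf as intended (the duplicate
.setMaster("local") line dropped; the host and port are from my setup):

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .setMaster("spark://10.0.2.15:7077")
      .set("spark.cassandra.connection.host", "10.0.2.15")
      .setAppName("spark-sql-dataexample")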



2016-07-21 16:08 GMT+02:00 Mich Talebzadeh <mich.talebzadeh@gmail.com>:

> Hi Marco
>
> In your code
>
> val conf = new SparkConf()
>       .setMaster("spark://10.0.2.15:7077")
>       .setMaster("local")
>       .set("spark.cassandra.connection.host", "10.0.2.15")
>       .setAppName("spark-sql-dataexample");
>
> As I understand it, the first .setMaster("spark://<IP_ADDRESS>:7077")
> indicates that you are using Spark in standalone mode, and then
> .setMaster("local") means you are using it in local mode?
>
> Any reason for that?
>
> SparkConf keeps a single value per key, so the second call wins: you are
> overriding standalone with local.
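>
> A minimal sketch of that behaviour (setMaster just writes the
> "spark.master" key, so the later call overwrites the earlier one):
>
>     import org.apache.spark.SparkConf
>
>     val conf = new SparkConf()
>       .setMaster("spark://10.0.2.15:7077")
>       .setMaster("local")
>
>     // prints "local" -- the last value set for spark.master wins
>     println(conf.get("spark.master"))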
>
> HTH
>
> Dr Mich Talebzadeh
>
>
>
> LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>
>
>
> http://talebzadehmich.wordpress.com
>
>
> Disclaimer: Use it at your own risk. Any and all responsibility for any
> loss, damage or destruction of data or any other property which may arise
> from relying on this email's technical content is explicitly disclaimed.
> The author will in no case be liable for any monetary damages arising from
> such loss, damage or destruction.
>
>
>
> On 21 July 2016 at 14:55, Marco Colombo <ing.marco.colombo@gmail.com>
> wrote:
>
>> Hi all, I have a Spark application that was working in 1.5.2 but now has
>> a problem in 1.6.2.
>>
>> Here is an example:
>>
>>     val conf = new SparkConf()
>>       .setMaster("spark://10.0.2.15:7077")
>>       .setMaster("local")
>>       .set("spark.cassandra.connection.host", "10.0.2.15")
>>       .setAppName("spark-sql-dataexample");
>>
>>     val hiveSqlContext = new HiveContext(SparkContext.getOrCreate(conf));
>>
>>     //Registering tables....
>>     var query = """OBJ_TAB""".stripMargin;
>>
>>     val options = Map(
>>       "driver" -> "org.postgresql.Driver",
>>       "url" -> "jdbc:postgresql://127.0.0.1:5432/DB",
>>       "user" -> "postgres",
>>       "password" -> "postgres",
>>       "dbtable" -> query);
>>
>>     import hiveSqlContext.implicits._;
>>     val df: DataFrame =
>> hiveSqlContext.read.format("jdbc").options(options).load();
>>     df.registerTempTable("V_OBJECTS");
>>
>>      val optionsC = Map("table"->"data_tab", "keyspace"->"data");
>>     val stats : DataFrame =
>> hiveSqlContext.read.format("org.apache.spark.sql.cassandra").options(optionsC).load();
>>     //stats.foreach { x => println(x) }
>>     stats.registerTempTable("V_DATA");
>>
>>     //START HIVE SERVER
>>     HiveThriftServer2.startWithContext(hiveSqlContext);
>>
>> Now, from the app I can perform queries and joins over the 2 registered
>> tables, but if I connect to port 10000 via beeline, I see no registered
>> tables: "show tables" is empty.
>>
>> I'm using the embedded Derby DB, but this was working in 1.5.2.
>>
>> Any suggestion?
>>
>> Thanks!!!!
>>
>>
>


-- 
Ing. Marco Colombo
