spark-user mailing list archives

From ayan guha <guha.a...@gmail.com>
Subject Re: Writing to Hbase table from Spark
Date Tue, 30 Aug 2016 09:23:36 GMT
You can use the RDD-level "new Hadoop format" API (e.g. newAPIHadoopRDD for
reads and saveAsNewAPIHadoopDataset for writes) and pass in the appropriate
input/output format classes.
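For example, a minimal sketch of a write using saveAsNewAPIHadoopDataset with HBase's TableOutputFormat might look like the following. The table name "sales2", column family "cf", and column "col1" are illustrative assumptions, not taken from this thread; the HBase site configuration must be on the classpath or set explicitly.

```scala
// Hedged sketch: write an RDD of (rowKey, value) pairs to an HBase table
// via Spark's new-Hadoop-API OutputFormat path. Table/column names are
// assumptions for illustration only.
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Put
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableOutputFormat
import org.apache.hadoop.hbase.util.Bytes
import org.apache.hadoop.mapreduce.Job
import org.apache.spark.sql.SparkSession

object HBaseWriteSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("HBaseWrite").getOrCreate()
    val sc = spark.sparkContext

    // Point the OutputFormat at the target table (assumed name: "sales2").
    val conf = HBaseConfiguration.create()
    conf.set(TableOutputFormat.OUTPUT_TABLE, "sales2")
    val job = Job.getInstance(conf)
    job.setOutputFormatClass(classOf[TableOutputFormat[ImmutableBytesWritable]])

    // Turn each record into a (key, Put) pair and save.
    val rdd = sc.parallelize(Seq(("row1", "v1"), ("row2", "v2")))
    val puts = rdd.map { case (key, value) =>
      val put = new Put(Bytes.toBytes(key))
      put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("col1"), Bytes.toBytes(value))
      (new ImmutableBytesWritable(Bytes.toBytes(key)), put)
    }
    puts.saveAsNewAPIHadoopDataset(job.getConfiguration)
  }
}
```

Reading works the same way in reverse: call sc.newAPIHadoopRDD with org.apache.hadoop.hbase.mapreduce.TableInputFormat and the same configuration object, which yields an RDD of (ImmutableBytesWritable, Result) pairs.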
On 30 Aug 2016 19:13, "Mich Talebzadeh" <mich.talebzadeh@gmail.com> wrote:

> Hi,
>
> Is there an existing interface to read from and write to an HBase table in
> Spark?
>
> Similar to below for Parquet
>
> val s = spark.read.parquet("oraclehadoop.sales2")
> s.write.mode("overwrite").parquet("oraclehadoop.sales4")
>
> Or do I need to write to a Hive table that is already defined over HBase?
>
>
> Dr Mich Talebzadeh
>
>
>
> LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>
>
>
> http://talebzadehmich.wordpress.com
>
>
> Disclaimer: Use it at your own risk. Any and all responsibility for any
> loss, damage or destruction of data or any other property which may arise
> from relying on this email's technical content is explicitly disclaimed.
> The author will in no case be liable for any monetary damages arising from
> such loss, damage or destruction.
>
