phoenix-user mailing list archives

From "kubilay.tsilkara" <kubilay.tsilk...@gmail.com>
Subject Re: Is there a Pentaho connector for Phoenix
Date Tue, 22 Sep 2015 10:56:07 GMT
Hi James

Thank you!

I have also found now a way to connect Pentaho PDI (aka Kettle) to Phoenix.
I used steps from this blog
http://talat.uyarer.com/post/121179803796/how-to-connect-hbase-using-apache-phoenix-from

It works with the generic Phoenix JDBC driver. The use case is an ETL
transformation from MySQL -> Phoenix to load a couple of million rows.
The transformation validates, and I can see the tables on both source and target.
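For reference, these are the generic JDBC settings I used in the PDI database connection (the driver class and URL format come from the Phoenix docs; substitute your own ZooKeeper quorum and port):

```
Driver class: org.apache.phoenix.jdbc.PhoenixDriver
JDBC URL:     jdbc:phoenix:<zookeeper-quorum>:<port>:/hbase
```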

The hurdle I've hit now is that the Pentaho MySQL source sends INSERTs to
Phoenix, whereas Phoenix expects UPSERTs, so I have to add some sort of
transformation in the middle to translate those INSERTs to UPSERTs, with a
'decode'-like step I suppose.
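Something along these lines might do as the middle step; this is a minimal sketch (not a tested PDI step), assuming each generated statement begins with a plain "INSERT INTO":

```python
import re

# Phoenix has no INSERT statement; its equivalent DML is UPSERT.
# Rewrite a leading "INSERT INTO" (any case) as "UPSERT INTO" and
# pass every other statement through unchanged.
_INSERT_RE = re.compile(r"^\s*INSERT\s+INTO\b", re.IGNORECASE)

def insert_to_upsert(sql: str) -> str:
    """Translate a MySQL-style INSERT into a Phoenix UPSERT."""
    return _INSERT_RE.sub("UPSERT INTO", sql, count=1)
```

In PDI this could live in a "Modified Java Script Value" or "User Defined Java Class" step sitting between the MySQL input and the Phoenix table output.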

Things look very promising so far.

Cheers.

Kubilay Tsil Kara


On 21 September 2015 at 19:05, James Taylor <jamestaylor@apache.org> wrote:

> Have you seen this blog post, as it details how to connect Phoenix
> to Saiku through Pentaho?
> https://blogs.apache.org/phoenix/entry/olap_with_apache_phoenix_and
>
> HTH. Thanks,
>
> James
>
> On Mon, Sep 21, 2015 at 8:39 AM, kubilay.tsilkara <
> kubilay.tsilkara@gmail.com> wrote:
>
>> I've tried to connect to Phoenix using Pentaho PDI (aka Kettle)
>> <http://community.pentaho.com/projects/data-integration/> with no
>> success.
>>
>> Pentaho is an ETL tool which can do parallel loads to many endpoints,
>> including HBase, Hive, etc., using JDBC/ODBC connectors. Is there one for
>> Phoenix? Does anybody know?
>>
>> Thank you!
>>
>> Kubilay
>>
>
>
