phoenix-user mailing list archives

From "kubilay.tsilkara" <>
Subject Re: Is there a Pentaho connector for Phoenix
Date Tue, 22 Sep 2015 10:56:07 GMT
Hi James

Thank you!

I have also found now a way to connect Pentaho PDI (aka Kettle) to Phoenix.
I used steps from this blog

It works with the generic Phoenix drivers. The use case is to create an ETL
transformation from MySQL -> Phoenix to load a couple of million rows.
The transformation is valid; I can see the tables on both source and target.

The hurdle I've hit now is that the Pentaho MySQL source sends INSERTs to
Phoenix, whereas Phoenix expects UPSERTs, so I have to do some sort of
transformation in the middle to translate those INSERTs into UPSERTs, with a
'decode'-like transformation, I suppose.
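For what it's worth, since Phoenix's UPSERT syntax otherwise matches a plain INSERT, the translation can be as simple as rewriting the leading keyword before the statement reaches the driver. A minimal sketch (my own helper, not a built-in Pentaho step; the class and method names are made up for illustration):

```java
// Sketch: rewrite a leading INSERT keyword into Phoenix's UPSERT,
// leaving any other statement (including an existing UPSERT) untouched.
public class InsertToUpsert {

    static String toUpsert(String sql) {
        String trimmed = sql.stripLeading();
        // Case-insensitive match on the first keyword only.
        if (trimmed.regionMatches(true, 0, "INSERT", 0, 6)) {
            int lead = sql.length() - trimmed.length();
            return sql.substring(0, lead) + "UPSERT" + trimmed.substring(6);
        }
        return sql;
    }

    public static void main(String[] args) {
        System.out.println(toUpsert("INSERT INTO T (ID, NAME) VALUES (?, ?)"));
        // -> UPSERT INTO T (ID, NAME) VALUES (?, ?)
    }
}
```

In Pentaho this kind of rewrite could be done in a "Modified Java Script Value" or "User Defined Java Class" step, or by generating the UPSERT statements directly in an "Execute SQL script" step instead of using the Table Output step.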

Things look very promising so far.


Kubilay Tsil Kara


On 21 September 2015 at 19:05, James Taylor <> wrote:

> Have you seen this blog post, as it details how to connect Phoenix
> to Saiku through Pentaho?
> HTH. Thanks,
> James
> On Mon, Sep 21, 2015 at 8:39 AM, kubilay.tsilkara <> wrote:
>> I've tried to connect to Phoenix using Pentaho PDI (aka Kettle) <> with
>> no success.
>> Pentaho is an ETL tool which can do parallel loads to many endpoints,
>> including HBase, Hive, etc., using JDBC/ODBC connectors. Is there one for
>> Phoenix? Does anybody know?
>> Thank you!
>> Kubilay
