phoenix-user mailing list archives

From Nathan Davis <>
Subject phoenix-spark error with index on target table
Date Fri, 12 Aug 2016 20:34:06 GMT
Hi All,
I am using phoenix-spark to write a DataFrame to a Phoenix table. Everything
works fine when writing to a plain table. However, when the target table has
a global index, I get the error below. The index is on two columns
(varchar, date) with no included columns.
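For reference, the write itself looks roughly like this. This is only a sketch of the phoenix-spark DataFrame write path; the table name, ZooKeeper quorum, and input source are placeholders, not my actual job:

```scala
// Sketch of a phoenix-spark write (Phoenix 4.x era). All names are placeholders.
import org.apache.spark.sql.{SaveMode, SparkSession}

val spark = SparkSession.builder().appName("phoenix-write").getOrCreate()
val df = spark.read.parquet("/path/to/input") // hypothetical source data

df.write
  .format("org.apache.phoenix.spark")
  .mode(SaveMode.Overwrite)              // phoenix-spark expects Overwrite mode
  .option("table", "MY_TABLE")           // placeholder target table
  .option("zkUrl", "zkhost:2181")        // placeholder ZooKeeper quorum
  .save()
```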

A little googling turns up results suggesting that hbase-protocol.jar be
added to the Hadoop classpath, but I'm not sure which component actually
needs it. I tried adding that jar to the Spark driver and executor
classpaths, but I still get the same error. So I wonder if I need to add it
to HBase's classpath instead... Does this look familiar to anyone?
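Concretely, what I tried was along these lines (the jar path and version are placeholders for whatever ships with your HBase install):

```shell
# Hypothetical paths; the hbase-protocol jar version differs per install.
spark-submit \
  --conf spark.driver.extraClassPath=/opt/hbase/lib/hbase-protocol-1.1.2.jar \
  --conf spark.executor.extraClassPath=/opt/hbase/lib/hbase-protocol-1.1.2.jar \
  my-phoenix-job.jar
```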

java.sql.SQLException: java.util.concurrent.ExecutionException:
java.lang.Exception: java.lang.IllegalAccessError: com/google/protobuf/HBaseZeroCopyByteString
	at org.apache.phoenix.cache.ServerCacheClient.addServerCache(
	at org.apache.phoenix.index.IndexMetaDataCacheClient.addIndexMetadataCache(
	at org.apache.phoenix.execute.MutationState.setMetaDataOnMutations(
	at org.apache.phoenix.execute.MutationState.send(
	at org.apache.phoenix.execute.MutationState.send(
	at org.apache.phoenix.execute.MutationState.commit(
	at org.apache.phoenix.jdbc.PhoenixConnection$
	at org.apache.phoenix.jdbc.PhoenixConnection$
	at
	at org.apache.phoenix.jdbc.PhoenixConnection.commit(
	at org.apache.phoenix.mapreduce.PhoenixRecordWriter.write(
	... 13 more

