Hi Yiannis,

Are you able to post your full stack trace? It might be helpful.

I recall one similar incident where Phoenix was casting to BigDecimal. As a workaround, I ran a foreach() over my RDD and attempted the same cast, and lo and behold, there was a NaN record in there.
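The check amounted to something like the sketch below (a plain Seq stands in for the RDD here, and `new java.math.BigDecimal(d)` is the conversion that rejects NaN; with Spark you'd run the same closure via rdd.foreach { ... }):

```scala
// Minimal sketch: replay the Double -> BigDecimal cast over the data
// and log any value it rejects. BigDecimal's double constructor throws
// NumberFormatException for NaN and infinities.
object FindBadDoubles extends App {
  val values = Seq(1.0, 2.5, Double.NaN, 3.0) // stand-in data

  values.foreach { d =>
    try new java.math.BigDecimal(d)
    catch {
      case e: NumberFormatException =>
        println(s"bad value: $d (${e.getMessage})")
    }
  }
}
```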


On Thu, Sep 3, 2015 at 8:32 AM, Yiannis Gkoufas <johngouf85@gmail.com> wrote:
Hi Josh,

Thanks for your reply.
The reason I cannot follow your suggestion is that I am already casting the value to Double at an earlier point in my code:

val value = PDouble.INSTANCE.getCodec.decodeDouble(cell.getValueArray, cell.getValueOffset, SortOrder.getDefault)

So I would expect to get the error there, not while inserting.
Thanks a lot!

On 3 September 2015 at 12:53, Josh Mahonin <jmahonin@interset.com> wrote:
Hi Yiannis,

I've found the best solution here is usually just to add logging around that area. For example, you could wrap it in a try/catch (or a Scala Try), check whether an exception was thrown, and log it somewhere.
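Something like this rough sketch (decodeRow is just a placeholder for whatever step produces your Double values):

```scala
import scala.util.{Failure, Success, Try}

// Rough sketch: wrap the suspect step in a Try and log failures
// with enough context to identify the offending row.
object LoggedDecode extends App {
  def decodeRow(raw: String): Double = raw.toDouble // stand-in decode

  val rows = Seq("1.5", "2.0", "oops")
  rows.foreach { raw =>
    Try(decodeRow(raw)) match {
      case Success(d)  => println(s"ok: $d")
      case Failure(ex) => println(s"failed on row '$raw': $ex")
    }
  }
}
```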

As a wild guess, if you're dealing with a Double datatype and getting NumberFormatException, is it possible one of your values is a NaN?
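(Worth noting: a NaN sails through a raw byte-level decode without complaint, so an earlier decode step would never flag it; the exception only surfaces once something later attempts a stricter conversion, e.g. to BigDecimal. A quick illustration of the round trip:)

```scala
// Sketch: a NaN round-trips through its raw bit pattern silently,
// so a byte-level decode would not raise any exception for it.
object NaNRoundTrip extends App {
  val bits = java.lang.Double.doubleToLongBits(Double.NaN)
  val back = java.lang.Double.longBitsToDouble(bits) // no exception here
  println(back.isNaN) // the NaN survives, only failing later conversions
}
```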


On Thu, Sep 3, 2015 at 6:11 AM, Yiannis Gkoufas <johngouf85@gmail.com> wrote:
Hi there,

I am using phoenix-spark to insert multiple entries into a Phoenix table.
I get the following errors:

..Exception while committing to database..
..Caused by: java.lang.NumberFormatException..

I couldn't find in the logs which row was causing the issue.
Is it possible to somehow extract the (wrong) values that I am trying to insert into this Double column?

Thanks a lot!