phoenix-dev mailing list archives

From "suprita (JIRA)" <>
Subject [jira] [Commented] (PHOENIX-4234) Unable to find failed csv records in phoenix logs
Date Wed, 27 Sep 2017 05:14:00 GMT


suprita commented on PHOENIX-4234:

Hi Ankit,

I am still not able to find the error in the YARN logs.

I am running the command:
hadoop jar /opt/cloudera/parcels/CLABS_PHOENIX-4.5.2-1.clabs_phoenix1.2.0.p0.774/lib/phoenix/phoenix-1.2.0-client.jar
org.apache.phoenix.mapreduce.CsvBulkLoadTool -Dfs.permissions.umask-mode=000 --table G1V3IN_SEPT
--input /user/mi841425/smallgstr1.csv --ignore-errors
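Since CsvBulkLoadTool runs as a MapReduce job, messages about rejected records typically end up in the per-task logs rather than on the client console. A sketch of pulling those logs with the standard YARN CLI; the application id below is a placeholder that would come from the job's console output or the ResourceManager UI:

```shell
# List finished applications to locate the bulk-load job's id.
yarn application -list -appStates FINISHED

# Dump the aggregated task logs for that application (placeholder id)
# and scan them for error lines that may mention the skipped records.
yarn logs -applicationId application_1506400000000_0001 > bulkload.log
grep -i "error" bulkload.log
```

This assumes log aggregation is enabled on the cluster; otherwise the task logs must be read from the NodeManager hosts directly.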

Three of the records contain errors and are not inserted into HBase because of the --ignore-errors
parameter, which is fine.

But we need to track those 3 records as well.

Please help with it.

Suprita Bothra

> Unable to find failed csv records in phoenix logs
> -------------------------------------------------
>                 Key: PHOENIX-4234
>                 URL:
>             Project: Phoenix
>          Issue Type: Bug
>            Reporter: suprita bothra
> Unable to fetch information about missing records in a Phoenix table. How can we fetch the
> missing records' info?
> While parsing a CSV into HBase via MapReduce bulk loading with the --ignore-errors option,
> CSV records containing errors are skipped, but we are unable to fetch information about the
> records that were skipped/failed and did not go into the table.
> There must be logs of such information. Please help in identifying whether we can get logs
> of the failed records.
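One workaround for the tracking problem described above is to reconcile the input CSV against what actually landed in the table. A minimal sketch, assuming the first CSV column is the row key and that the set of loaded keys has been fetched separately (e.g. with a SELECT over the target table); the sample data, column layout, and helper name below are hypothetical, not part of Phoenix:

```python
import csv
import io

def find_skipped_rows(csv_text, loaded_keys):
    """Return CSV rows whose key (first column) is absent from loaded_keys."""
    skipped = []
    for row in csv.reader(io.StringIO(csv_text)):
        if row and row[0] not in loaded_keys:
            skipped.append(row)
    return skipped

# Hypothetical data standing in for smallgstr1.csv and a key query result.
sample_csv = "k1,a\nk2,b\nk3,c\n"
loaded = {"k1", "k3"}  # keys actually present in the target table
print(find_skipped_rows(sample_csv, loaded))  # -> [['k2', 'b']]
```

For a large input this comparison would itself be done as a distributed job, but the idea is the same: any input key missing from the table corresponds to a record rejected by --ignore-errors.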

This message was sent by Atlassian JIRA
