hadoop-common-issues mailing list archives

From "Suresh Srinivas (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HADOOP-9114) After defining dfs.checksum.type as NULL, writing a file and calling hflush will throw java.lang.ArrayIndexOutOfBoundsException
Date Tue, 04 Dec 2012 03:53:58 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-9114?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13509487#comment-13509487 ]

Suresh Srinivas commented on HADOOP-9114:
-----------------------------------------

Can you also add a test to show the problem you have described?
                
> After defining dfs.checksum.type as NULL, writing a file and calling hflush will throw java.lang.ArrayIndexOutOfBoundsException
> --------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-9114
>                 URL: https://issues.apache.org/jira/browse/HADOOP-9114
>             Project: Hadoop Common
>          Issue Type: Bug
>    Affects Versions: 2.0.1-alpha
>            Reporter: liuyang
>            Priority: Minor
>         Attachments: FSOutputSummer.java.patch
>
>
> When I tested the dfs.checksum.type configuration parameter: the value can be defined
> as NULL, CRC32C, or CRC32. Writing works when the value is CRC32C or CRC32, but the client
> throws java.lang.ArrayIndexOutOfBoundsException when the value is configured as NULL.
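
For illustration, a minimal reproduction of the scenario described above might look like
the following. This is only a sketch, assuming a MiniDFSCluster-based test setup as used
elsewhere in the Hadoop test suite; the class name TestNullChecksumHflush and the file
path are illustrative and not taken from the report or the attached patch.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.MiniDFSCluster;

public class TestNullChecksumHflush {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // The configuration described in the report: checksum type NULL.
    conf.set("dfs.checksum.type", "NULL");

    MiniDFSCluster cluster = new MiniDFSCluster.Builder(conf).build();
    try {
      FileSystem fs = cluster.getFileSystem();
      // Write some data, then hflush; per the report this is where the
      // ArrayIndexOutOfBoundsException surfaces on the client side.
      FSDataOutputStream out = fs.create(new Path("/testNullChecksum"));
      out.write(new byte[1024]);
      out.hflush();
      out.close();
    } finally {
      cluster.shutdown();
    }
  }
}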

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira
