libcloud-notifications mailing list archives

From "ASF GitHub Bot (JIRA)" <>
Subject [jira] [Commented] (LIBCLOUD-711) Periodic GZIP CRC check failure
Date Wed, 13 May 2015 19:09:00 GMT


ASF GitHub Bot commented on LIBCLOUD-711:

GitHub user chrisob opened a pull request:

    [LIBCLOUD-711] Fixed occasional CRC check failure when decompressing …

    …large responses
    fixes issue:

You can merge this pull request into a Git repository by running:

    $ git pull LIBCLOUD-711_gzip_crc_check_fail_fix

Alternatively you can review and apply these changes as the patch at:

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #519
commit e931f2b9fd19a1d2b16e25ee44c96acb768371dc
Author: Chris O'Brien <>
Date:   2015-05-13T19:07:50Z

    [LIBCLOUD-711] Fixed occasional CRC check failure when decompressing large responses


> Periodic GZIP CRC check failure
> -------------------------------
>                 Key: LIBCLOUD-711
>                 URL:
>             Project: Libcloud
>          Issue Type: Bug
>         Environment: Python 2.6.6
>            Reporter: Chris O'Brien
> When attempting to parse a gzipped server response, occasionally a CRC check fails while
> decompressing the response body.  Most of the time the response is correctly decompressed
> and parsed.
> Although the compressed data is complete (verified by writing the data to a file and
> gunzipping it), this issue only seems to happen with chunked responses.
> I believe this is due to the fact that the response body is unusually large (~43K uncompressed).
> I'm using the CloudSigma driver, and this specifically happens with the list_nodes method
> (the one which responds with the largest amount of data):
> {noformat}
> IOError: CRC check failed 0xdebd5ac != 0x42c31c02L
> ...
> File "/home/dbs_support/dev/libcloudsigma/", line 298, in _get_node
>   nodes = self.cloud_driver.list_nodes()
> File "/home/dbs_support/lib/libcloud/compute/drivers/", line 1025, in list_nodes
>   response = self.connection.request(action=action, method='GET').object
> File "/home/dbs_support/lib/libcloud/compute/drivers/", line 965, in request
>   raw=raw)
> File "/home/dbs_support/lib/libcloud/common/", line 750, in request
>   'response': self.connection.getresponse()}
> File "/home/dbs_support/lib/libcloud/common/", line 404, in getresponse
>   r, rv = self._log_response(r)
> File "/home/dbs_support/lib/libcloud/common/", line 311, in _log_response
>   body = decompress_data('gzip', body)
> File "/home/dbs_support/lib/libcloud/utils/", line 39, in decompress_data
>   return gzip.GzipFile(fileobj=cls(data)).read()
> File "/usr/lib64/python2.6/", line 212, in read
>   self._read(readsize)
> File "/usr/lib64/python2.6/", line 267, in _read
>   self._read_eof()
> File "/usr/lib64/python2.6/", line 304, in _read_eof
>   hex(self.crc)))
> {noformat}
> In utils/, would it be wise to replace line #39 ({{return gzip.GzipFile(fileobj=cls(data)).read()}})
> with a zlib-based call?  This seems to fix the problem for me, but I'm unaware if there are
> any negative side effects.
> For example:
> {noformat}
>         decomp = zlib.decompressobj(16+zlib.MAX_WBITS)
>         return decomp.decompress(data)
> {noformat}
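For context, here is a self-contained sketch of the zlib variant proposed above (the wrapper name `decompress_gzip_with_zlib` is mine, not libcloud's). The `16 + zlib.MAX_WBITS` value is what tells zlib to expect gzip framing (header plus CRC trailer) rather than a raw zlib stream:

```python
import gzip
import io
import zlib

def decompress_gzip_with_zlib(data):
    # wbits = 16 + MAX_WBITS makes zlib expect gzip framing
    # (header + CRC trailer) instead of a bare zlib stream.
    decomp = zlib.decompressobj(16 + zlib.MAX_WBITS)
    return decomp.decompress(data)

payload = b"x" * 50000  # roughly the ~43K body size mentioned in the report
compressed = gzip.compress(payload)

# Both approaches agree on a well-formed stream.
assert gzip.GzipFile(fileobj=io.BytesIO(compressed)).read() == payload
assert decompress_gzip_with_zlib(compressed) == payload

# Unlike GzipFile.read(), the decompressobj stops cleanly at the end of
# the gzip member and leaves any trailing bytes in .unused_data instead
# of trying to parse them as another gzip member.
assert decompress_gzip_with_zlib(compressed + b"extra") == payload
```

One behavioral difference worth noting: `decompress()` on a decompressobj returns whatever it could decode even from a truncated stream, whereas `GzipFile.read()` insists on reading through to the trailer, so the two are not strictly equivalent on malformed input.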

This message was sent by Atlassian JIRA
