libcloud-notifications mailing list archives

From "Chris O'Brien (JIRA)" <>
Subject [jira] [Created] (LIBCLOUD-711) Periodic GZIP CRC check failure
Date Wed, 13 May 2015 19:05:01 GMT
Chris O'Brien created LIBCLOUD-711:

             Summary: Periodic GZIP CRC check failure
                 Key: LIBCLOUD-711
             Project: Libcloud
          Issue Type: Bug
         Environment: Python 2.6.6
            Reporter: Chris O'Brien

When attempting to parse a gzipped server response, occasionally a CRC check fails while decompressing
the response body.  Most of the time the response is correctly decompressed and parsed.  

The compressed data itself is complete (verified by writing the raw bytes to a file and gunzipping
them), and the issue only seems to occur with chunked responses.

I suspect this is related to the response body being unusually large (~43K uncompressed).

I'm using the CloudSigma driver, and this specifically happens with the list_nodes method
(the one which responds with the largest amount of data):
IOError: CRC check failed 0xdebd5ac != 0x42c31c02L
File "/home/dbs_support/dev/libcloudsigma/", line 298, in _get_node
  nodes = self.cloud_driver.list_nodes()
File "/home/dbs_support/lib/libcloud/compute/drivers/", line 1025, in list_nodes
  response = self.connection.request(action=action, method='GET').object
File "/home/dbs_support/lib/libcloud/compute/drivers/", line 965, in request
File "/home/dbs_support/lib/libcloud/common/", line 750, in request
  'response': self.connection.getresponse()}
File "/home/dbs_support/lib/libcloud/common/", line 404, in getresponse
  r, rv = self._log_response(r)
File "/home/dbs_support/lib/libcloud/common/", line 311, in _log_response
  body = decompress_data('gzip', body)
File "/home/dbs_support/lib/libcloud/utils/", line 39, in decompress_data
  return gzip.GzipFile(fileobj=cls(data)).read()
File "/usr/lib64/python2.6/", line 212, in read
File "/usr/lib64/python2.6/", line 267, in _read
File "/usr/lib64/python2.6/", line 304, in _read_eof

In utils/, would it be wise to replace line #39 ({{return gzip.GzipFile(fileobj=cls(data)).read()}})
with a zlib-based implementation?  This seems to fix the problem for me, but I'm unaware of any
negative side effects it might have.
For example:
        import zlib

        decomp = zlib.decompressobj(16 + zlib.MAX_WBITS)
        return decomp.decompress(data)
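To illustrate the suggestion, here is a minimal, self-contained sketch (written against the
Python 3 standard library, whereas the original report ran Python 2.6): passing
16 + zlib.MAX_WBITS to zlib.decompressobj tells zlib to expect a gzip header and trailer
rather than a raw zlib stream, so it can decompress the same data that gzip.GzipFile reads.
The payload and helper names below are illustrative, not from libcloud.

```python
import gzip
import io
import zlib

# Build a sample gzip payload with the stdlib writer.
payload = b"example response body " * 100
buf = io.BytesIO()
with gzip.GzipFile(fileobj=buf, mode="wb") as f:
    f.write(payload)
compressed = buf.getvalue()

# The suggested replacement: 16 + MAX_WBITS makes zlib parse the
# gzip wrapper (header, CRC32 trailer) instead of a raw zlib stream.
decomp = zlib.decompressobj(16 + zlib.MAX_WBITS)
body = decomp.decompress(compressed)

assert body == payload
```

One behavioral difference worth noting: GzipFile.read() validates the CRC32 trailer when it
reaches end-of-stream, while decompressobj.decompress() simply returns whatever it has
decompressed so far and does not complain about a stream that has not yet ended.  That could
plausibly explain why the symptom disappears if the decompressor is occasionally handed an
incomplete chunked body, though I have not confirmed this.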

This message was sent by Atlassian JIRA
