Thanks a lot Ben. That helped.

On Thu, Aug 8, 2013 at 3:12 AM, Ben Reser <> wrote:
On Wed, Aug 7, 2013 at 1:59 PM, Akash Jain <> wrote:
> Per the Akamai guy, Vary shows Akamai that the content can vary, so Akamai
> is not caching it, and this leads Akamai to make requests to our webversion ...
> We mostly just serve JS and CSS from Akamai ..

I think whoever you're talking to at Akamai isn't being very
helpful.  I know at a minimum you can simply not use compression
between you and Akamai, turn on content acceleration, and Akamai
will do the compression for you.  But I'm pretty sure they can
support compression from the origin as well.
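If you do go the content-acceleration route, one way to stop compressing toward Akamai is to flag Akamai's origin fetches at the origin.  This is only a sketch, assuming an Apache origin running mod_deflate and an Akamai config that adds a custom request header on origin requests (the header name X-Akamai-Edge here is hypothetical, not something Akamai sends by default):

```apache
# mod_deflate skips any request where the no-gzip environment
# variable is set.  Flag requests carrying the (hypothetical)
# X-Akamai-Edge header that Akamai has been configured to send.
SetEnvIfNoCase X-Akamai-Edge .+ no-gzip
```

With that in place Akamai always gets an identity response and can compress at the edge itself, while browsers hitting the origin directly still get gzip.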

Using a random CSS file from GoDaddy's website:

If I do the following with and without the --compressed I see that the
file is cached:
$ curl -H 'Pragma: akamai-x-cache-on, akamai-x-get-cache-key,
akamai-x-get-true-cache-key, akamai-x-serial-no' -v -o /dev/null
(note the X-Cache response header showing TCP_MEM_HIT).
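For what it's worth, that hit/miss check is easy to script; here's a minimal sketch that greps a saved header dump (the X-Cache line below is made up for illustration, not from a real response):

```shell
#!/bin/sh
# Fabricated header dump for illustration; in practice you would
# save real response headers with something like:
#   curl -s -D headers.txt -o /dev/null <url>
printf 'HTTP/1.1 200 OK\r\nX-Cache: TCP_MEM_HIT from a23-0-0-1.deploy.akamaitechnologies.com (AkamaiGHost)\r\n\r\n' > headers.txt

# TCP_HIT / TCP_MEM_HIT mean the edge answered from its cache;
# TCP_MISS means the edge had to go back to the origin.
if grep -i '^X-Cache:' headers.txt | grep -q 'HIT'; then
  echo "served from Akamai cache"
else
  echo "fetched from origin"
fi
```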

Using the X-Cache-Key header you can find the origin server, which in this case is...

Hitting it like so:
$ curl --compressed -v -o /dev/null

I see that they are using Content-Encoding: gzip and Vary: Accept-Encoding.

I'm not sure if there's some config on their side to keep Akamai
from requesting compression, or to make their origin server refuse
to give Akamai gzip.  Unfortunately I don't have an Akamai setup to
play with anymore.

The thing is, Akamai benefits from properly supporting this, because
their bandwidth bill for retrieving data from the origin server goes down.
