httpd-users mailing list archives

From "Rosco Schock" <>
Subject [users@httpd] Caching for concurrent requests
Date Mon, 09 Oct 2006 13:04:05 GMT
Hello all,

I have a question and was wondering if there is a way to configure Apache and/or some combination of mods to help with the following.

I have a Data Warehouse vendor that has many, many terabytes of information. The data is organized in zip files, with each zip containing many different individual data files. When a request is received for a data file, the system determines the correct zip file and downloads it. The zip is then unzipped onto local disk and cached for use by other users. This part of the system is working as expected.

Many times the user requests are for different data files that just happen to be in the same zip file. So, I end up with many requests to my service for the same zip file. This is putting a lot of pressure on their servers and wasting a lot of bandwidth. Now, I know that I can write a server that will proxy and cache all the requests to the vendor and block on concurrent requests for the same zip file. Unfortunately, I don't have time in the short term to write that server, get it through QA, and deploy it in time.
I can get another server out in the system to act as this proxy/cache. Is there a way to configure Apache (maybe using mod_cache and/or other modules) not only to cache completed requests but also to force it to block on concurrent ones for the same zip file? Is there a better, fairly out-of-the-box way to achieve this kind of concurrent caching with Apache?
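For the caching half of the question, a reverse proxy with a disk cache is the standard mod_proxy/mod_cache setup. A minimal sketch (the vendor hostname and paths are illustrative, not from this post); note that stock mod_cache caches completed responses but does not, by itself, serialize concurrent misses for the same URL:

```apache
# Hypothetical reverse-proxy cache in front of the vendor.
LoadModule proxy_module modules/mod_proxy.so
LoadModule proxy_http_module modules/mod_proxy_http.so
LoadModule cache_module modules/mod_cache.so
LoadModule disk_cache_module modules/mod_disk_cache.so

# Forward zip requests to the vendor and cache the responses on disk.
ProxyPass        /zips/ http://vendor.example.com/zips/
ProxyPassReverse /zips/ http://vendor.example.com/zips/

CacheEnable disk /zips/
CacheRoot /var/cache/apache2/zips
CacheMaxFileSize 2147483648   # allow large zip files (bytes)
```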
Thanks for taking the time to read this and respond. Email me with any questions.

The official User-To-User support forum of the Apache HTTP Server Project.
