trafficserver-users mailing list archives

From: geo...@free.fr
Subject: Re: Cache Inspector Problem for Handling Huge number of Cache Objects
Date: Tue, 29 Jul 2014 18:12:28 GMT
Hi,

I ran into this problem earlier and have not given up on solving it. The solution I wrote
using shell/awk is not suitable for multiple millions of objects: it takes hours to purge
objects one by one, and it requires maintaining the list of all objects outside of Traffic Server.
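
To show what I mean by "one by one", here is roughly what that purge loop looks like, rewritten
in C with libcurl instead of my shell/awk version. The list file name (urls.txt) and the proxy
address are invented for the example, and it assumes PURGE requests are allowed on the
Traffic Server instance:

/* Hypothetical C/libcurl version of the one-by-one purge loop: read the
 * externally maintained URL list and send one PURGE request per object
 * through the Traffic Server proxy. */
#include <stdio.h>
#include <string.h>
#include <curl/curl.h>

int
main(void)
{
  FILE *list = fopen("urls.txt", "r");   /* list of cached URLs, one per line */
  if (list == NULL) {
    perror("urls.txt");
    return 1;
  }

  curl_global_init(CURL_GLOBAL_DEFAULT);
  CURL *curl = curl_easy_init();

  char url[4096];
  while (curl != NULL && fgets(url, sizeof(url), list) != NULL) {
    url[strcspn(url, "\r\n")] = '\0';    /* strip the trailing newline */
    if (url[0] == '\0')
      continue;

    curl_easy_setopt(curl, CURLOPT_URL, url);
    curl_easy_setopt(curl, CURLOPT_CUSTOMREQUEST, "PURGE");
    curl_easy_setopt(curl, CURLOPT_PROXY, "http://127.0.0.1:8080"); /* the ATS box */

    /* One round trip per object: with millions of entries this takes hours. */
    if (curl_easy_perform(curl) != CURLE_OK)
      fprintf(stderr, "PURGE failed for %s\n", url);
  }

  if (curl != NULL)
    curl_easy_cleanup(curl);
  curl_global_cleanup();
  fclose(list);
  return 0;
}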

I am currently trying to write a new plugin that works like the cache-key-genid plugin I tested
earlier, which was not sufficient for the platform I use. My goal is to store the configuration in
a file and keep it in memory (mostly inspired by the header_rewrite plugin). The plugin will
also be able to evaluate regexes on the host and path.
Now I see two possibilities (a rough sketch of the first one follows below):
1- simply increment an ID to invalidate all objects matching a rule, at the risk of the number
of rules growing over time
2- use the file as a ban list by adding a timestamp recording when each rule was last
used/called (a sort of Varnish ban list)
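
To make option 1 concrete, here is a very rough, untested sketch of the cache-key part, in the
spirit of the cache-key-genid plugin. It assumes the 4.x-era C API with TSCacheUrlSet() on a
POST_REMAP hook, and it uses a single hard-coded rule (the example.com regex and the starting
genid are invented) instead of the configuration file described above; bumping the genid makes
every matching object unreachable under its old key:

/* Sketch only, not tested: append a generation id to the cache key of every
 * request whose effective URL matches a rule, so that incrementing the id
 * invalidates all matching objects at once. */
#include <stdio.h>
#include <string.h>
#include <regex.h>
#include <ts/ts.h>

/* Hypothetical single rule; the real plugin would load these from a file. */
static const char *rule_pattern = "^http://example\\.com/title_[0-9]+/";
static int         rule_genid   = 1;
static regex_t     rule_re;

static int
genid_handler(TSCont contp, TSEvent event, void *edata)
{
  TSHttpTxn txnp    = (TSHttpTxn)edata;
  int       url_len = 0;
  char     *url     = TSHttpTxnEffectiveUrlStringGet(txnp, &url_len);

  (void)contp;
  (void)event;

  if (url != NULL) {
    char buf[8192];
    int  n = url_len < (int)sizeof(buf) - 1 ? url_len : (int)sizeof(buf) - 1;
    memcpy(buf, url, n);
    buf[n] = '\0';

    if (regexec(&rule_re, buf, 0, NULL, 0) == 0) {
      /* The genid becomes part of the cache key; bump it to invalidate. */
      char key[8448];
      snprintf(key, sizeof(key), "%s#genid=%d", buf, rule_genid);
      TSCacheUrlSet(txnp, key, strlen(key));
    }
    TSfree(url);
  }

  TSHttpTxnReenable(txnp, TS_EVENT_HTTP_CONTINUE);
  return 0;
}

void
TSPluginInit(int argc, const char *argv[])
{
  (void)argc;
  (void)argv;

  if (regcomp(&rule_re, rule_pattern, REG_EXTENDED | REG_NOSUB) != 0) {
    TSError("genid-sketch: cannot compile rule regex");
    return;
  }
  TSHttpHookAdd(TS_HTTP_POST_REMAP_HOOK, TSContCreate(genid_handler, NULL));
}

Matching at POST_REMAP means every rule regex runs on every request, which is exactly why I worry
about the number of rules growing over time.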

Maintaining this list can become a problem over time if it grows a lot (more than 500 rules
does not seem like a good thing). Objects could be invalidated and/or simply refreshed when the
client requests them, because the requested URL is known and therefore so is the hash key...

As I said before, I am not a C++ programmer, but I will try. I hope Lua keeps getting stronger,
so that writing new plugins to manipulate objects in the cache becomes easy. I am much more
comfortable with shell/awk scripting than with compiled code, and Lua seems like the right way for me.

Denis

References:
http://mail-archives.apache.org/mod_mbox/trafficserver-users and search "cache inspector alternative"
https://github.com/godaddy/ats-plugin-cache-key-genid


----- Original Message -----
> From: "Leif Hedstrom" <zwoop@apache.org>
> To: users@trafficserver.apache.org
> Sent: Tuesday, 29 July 2014 19:38:22
> Subject: Re: Cache Inspector Problem for Handling Huge number of Cache Objects
> 
> On Jul 29, 2014, at 11:22 AM, avinash katika <avinash.katika@gmail.com> wrote:
> 
> Hi... we ran into a problem when we attempted to use the Cache Inspector for
> regex lookup & delete of cached objects on an ATS cache VM (single
> object lookup and deletion was OK).

> 
> Following are our setup details:
> 
> Current Version: 4.2.1
> 
> “Pristine is enabled”
> 
> Cache Objects Present: ~150K
> 
> Cached bytes used: ~50GB
> 
> Sample CI regex lookup/deletion commands when kicked off from “curl”
> (we have titles title_001 to title_250 to mimic 250 VOD titles
> with ~600 objects per title):
> 
> - curl
> http://<app_id>/ci/lookup_regex?url=http://<sample_origin_domain>/title_199/*
> 
> - curl
> http://<app_id>/ci/delete_regex?url=http://<sample_origin_domain>/title_199/*
> 
> The short answer is, don’t use the cache inspector / regexes to
> manage the cache. It does not scale.
> 
> 
> — leif
> 
> 
