subversion-users mailing list archives

From "Cooke, Mark" <>
Subject RE: how large data will affect performance?
Date Tue, 09 Oct 2012 09:21:15 GMT
> -----Original Message-----
> From: Thorsten Schöning [] 
> Sent: 09 October 2012 10:17
> To: ''
> Subject: Re: how large data will affect performance?
> Good day wang.yu,
> on Tuesday, 9 October 2012 at 03:35 you wrote:
> >     I have an SVN server on Windows 2003.
> >  Now the developers want to check in about 10 GB of data to it, and
> > over the next few months they will check in about 100 GB of data.
> >  Will that much data affect the server's performance?
> That depends heavily on what the data is used for after the commit. If
> it is committed once and never read or updated again, the commit itself
> will consume resources while it is processed, but afterwards the
> revision only occupies the space it needs on your hard drive and may
> never be accessed again. Maintenance work on the repository itself,
> such as dump and load cycles, is of course affected.

Just a thought, but depending on what the data is, you could consider creating a separate repository
for that data (and use an svn:externals definition to pull whatever is required into a working copy).
That might make maintenance of your existing repository(ies) easier in the future.
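As a rough sketch of that approach (the repository URL and directory names below are made up for illustration), an svn:externals property on the project root maps a directory in the working copy to the separate data repository, one definition per line in the form "local-dir URL":

```
data/large  https://svn.example.com/repos/bigdata/trunk
```

The definition would typically be saved to a file and applied with `svn propset svn:externals -F externals.txt .`, then committed; a subsequent `svn update` fetches the external into the working copy. Pinning the external to a fixed revision (`-r N` before the URL) keeps checkouts reproducible if the data repository keeps growing.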

~ mark c