struts-user mailing list archives

From Brendan Billingsley <>
Subject Re: [OT] S2 Stream Result, downloading large file, advice needed
Date Fri, 22 Aug 2008 22:23:45 GMT
I have a somewhat similar situation in an application I am creating, 
where I need to provide a .tar.gz file that I generate for the user. I 
want that file to hang around for some configurable amount of time in 
case the user re-requests it, and then I want to remove it.

I used Quartz to periodically run a cleanup method that removes old files. 
Right now it runs once a day and deletes any file older than 72 
hours, but that can all be changed later in my Spring configuration if 
we end up with too many files sticking around.
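The cleanup method described above could be sketched as follows. This is a minimal illustration using only the standard library; the Quartz/Spring scheduling is omitted, and the class and method names are mine, not the poster's:

```java
import java.io.*;

// Sketch of a periodic cleanup task: delete regular files in a directory
// whose last-modified time is older than a configurable cutoff. In the
// poster's setup, Quartz would invoke this once a day with a 72-hour cutoff.
class TempFileCleaner {
    // Returns the number of files deleted.
    static int deleteOlderThan(File dir, long maxAgeMillis) {
        long cutoff = System.currentTimeMillis() - maxAgeMillis;
        int deleted = 0;
        File[] files = dir.listFiles();
        if (files == null) {
            return 0;  // dir missing or not a directory
        }
        for (File f : files) {
            if (f.isFile() && f.lastModified() < cutoff && f.delete()) {
                deleted++;
            }
        }
        return deleted;
    }
}
```

A Quartz job (or any other scheduler) would simply call `deleteOlderThan(downloadDir, 72L * 3600 * 1000)` on its trigger.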

I am extending the StreamResult class to return my tar file for download.

Hope this helps. I can provide more detail on some or all of the above 
if you want. I don't think the details of Quartz and/or Spring really 
belong on this list, so if you want information about those, feel 
free to contact me off-list.


Greg Lindholm wrote:
> I'm still interested in hearing any suggestions. I know this is not
> strictly an S2 issue, but it is related to using a Stream result type.  If I
> were writing a servlet I would just get the OutputStream and write directly
> to it, putting the burden of buffering etc. onto the container.
> What I decided to do (until I hear a better suggestion) is to write to a
> temporary file (File.createTempFile()), then for the InputStream I created a
> wrapper around FileInputStream which deletes the file when close() is
> called.  This seems to work fine under the conditions I can test; I'm not
> sure whether close() will be called if the user hits Cancel in the middle of
> a very large download.
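The self-deleting wrapper Greg describes could be sketched like this. It is a minimal illustration, not his actual code, and the class name is mine:

```java
import java.io.*;

// A FileInputStream over a temp file that deletes the backing file when
// close() is called -- i.e. once the container finishes streaming the
// download (or aborts it and closes the result's InputStream).
class DeleteOnCloseFileInputStream extends FileInputStream {
    private final File file;

    DeleteOnCloseFileInputStream(File file) throws FileNotFoundException {
        super(file);
        this.file = file;
    }

    @Override
    public void close() throws IOException {
        try {
            super.close();
        } finally {
            file.delete();  // best-effort removal; ignore failure
        }
    }
}
```

As Greg notes, this relies on the container closing the stream in all cases; pairing it with a periodic cleanup sweep (as Brendan does) covers files whose close() never ran.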
> Greg Lindholm wrote:
>> Hi Folks,  
>> I'm using S2 Stream result type to allow users to download a CSV "file"
>> that I dynamically write from records selected from a database.
>> First pass on this, I'm using a StringWriter to write out the CSV data
>> (using SuperCSV), then using the string to build a ByteArrayInputStream
>> for the InputStream result.
>> My concern is that the results could get very large depending on the
>> selection criteria the user supplies, so trying to do this all in-memory
>> with a StringWriter -> String -> byte[] is likely a bad idea.
>> I'm looking for advice on a better way of doing this.
>> I thought I could write to a temp file, but how do I ensure the file is
>> deleted when the download completes?
>> I also thought about using a PipedInputStream and hooking to a
>> PipedOutputStream which I write to in another thread, but this seems like
>> it could become overly complicated having to deal with the threads and
>> database sessions and error conditions etc.
>> Do you know a better way?
>> Any advice or suggestions would be appreciated.
>> Thanks
>> Greg

To unsubscribe, e-mail:
For additional commands, e-mail:
