Hi all. I noticed that large file downloads were fixed a few revisions ago -- basically moving away from the in-memory ASP.NET method provided by MS to a chunked method. That works great, but uploading is still a problem. I've tried a variety of httpRuntime settings for maxRequestLength, executionTimeout and requestLengthDiskThreshold. Right now the settings should allow a 500MB maximum size, buffering to disk after 256KB, with a 2.5-hour execution timeout.
In web.config:
<httpRuntime useFullyQualifiedRedirectUrl="true" maxRequestLength="500000" requestLengthDiskThreshold="256" executionTimeout="9000" />
No matter how large I make these values, it fails somewhere in the 30MB to 40MB range -- at that point you get the symptoms of a request timeout, even though only about 10 minutes have elapsed. You get the "page unavailable" error (even though the upload progressed for 10 or 15 minutes). Since this is right around the range where uploads fail for many applications on the same server (due to memory limits), it looks like a memory problem -- the request stream is being read into memory rather than buffered to disk.
The requestLengthDiskThreshold attribute is a .NET 2.0 addition -- it doesn't exist in 1.1. Is it possible the file manager isn't running on .NET 2.0? Or is there some other reason it would ignore this setting in web.config? Or is there some other problem that causes this kind of failure?
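For what it's worth, here's a minimal sketch of how I'd expect the receiving side to save the upload in small chunks rather than relying on a fully buffered file -- the handler class and the "fileUpload" field name are just assumptions for illustration, not the actual file manager code. Note this only avoids holding the file in the page code's own memory; ASP.NET still buffers the request before this runs, which is exactly what requestLengthDiskThreshold is supposed to push to disk:

```csharp
// Hypothetical example -- "fileUpload" is an assumed form field name
// and C:\uploads an assumed target folder.
using System.IO;
using System.Web;

public class ChunkedUploadHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        HttpPostedFile posted = context.Request.Files["fileUpload"];
        if (posted == null)
            return;

        string target = Path.Combine(@"C:\uploads",
            Path.GetFileName(posted.FileName));

        // Copy the posted stream to disk 64KB at a time instead of
        // calling SaveAs on a stream that may be fully in memory.
        byte[] buffer = new byte[64 * 1024];
        using (Stream input = posted.InputStream)
        using (FileStream output = File.Create(target))
        {
            int read;
            while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
                output.Write(buffer, 0, read);
        }
    }

    public bool IsReusable { get { return false; } }
}
```

If the failure happens before code like this even executes, that would point back at the request buffering (or a proxy/IIS limit) rather than the save logic.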