How to Output a Folder's Contents as a Compressed Package File in ASP.NET Core

Keywords: ASP.NET network

This article describes a memory-saving way to output the contents of an entire folder as a compressed package: there is no need to allocate as much memory as the data being sent, and no need to create a compressed package file on disk first. The idea is to read the files one by one and write them out in compressed-package format.

Inside each request-handling method you can reach the HttpContext property, which exposes the Response property. Its BodyWriter property is where you write content, and whatever is written there is downloaded by the client.

This property can be used as a Stream; see the code below

     using var stream = HttpContext.Response.BodyWriter.AsStream();

In .NET, the ZipArchive class can write a folder's files in compressed-archive format, letting you set the compression level and so on, given the path of the folder where the files are located.

By creating a ZipArchive over this stream and creating entries in it, you can continuously send files to the client, all packed into one archive:

        /// <summary>
        /// Read the contents of a folder and write them to a stream as a zip archive
        /// </summary>
        /// <param name="directory"></param>
        /// <param name="stream"></param>
        public static async Task ReadDirectoryToZipStreamAsync(DirectoryInfo directory, Stream stream)
        {
            var fileList = directory.GetFiles();

            using var zipArchive = new ZipArchive(stream, ZipArchiveMode.Create);
            foreach (var file in fileList)
            {
                // Entry names are paths relative to the folder being packed
                var relativePath = file.FullName.Replace(directory.FullName, "");
                if (relativePath.StartsWith("\\") || relativePath.StartsWith("/"))
                {
                    relativePath = relativePath.Substring(1);
                }

                var zipArchiveEntry = zipArchive.CreateEntry(relativePath, CompressionLevel.NoCompression);

                using (var entryStream = zipArchiveEntry.Open())
                {
                    // Copy the file into the entry stream, not the outer stream,
                    // so the zip structure stays well-formed
                    using var toZipStream = file.OpenRead();
                    await toZipStream.CopyToAsync(entryStream);
                }

                await stream.FlushAsync();
            }
        }
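To convince yourself the technique produces a valid archive, here is a minimal, self-contained sketch that writes the archive to a local file instead of the response body and re-opens it with ZipFile.OpenRead. It inlines a simplified copy of the method above; the temp paths and file names (a.txt, b.txt) are placeholders I made up for the demo.

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Threading.Tasks;

// Build a throw-away folder with two small files (paths are placeholders)
var root = Directory.CreateDirectory(Path.Combine(Path.GetTempPath(), Path.GetRandomFileName()));
File.WriteAllText(Path.Combine(root.FullName, "a.txt"), "hello");
File.WriteAllText(Path.Combine(root.FullName, "b.txt"), "world");

// Write the archive to a file instead of the response body
var zipPath = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName() + ".zip");
await using (var output = File.Create(zipPath))
{
    await ReadDirectoryToZipStreamAsync(root, output);
}

// Re-open the result to prove the stream is a well-formed zip
using (var archive = ZipFile.OpenRead(zipPath))
{
    Console.WriteLine(archive.Entries.Count); // 2
}

static async Task ReadDirectoryToZipStreamAsync(DirectoryInfo directory, Stream stream)
{
    // leaveOpen: true because the caller owns the output stream here
    using var zipArchive = new ZipArchive(stream, ZipArchiveMode.Create, leaveOpen: true);
    foreach (var file in directory.GetFiles())
    {
        var entry = zipArchive.CreateEntry(file.Name, CompressionLevel.NoCompression);
        using var entryStream = entry.Open();
        using var toZipStream = file.OpenRead();
        await toZipStream.CopyToAsync(entryStream);
    }
}
```

The same stream that ZipFile.OpenRead accepts here is what the client's unzip tool sees on the other end of the connection.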

The code above lets a running program read and send local files without needing as much memory as the data being transferred, and without performing compression on the local files first. When reading the local files, CopyToAsync sets the buffer size automatically. If you are not confident about the buffer size CopyToAsync chooses, an overload lets you set it manually:

      await toZipStream.CopyToAsync(entryStream, bufferSize: 100);

The code above stores the files without compression, because my business actually transfers on the intranet. My disk reads at about 20 MB per second while the network transfers at about 10 MB per second, so compression gains little here: the transmission time it saves is roughly offset by the time spent compressing.

If you do need compression during transfer, change the CompressionLevel passed to the ZipArchive.CreateEntry method.
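As a quick illustration of that one-line change, the sketch below builds the same single-entry archive twice, once stored and once with CompressionLevel.Optimal, and compares the resulting sizes. The payload and entry name (data.bin) are made up for the demo.

```csharp
using System;
using System.IO;
using System.IO.Compression;

// A highly compressible payload: 100 000 zero bytes
var payload = new byte[100_000];

long ArchiveSize(CompressionLevel level)
{
    using var buffer = new MemoryStream();
    using (var zip = new ZipArchive(buffer, ZipArchiveMode.Create, leaveOpen: true))
    {
        var entry = zip.CreateEntry("data.bin", level);
        using var entryStream = entry.Open();
        entryStream.Write(payload, 0, payload.Length);
    }
    return buffer.Length;
}

long stored = ArchiveSize(CompressionLevel.NoCompression);
long deflated = ArchiveSize(CompressionLevel.Optimal);
Console.WriteLine(stored > deflated); // True
```

Whether the CPU cost is worth it depends on the disk-versus-network speed trade-off described above.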

Of course, this method has a drawback: if a file changes on the server while it is being transferred, the wrong content is sent, and the client cannot detect it, because the compressed size is never told to the client. To tell the client the exact compressed size you would have to compress on the server side first. Since this method stores entries without compression, an approximate size can still be communicated to the user.
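Because entries are stored uncompressed, the archive body is roughly the sum of the file lengths, so that sum is a usable approximation. A sketch of computing it (the folder, file names, and any header you might carry it in, such as an "X-Approximate-Size" custom header, are assumptions of mine, not a standard):

```csharp
using System;
using System.IO;
using System.Linq;

// Hypothetical folder with files of known sizes (paths are placeholders)
var dir = Directory.CreateDirectory(Path.Combine(Path.GetTempPath(), Path.GetRandomFileName()));
File.WriteAllText(Path.Combine(dir.FullName, "a.txt"), "hello");   // 5 bytes
File.WriteAllText(Path.Combine(dir.FullName, "b.txt"), "world!!"); // 7 bytes

// With NoCompression the archive body is roughly the sum of the file
// lengths; this figure could be reported to the client up front,
// e.g. in a custom response header agreed on with the client
long approximateSize = dir.GetFiles().Sum(f => f.Length);
Console.WriteLine(approximateSize); // 12
```

The real archive is slightly larger than this sum because of per-entry headers and the central directory.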

How can this method be used? The Response property can be reached through HttpContext in any Get method.

The StatusCode value needs to be set before writing through the BodyWriter:

            HttpContext.Response.StatusCode = StatusCodes.Status200OK;

            using var stream = HttpContext.Response.BodyWriter.AsStream();

Suppose the folder you want to return is f:\lindexi\test\. You can output that folder as a zip package with the following code:

        [HttpGet]
        [Route("{id}")]
        public async Task Get([FromRoute] string id)
        {
            var folder = @"f:\lindexi\test\";
            HttpContext.Response.StatusCode = StatusCodes.Status200OK;

            using var stream = HttpContext.Response.BodyWriter.AsStream();

            await ReadDirectoryToZipStreamAsync(new DirectoryInfo(folder), stream);
        }

Locally I wrote a PowerShell script to test it:

     For ($i=0; $i -le 100000; $i++)
     {
         (new-object System.Net.WebClient).DownloadFile("http://localhost:5000/File/doubi", "F:\lindexi\zip\2.zip")
     }

Running this script locally shows that memory neither piles up for the GC nor overflows; in my run the process stayed at about 100 MB.

This approach spends some CPU resources but saves memory.

If you have a better way, please let me know.

Posted by dearmoawiz on Mon, 15 Jun 2020 18:14:04 -0700