How can I upload large (2GB+) files from a form to a .NET Core API controller?

Rom*_*san 6 .net c# core azure azure-storage-blobs

When uploading a large file via Postman (I hit the same problem from a frontend form written in PHP), I get a 502 Bad Gateway error from the Azure Web App:

502 - Web server received an invalid response while acting as a gateway or proxy server. There is a problem with the page you are looking for, and it cannot be displayed. When the Web server (while acting as a gateway or proxy) contacted the upstream content server, it received an invalid response from the content server.

The error I see in Azure Application Insights:

Microsoft.AspNetCore.Connections.ConnectionResetException: The client has disconnected <--- An operation was attempted on a nonexistent network connection. (Exception from HRESULT: 0x800704CD)

This happens when I try to upload a 2GB test file. With a 1GB file it works fine, but it needs to handle files of up to about 5GB.

I have already optimized the part that writes the file stream to Azure blob storage by using a block-write approach (credit: https://www.red-gate.com/simple-talk/cloud/platform-as-a-service/azure-blob-storage-part-4-uploading-large-blobs/). To me, though, it looks like the connection to the client (Postman in this case) is being closed: since this is a single HTTP POST request, the underlying Azure network stack (e.g. the load balancer) appears to close the connection because it takes too long until my API returns HTTP 200 OK for the POST request.

Is my assumption correct? If so, how can I make the frontend (or Postman) upload the data in chunks (e.g. 15MB) that the API can acknowledge individually, instead of waiting for the whole 2GB? Even creating a SAS URL for uploading to the Azure blob and returning that URL to the browser would be fine, but I'm not sure how to integrate that easily - and as far as I know there is a maximum block size, so for 2GB I would probably need to create multiple blocks. If that is the suggestion, a good sample would be great, but other ideas are welcome too!

Here is the relevant part of my API controller endpoint in C# .NET Core 2.2:

        [AllowAnonymous]
        [HttpPost("DoPost")]
        public async Task<IActionResult> InsertFile([FromForm]List<IFormFile> files, [FromForm]string msgTxt)
        {
             ...

                        // use generated container name
                        CloudBlobContainer container = blobClient.GetContainerReference(SqlInsertId);

                        // create container within blob
                        if (await container.CreateIfNotExistsAsync())
                        {
                            await container.SetPermissionsAsync(
                                new BlobContainerPermissions
                                {
                                    // PublicAccess = BlobContainerPublicAccessType.Blob
                                    PublicAccess = BlobContainerPublicAccessType.Off
                                }
                                );
                        }

                        // loop through all files for upload
                        foreach (var asset in files)
                        {
                            if (asset.Length > 0)
                            {

                                // replace invalid chars in filename
                                CleanFileName = String.Empty;
                                CleanFileName = Utils.ReplaceInvalidChars(asset.FileName);

                                // get name and upload file
                                CloudBlockBlob blockBlob = container.GetBlockBlobReference(CleanFileName);


                                // START of block write approach

                                //int blockSize = 256 * 1024; //256 kb
                                //int blockSize = 4096 * 1024; //4MB
                                int blockSize = 15360 * 1024; //15MB

                                using (Stream inputStream = asset.OpenReadStream())
                                {
                                    long fileSize = inputStream.Length;

                                    //block count is the number of blocks + 1 for the last one
                                    int blockCount = (int)((float)fileSize / (float)blockSize) + 1;

                                    //List of block ids; the blocks will be committed in the order of this list 
                                    List<string> blockIDs = new List<string>();

                                    //starting block number - 1
                                    int blockNumber = 0;

                                    try
                                    {
                                        int bytesRead = 0; //number of bytes read so far
                                        long bytesLeft = fileSize; //number of bytes left to read and upload

                                        //do until all of the bytes are uploaded
                                        while (bytesLeft > 0)
                                        {
                                            blockNumber++;
                                            int bytesToRead;
                                            if (bytesLeft >= blockSize)
                                            {
                                                //more than one block left, so put up another whole block
                                                bytesToRead = blockSize;
                                            }
                                            else
                                            {
                                                //less than one block left, read the rest of it
                                                bytesToRead = (int)bytesLeft;
                                            }

                                            //create a blockID from the block number, add it to the block ID list
                                            //the block ID is a base64 string
                                            string blockId =
                                              Convert.ToBase64String(ASCIIEncoding.ASCII.GetBytes(string.Format("BlockId{0}",
                                                blockNumber.ToString("0000000"))));
                                            blockIDs.Add(blockId);
                                            //set up a buffer of the right size and fill it completely;
                                            //Stream.Read may return fewer bytes than requested, so keep
                                            //reading until the block buffer is full
                                            byte[] bytes = new byte[bytesToRead];
                                            int offset = 0;
                                            while (offset < bytesToRead)
                                            {
                                                int read = inputStream.Read(bytes, offset, bytesToRead - offset);
                                                if (read == 0) break; //end of stream reached early
                                                offset += read;
                                            }

                                            //calculate the MD5 hash of the byte array
                                            string blockHash = Utils.GetMD5HashFromStream(bytes);

                                            //upload the block, provide the hash so Azure can verify it
                                            blockBlob.PutBlock(blockId, new MemoryStream(bytes), blockHash);

                                            //increment/decrement counters
                                            bytesRead += bytesToRead;
                                            bytesLeft -= bytesToRead;
                                        }

                                        //commit the blocks
                                        blockBlob.PutBlockList(blockIDs);

                                    }
                                    catch (Exception ex)
                                    {
                                        System.Diagnostics.Debug.Print("Exception thrown = {0}", ex);
                                        // return BadRequest(ex.StackTrace);
                                    }
                                }

                                // END of block write approach
...

Here is an example of the HTTP POST via Postman:

(Postman screenshot)

For testing I have set maxAllowedContentLength and requestTimeout in web.config:

    <requestLimits maxAllowedContentLength="4294967295" />

    <aspNetCore processPath="%LAUNCHER_PATH%" arguments="%LAUNCHER_ARGS%" stdoutLogEnabled="false" stdoutLogFile=".\logs\stdout" requestTimeout="00:59:59" hostingModel="InProcess" />

Sta*_*ong 5

If you want to upload large blob files to Azure Storage, a better option is to get a SAS token from your backend and upload the file directly from the client; that way the upload does not add to your backend workload. You can use the code below to create a SAS token with two hours of write permission for your client:

    var containerName = "<container name>";
    var accountName = "<storage account name>";
    var key = "<storage account key>";
    var cred = new StorageCredentials(accountName, key);
    var account = new CloudStorageAccount(cred,true);
    var container = account.CreateCloudBlobClient().GetContainerReference(containerName);

    // write-only access policy that is valid for two hours
    var writeOnlyPolicy = new SharedAccessBlobPolicy() { 
        SharedAccessStartTime = DateTime.Now,
        SharedAccessExpiryTime = DateTime.Now.AddHours(2),
        Permissions = SharedAccessBlobPermissions.Write
    };

    // ad-hoc SAS for the container, e.g. "?sv=...&sr=c&sig=...&sp=w"
    var sas = container.GetSharedAccessSignature(writeOnlyPolicy);
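
If you want your .NET Core API to hand this SAS to the browser, a minimal sketch of such an endpoint could look like the one below; the route name, the anonymous access and the credential handling are only illustrative assumptions, and the placeholders are the same as above:

    // Hypothetical controller action that returns the container URL and a write-only SAS
    // to the browser. Replace the placeholders with your own configuration handling.
    [AllowAnonymous]
    [HttpGet("GetUploadSas")]
    public IActionResult GetUploadSas()
    {
        var cred = new StorageCredentials("<storage account name>", "<storage account key>");
        var account = new CloudStorageAccount(cred, true);
        var container = account.CreateCloudBlobClient().GetContainerReference("<container name>");

        // same write-only, two-hour policy as above
        var writeOnlyPolicy = new SharedAccessBlobPolicy()
        {
            SharedAccessStartTime = DateTime.Now,
            SharedAccessExpiryTime = DateTime.Now.AddHours(2),
            Permissions = SharedAccessBlobPermissions.Write
        };

        // the browser appends the SAS string to the container URL (see the JS sample below)
        return Ok(new
        {
            containerUrl = container.Uri.ToString(),
            sasToken = container.GetSharedAccessSignature(writeOnlyPolicy)
        });
    }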

Once you have this SAS token, you can use it to upload the file from the client with the Azure Storage JS SDK. Here is an HTML sample:

<!DOCTYPE html> 
<html> 
<head> 
    <title> 
        upload demo
    </title> 

    <script src= 
"https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js"> 
    </script> 


    <script src= "./azure-storage-blob.min.js"> </script> 
</head> 

<body> 
    <div align="center"> 
        <form method="post" action="" enctype="multipart/form-data"
                id="myform"> 

            <div > 
                <input type="file" id="file" name="file" /> 
                <input type="button" class="button" value="Upload"
                        id="but_upload"> 
            </div> 
        </form> 
        <div id="status"></div>


    </div>   

    <script type="text/javascript"> 
        $(document).ready(function() { 


            var sasToken = '?sv=2018-11-09&sr=c&sig=XXXXXXXXXXXXXXXXXXXXXXXXXOuqHSrH0Fo%3D&st=2020-01-27T03%3A58%3A20Z&se=2020-01-28T03%3A58%3A20Z&sp=w'
            var containerURL = 'https://stanstroage.blob.core.windows.net/container1/'


            $("#but_upload").click(function() { 

                var file = $('#file')[0].files[0]; 
                const container = new azblob.ContainerURL(containerURL + sasToken, azblob.StorageURL.newPipeline(new azblob.AnonymousCredential));
                try {
                    $("#status").wrapInner("uploading .... pls wait");


                    const blockBlobURL = azblob.BlockBlobURL.fromContainerURL(container, file.name);
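                    // uploadBrowserDataToBlockBlob handles the chunking for you:
                    // large files are split into blocks and uploaded in parallel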
                    var result  = azblob.uploadBrowserDataToBlockBlob(
                            azblob.Aborter.none, file, blockBlobURL);

                    result.then(function(result) {
                        document.getElementById("status").innerHTML = "Done"
                        }, function(err) {
                            document.getElementById("status").innerHTML = "Error"
                            console.log(err); 
                        });


                } catch (error) {
                    console.log(error);
                }


            });
        }); 
    </script> 
</body> 

</html> 

It took me about 20 minutes to upload a 3.6GB .zip test file and it worked perfectly; the SDK opens multiple threads and uploads your large file part by part.

Note: in this case, make sure CORS is enabled on your storage account, otherwise the static HTML page cannot post requests to the Azure Storage service.
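
If you would rather configure CORS from code than in the portal, a rough sketch with the same storage SDK could look like the following; it reuses the account object from the SAS snippet above, and the wildcard origin is just a placeholder that you should narrow down to your page's origin:

    // Sketch: add a CORS rule to the blob service so the static page can call the storage REST API.
    var blobClient = account.CreateCloudBlobClient();
    var serviceProperties = await blobClient.GetServicePropertiesAsync();
    serviceProperties.Cors.CorsRules.Add(new CorsRule()
    {
        AllowedOrigins = new List<string> { "*" },   // placeholder - use your page's origin instead
        AllowedMethods = CorsHttpMethods.Put | CorsHttpMethods.Get | CorsHttpMethods.Head | CorsHttpMethods.Options,
        AllowedHeaders = new List<string> { "*" },
        ExposedHeaders = new List<string> { "*" },
        MaxAgeInSeconds = 3600
    });
    await blobClient.SetServicePropertiesAsync(serviceProperties);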

Hope this helps.