Disabling response buffering in ASP.NET Core

Asked by Pet*_*ano (12 votes) · c# asp.net asp.net-web-api asp.net-core asp.net-core-webapi

I'm trying to stream a large, dynamically built JSON file to the client (potentially 500 MB+). For various reasons I want to disable response buffering, mainly for memory efficiency.

I've tried writing directly to HttpContext.Response.BodyWriter, but the response appears to be buffered in memory before being written to the output. The return type of this method is Task:

HttpContext.Response.ContentType = "application/json";
HttpContext.Response.ContentLength = null;
await HttpContext.Response.StartAsync(cancellationToken);
var bodyStream = HttpContext.Response.BodyWriter.AsStream(true);
await bodyStream.WriteAsync(Encoding.UTF8.GetBytes("["), cancellationToken);
await foreach (var item in cursor.WithCancellation(cancellationToken)
    .ConfigureAwait(false))
{
    await bodyStream.WriteAsync(JsonSerializer.SerializeToUtf8Bytes(item, DefaultSettings.JsonSerializerOptions), cancellationToken);
    await bodyStream.WriteAsync(Encoding.UTF8.GetBytes(","), cancellationToken);
    
    await bodyStream.FlushAsync(cancellationToken);
    await Task.Delay(100, cancellationToken);
}
await bodyStream.WriteAsync(Encoding.UTF8.GetBytes("]"), cancellationToken);
bodyStream.Close();
await HttpContext.Response.CompleteAsync().ConfigureAwait(false);

Note: I realize this code is hacky. I'm trying to get it working first, then clean it up.

I'm using Task.Delay to verify that the response isn't being buffered when testing locally, since I don't have the full production data. I also tried IAsyncEnumerable with yield return, but that failed because the response was so large that Kestrel decided the enumerable was infinite.
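For reference, the IAsyncEnumerable/yield return attempt looked roughly like this. This is a hypothetical sketch: the action route, the `Item` type, and the `cursor` source are stand-ins for the real code.

```csharp
// Hypothetical sketch of the IAsyncEnumerable + yield return attempt.
// Requires: using System.Runtime.CompilerServices; for [EnumeratorCancellation].
[HttpGet("stream-enumerable")]
public async IAsyncEnumerable<Item> StreamItems(
    [EnumeratorCancellation] CancellationToken cancellationToken)
{
    await foreach (var item in cursor.WithCancellation(cancellationToken))
    {
        // MVC serializes the sequence as a JSON array while items are produced
        yield return item;
    }
}
```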

What I've tried:

  1. Setting KestrelServerLimits.MaxResponseBufferSize to a small number, and even to 0;
  2. Writing with HttpContext.Response.WriteAsync;
  3. Writing with HttpContext.Response.BodyWriter.AsStream();
  4. Writing with the PipeWriter pattern and HttpContext.Response.BodyWriter;
  5. Removing all middleware;
  6. Removing the call to IApplicationBuilder.UseResponseCompression.
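For item 1, the Kestrel limit was configured along these lines (a sketch using the generic host; per the API docs, null means unlimited and 0 disables Kestrel's response buffer entirely):

```csharp
// Sketch: setting KestrelServerLimits.MaxResponseBufferSize (item 1 above).
// A value of 0 disables Kestrel's response buffer; null means unlimited.
public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .ConfigureWebHostDefaults(webBuilder =>
        {
            webBuilder.ConfigureKestrel(options =>
            {
                options.Limits.MaxResponseBufferSize = 0;
            });
            webBuilder.UseStartup<Startup>();
        });
```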

Update

  1. Attempting to disable response buffering before setting the ContentType (i.e. before any writes to the response) had no effect:
var responseBufferingFeature = context.Features.Get<IHttpResponseBodyFeature>();
responseBufferingFeature?.DisableBuffering();

Updated sample code

This reproduces the problem very simply. The client receives no data until response.CompleteAsync() is called.

[HttpGet]
[Route("stream")]
public async Task<EmptyResult> FileStream(CancellationToken cancellationToken)
{
    var response = DisableResponseBuffering(HttpContext);
    HttpContext.Response.Headers.Add("Content-Type", "application/gzip");
    HttpContext.Response.Headers.Add("Content-Disposition", $"attachment; filename=\"player-data.csv.gz\"");
    await response.StartAsync().ConfigureAwait(false);
    var memory = response.Writer.GetMemory(1024*1024*10);
    response.Writer.Advance(1024*1024*10);
    await response.Writer.FlushAsync(cancellationToken).ConfigureAwait(false);
    await Task.Delay(5000).ConfigureAwait(false);
    var str2 = Encoding.UTF8.GetBytes("Bar!\r\n");
    memory = response.Writer.GetMemory(str2.Length);
    str2.CopyTo(memory);
    response.Writer.Advance(str2.Length);
    await response.CompleteAsync().ConfigureAwait(false);
    return new EmptyResult();
}

private IHttpResponseBodyFeature DisableResponseBuffering(HttpContext context)
{
    var responseBufferingFeature = context.Features.Get<IHttpResponseBodyFeature>();
    responseBufferingFeature?.DisableBuffering();
    return responseBufferingFeature;
}

Answered by Ale*_*yev (1 vote)

Try disabling buffering via the response body feature:

HttpContext.Features.Get<IHttpResponseBodyFeature>().DisableBuffering()
//As mentioned in documentation, to take effect, call it before any writes

and use the BodyWriter with Utf8JsonWriter for better efficiency:

 var pipe = context.HttpContext.Response.BodyWriter;

 // Pre-encoded JSON array delimiters; `option` is the JsonSerializerOptions in use.
 var startArray = Encoding.UTF8.GetBytes("[");
 var endArray = Encoding.UTF8.GetBytes("]");
 var comma = Encoding.UTF8.GetBytes(",");

 await pipe.WriteAsync(startArray);
 using (var writer = new Utf8JsonWriter(pipe,
            new JsonWriterOptions
            {
                Indented = option.WriteIndented,
                Encoder = option.Encoder,
                SkipValidation = true
            }))
 {
      var commaWritten = false;
      foreach (var item in enumerable)
      {
           if (commaWritten)
               await pipe.WriteAsync(comma);
           JsonSerializer.Serialize(writer, item, itemType, option);
           await pipe.FlushAsync();
           writer.Reset(); // reset so the next item starts a fresh JSON document
           commaWritten = true;
      }
 }
 await pipe.WriteAsync(endArray);

In my case, the result was: after the first request, total memory allocation was more than 80% higher than under netcoreapp2.2, but there were no further memory leaks.
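Putting the answer's two steps together: the feature-based DisableBuffering call must run before the first write to the response. A minimal sketch of how the pieces fit in a controller action (route and names are illustrative):

```csharp
// Sketch combining the answer's two steps: disable buffering first,
// then stream JSON through the response PipeWriter.
[HttpGet("stream")]
public async Task StreamJson(CancellationToken cancellationToken)
{
    // Must run before anything is written to the response
    HttpContext.Features.Get<IHttpResponseBodyFeature>()?.DisableBuffering();
    HttpContext.Response.ContentType = "application/json";

    var pipe = HttpContext.Response.BodyWriter;
    await pipe.WriteAsync(Encoding.UTF8.GetBytes("["), cancellationToken);
    // ... write items with Utf8JsonWriter per the answer's loop, flushing per item ...
    await pipe.WriteAsync(Encoding.UTF8.GetBytes("]"), cancellationToken);
    await HttpContext.Response.CompleteAsync();
}
```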