Strategy and how to create request and response pipes in .NET Core 3.0 middleware
I am developing a .NET Core middleware (API) and am thinking of using pipes with the following flow. Can someone tell me whether this is a good approach that complies with best practices, or should I use a different strategy?
I know that we can read the request stream only once (point 3), but I have already worked around this: after reading it, I attach it back to the request stream again.
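For context, re-reading the request body in ASP.NET Core 3.0 is done with `EnableBuffering()`. This is a minimal sketch of that pattern, not code from the question itself; the helper class name is an arbitrary choice for illustration:

```csharp
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

public static class RequestBodyReader
{
    // Reads the request body as a string, then rewinds the stream so later
    // components (model binding, other middleware) can read it again.
    public static async Task<string> ReadBodyAsync(HttpRequest request)
    {
        request.EnableBuffering(); // buffer the body so it can be rewound

        using var reader = new StreamReader(
            request.Body, Encoding.UTF8,
            detectEncodingFromByteOrderMarks: false,
            bufferSize: 1024,
            leaveOpen: true); // do not dispose the underlying request stream

        var body = await reader.ReadToEndAsync();

        request.Body.Position = 0; // rewind for the next reader
        return body;
    }
}
```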
So the confusion is: where should I write the response? In the API, or in a separate pipe?
If I do it in a separate pipe, then I am handling my response twice (once creating the response in the API, and a second time reading it in the separate pipe), which is a performance hit.
Can I pass the data from point 4 to point 5, i.e. from the API to my pipe, and have the pipe add that response to the response stream? If that is the correct approach, how can I pass the data from the API to the pipe?
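One common way to hand data from the API layer to a custom middleware in the same request is `HttpContext.Items`, a per-request key/value store. This is a hypothetical sketch, not the questioner's code; the key name `"ResponsePayload"` and the class name are arbitrary assumptions:

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

public class ResponseWritingMiddleware
{
    private readonly RequestDelegate _next;

    public ResponseWritingMiddleware(RequestDelegate next) => _next = next;

    public async Task Invoke(HttpContext context)
    {
        // Run the rest of the pipeline first (including the API action).
        await _next(context);

        // After the API has run, read what it stored for this request.
        if (context.Items.TryGetValue("ResponsePayload", out var payload))
        {
            await context.Response.WriteAsync((string)payload);
        }
    }
}

// Inside a controller action, store the payload instead of writing it twice:
//   HttpContext.Items["ResponsePayload"] = "serialized response";
```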
Yes, the response stream can only be read once. You can use a MemoryStream to load the response; reference article:
First, read the request and format it into a string. 首先,读取请求并将其格式化为字符串。
Next, create a dummy MemoryStream to load the new response into. 接下来,创建一个虚拟MemoryStream以将新响应加载到。
Then, wait for the server to return a response. 然后,等待服务器返回响应。
Code sample:
using System;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

public class RequestResponseLoggingMiddleware
{
    private readonly RequestDelegate _next;

    public RequestResponseLoggingMiddleware(RequestDelegate next)
    {
        _next = next;
    }

    public async Task Invoke(HttpContext context)
    {
        //First, get the incoming request
        var request = await FormatRequest(context.Request);

        //Copy a pointer to the original response body stream
        var originalBodyStream = context.Response.Body;

        //Create a new memory stream...
        using (var responseBody = new MemoryStream())
        {
            //...and use that for the temporary response body
            context.Response.Body = responseBody;

            //Continue down the middleware pipeline, eventually returning to this class
            await _next(context);

            //Format the response from the server
            var response = await FormatResponse(context.Response);
            //TODO: Save log to chosen datastore

            //Copy the contents of the new memory stream (which contains the response)
            //to the original stream, which is then returned to the client.
            await responseBody.CopyToAsync(originalBodyStream);
        }
    }

    private async Task<string> FormatRequest(HttpRequest request)
    {
        //Allow the request body to be read more than once.
        //(In ASP.NET Core 3.0, EnableBuffering() replaces the removed EnableRewind().)
        request.EnableBuffering();

        //Read the request stream into a byte[] with the same length as the request body...
        var buffer = new byte[Convert.ToInt32(request.ContentLength)];
        await request.Body.ReadAsync(buffer, 0, buffer.Length);

        //...convert the byte[] into a string using UTF8 encoding...
        var bodyAsText = Encoding.UTF8.GetString(buffer);

        //...and rewind the stream so the rest of the pipeline can read the body again.
        request.Body.Position = 0;

        return $"{request.Scheme} {request.Host}{request.Path} {request.QueryString} {bodyAsText}";
    }

    private async Task<string> FormatResponse(HttpResponse response)
    {
        //We need to read the response stream from the beginning...
        response.Body.Seek(0, SeekOrigin.Begin);

        //...and copy it into a string
        string text = await new StreamReader(response.Body).ReadToEndAsync();

        //Reset the stream position so the body can be copied back to the client.
        response.Body.Seek(0, SeekOrigin.Begin);

        //Return the string for the response, including the status code (e.g. 200, 404, 401, etc.)
        return $"{response.StatusCode}: {text}";
    }
}
And register the middleware:
app.UseMiddleware<RequestResponseLoggingMiddleware>();
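Middleware order matters: to capture the full response, the logging middleware should be registered early in `Startup.Configure`, before routing and endpoints, so it wraps everything that runs after it. A minimal sketch, assuming a standard ASP.NET Core 3.0 controller setup:

```csharp
using Microsoft.AspNetCore.Builder;

public class Startup
{
    public void Configure(IApplicationBuilder app)
    {
        // Register logging first so it sees every request and response.
        app.UseMiddleware<RequestResponseLoggingMiddleware>();

        app.UseRouting();
        app.UseEndpoints(endpoints => endpoints.MapControllers());
    }
}
```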