
C# async technique for writing to log file but first wait for previous log I/O to finish

I have a need to write to a log file on occasion, sometimes a small flurry of rapid log requests, but don't want to wait for the I/O. However, what I DO want to wait for is for the I/O to complete (as in, stream closed) before the NEXT log entry is written. So if the first log I/O request is busy, further I/O requests will politely wait in line for their turn and not stomp all over each other.

I've cobbled together an idea; is there any reason why this won't work? I'm using .NET Framework 4.7.2 and 4.8 in an ASP.NET MVC web app.

I've defined a static Task t elsewhere so it's global to the app.

public static void ErrorLog(string file, string error)
{
    if (t != null)
        t.Wait();
    //using file system async - doesn't use thread pool
    var f = new FileStream(Path.Combine(HttpRuntime.AppDomainAppPath, "logs", file), FileMode.Append, FileAccess.Write, FileShare.None, bufferSize: 4096, useAsync: true);
    var sWriter = new StreamWriter(f);
    t = sWriter.WriteLineAsync($"### {error}").ContinueWith(c => sWriter.Close());
}

This seems to be working, with a simple stress test like:

ErrorLog("test.txt", string.Join(" ", Enumerable.Range(i++, 1000)));

Repeated a bunch of times. Variable i is just so I can see each write in order in the log.

The beauty is that I don't need to rewrite all my requests to be async and convert ErrorLog into a true async function. Which yeah would be ideal but it's too much code to modify today.

My concern is the last write. Though it does seem to complete before the AppDomain is torn down when the web request finishes, I don't think that's guaranteed in any way. I wonder if I need to do a t.Wait() at the end of each incoming web request that may write to the log, just to make sure the last log entry completes before the request ends.

Your issue is that you are not awaiting the Task result of the write, which means that the AppDomain can be torn down in the middle.

Ideally if you were just going to wait on the write, you would do this:

public static async Task ErrorLog(string file, string error)
{
    //using file system async - doesn't use thread pool
    using (var f = new FileStream(Path.Combine(HttpRuntime.AppDomainAppPath, "logs", file), FileMode.Append, FileAccess.Write, FileShare.None, bufferSize: 4096, useAsync: true))
    using (var sWriter = new StreamWriter(f))
    {
        await sWriter.WriteLineAsync($"### {error}");
    }
}
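Call sites would then await it. A sketch of what that looks like from a controller action (the action and logger class names here are illustrative, not from your code):

```csharp
// Hypothetical MVC action showing the call site; names are illustrative.
public async Task<ActionResult> SaveThing()
{
    try
    {
        // ... do the actual work ...
        return new HttpStatusCodeResult(200);
    }
    catch (Exception ex)
    {
        // Awaiting guarantees the write finishes before the request ends,
        // which addresses the "last write" concern directly.
        await Logger.ErrorLog("error.txt", ex.Message);
        throw;
    }
}
```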

However, this does not allow you to hand off the log writing without waiting. Instead you need to implement a BackgroundService and a queue of logs to write.

A very rough-and-ready implementation, using Channel<T> from the System.Threading.Channels package, would be something like this:

public class LoggingService : BackgroundService
{
    private readonly Channel<(string file, string error)> _channel = Channel.CreateUnbounded<(string file, string error)>();

    protected override async Task ExecuteAsync(CancellationToken token)
    {
        while(true)
        {
            try
            {
                var (file, error) = await _channel.Reader.ReadAsync(token);
                await WriteLog(file, error, token);
            }
            catch (OperationCanceledException)
            {
                break;
            }
        }
    }

    private async Task WriteLog(string file, string error, CancellationToken token)
    {
        using (var f = new FileStream(Path.Combine(HttpRuntime.AppDomainAppPath, "logs", file), FileMode.Append, FileAccess.Write, FileShare.None, bufferSize: 4096, useAsync: true))
        using (var sWriter = new StreamWriter(f))
        {
            await sWriter.WriteLineAsync($"### {error}".AsMemory(), token);
        }
    }

    public async Task QueueErrorLog(string file, string error)
    {
        await _channel.Writer.WriteAsync((file, error));
    }
}
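To use it, you'd register the service with the host and resolve it wherever you log. A sketch, assuming the Microsoft.Extensions.Hosting / DependencyInjection packages (LoggingService and QueueErrorLog are from the code above; the rest is illustrative):

```csharp
// Register LoggingService once as itself (so callers can queue entries) and
// again as the hosted service (so ExecuteAsync runs and drains the channel).
services.AddSingleton<LoggingService>();
services.AddHostedService(sp => sp.GetRequiredService<LoggingService>());

// Elsewhere, inject LoggingService and hand off a log entry. This only waits
// for the enqueue, never for the file I/O, and the single reader loop
// guarantees writes happen one at a time, in order.
await _loggingService.QueueErrorLog("test.txt", "something went wrong");
```

One caveat: on shutdown the host cancels the token passed to ExecuteAsync, so entries still sitting in the channel can be lost. If that matters, complete the writer and drain the remaining entries before exiting the loop.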
