
C# Filestream not blocking until write/read operation complete

I'm trying to write a class that will copy a file from one location to another and report progress. The problem that I'm having is that when the application is run, the progress will shoot from 0 to 100% instantly, but the file is still copying in the background.

    public void Copy(string sourceFile, string destinationFile)
    {
        _stopWatch.Start();

        _sourceStream = new FileStream(srcName, FileMode.Open);
        _destinationStream = new FileStream(destName, FileMode.CreateNew);

        read();
        //On a 500mb file, execution will reach here in about a second.
    }

    private void read()
    {
        int i = _sourceStream.Read(_buffer, 0, bufferSize);

        _completedBytes += i;

        if (i != 0)
        {
            _destinationStream.Write(_buffer, 0, i);

            TriggerProgressUpdate();

            read();
        }
    }

    private void TriggerProgressUpdate()
    {
        if (OnCopyProgress != null)
        {
            CopyProgressEventArgs arg = new CopyProgressEventArgs();
            arg.CompleteBytes = _completedBytes;

            if (_totalBytes == 0)
                _totalBytes = new FileInfo(srcName).Length;

            arg.TotalBytes = _totalBytes;

            OnCopyProgress(this, arg);
        }
    }

What seems to be happening is that FileStream is merely queuing the operations in the OS, instead of blocking until the read or write is complete.

Is there any way to disable this functionality without causing a huge performance loss?

PS. I am using test source and destination variables; that's why they don't match the arguments.

Thanks, Craig

I don't think it can be queuing the read operations... after all, you've got a byte array, and it will have data in it after the Read call - that data had better be correct. It's probably only the write operations which are being buffered.

You could try calling Flush on the output stream periodically... I don't know quite how far the Flush will go in terms of the various levels of caching, but it may well wait until the data has actually been written. EDIT: If you know it's a FileStream, you can call Flush(true), which will wait until the data has actually been written to disk.
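To illustrate the distinction (a sketch, assuming a FileStream field named _destinationStream as in the question):

```csharp
// Flush() pushes the stream's internal buffer to the operating system,
// but the OS may still hold the data in its own write cache.
_destinationStream.Flush();

// Flush(flushToDisk: true) is specific to FileStream: it additionally asks
// the OS to commit its cached data for this file to disk before returning.
_destinationStream.Flush(true);
```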

Note that you shouldn't do this too often, or performance will suffer significantly. You'll need to balance the granularity of progress accuracy with the performance penalty for taking more control instead of letting the OS optimize the disk access.

I'm concerned about your use of recursion here - on a very large file you may well blow up with a stack overflow for no good reason. (The CLR can sometimes optimize tail-recursive methods, but not always). I suggest you use a loop instead. That would also be more readable, IMO:

public void Copy()
{
    int bytesRead;
    while ((bytesRead = _sourceStream.Read(_buffer, 0, _buffer.Length)) > 0)
    {
        _destinationStream.Write(_buffer, 0, bytesRead);
        _completedBytes += bytesRead;
        TriggerProgressUpdate();
        if (someAppropriateCondition)
        {
            _destinationStream.Flush();
        }
    }
}

I hope you're disposing of the streams somewhere, by the way. Personally I try to avoid having disposable member variables if at all possible. Is there any reason you can't just use local variables in a using statement?
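For example, the same copy loop could be sketched with local streams in using statements (the buffer size here is just an illustrative value, and _completedBytes/TriggerProgressUpdate are taken from the question's code):

```csharp
public void Copy(string sourceFile, string destinationFile)
{
    byte[] buffer = new byte[81920]; // arbitrary buffer size for illustration

    using (FileStream source = new FileStream(sourceFile, FileMode.Open))
    using (FileStream destination = new FileStream(destinationFile, FileMode.CreateNew))
    {
        int bytesRead;
        while ((bytesRead = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            destination.Write(buffer, 0, bytesRead);
            _completedBytes += bytesRead;
            TriggerProgressUpdate();
        }
    } // both streams are disposed here, even if an exception is thrown
}
```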

After investigating I found that using "FileOptions.WriteThrough" in a FileStream's constructor will disable write caching. This causes my progress to report correctly. It does, however, take a performance hit: the copy takes 13 seconds in Windows and 20 seconds in my application. I'm going to try to optimize the code and adjust the buffer size to see if I can speed things up a bit.
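For reference, a sketch of the constructor call being described (the buffer size is an assumed illustrative value, and destName is the field name from the question):

```csharp
// FileOptions.WriteThrough tells the OS to bypass its write cache, so each
// Write call only returns once the data has been sent to the device.
_destinationStream = new FileStream(
    destName,
    FileMode.CreateNew,
    FileAccess.Write,
    FileShare.None,
    4096, // internal stream buffer size; tune this for throughput
    FileOptions.WriteThrough);
```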
