
std::istream::get efficiency

A C++ question.

for (int i = 0; i < 10000; ++i) {
    cout << myfile.get();
}

Will this program make 10,000 I/O operations on the file on the HDD (given that the file is larger)? If so, maybe it is better to read, say, 512 bytes into some buffer, take the characters one by one from there, then copy the next 512 bytes, and so on?

As others have said - try it. Tests I've done show that reading a large block in one go (using streams) can be up to twice as fast as depending solely on the stream's own buffering. However, this is dependent on things like buffer size and (I would expect) stream library implementation - I use g++.

Your operating system will cache the file, so you don't need to optimize this for normal use.

ifstream is buffered, so no.

Try it.

However, in many cases, the fastest option will be to read the whole file at once, and then work on the in-memory data.

But really, try out each strategy, and see what works best.

Keep in mind, though, that regardless of the underlying file buffering mechanism, reading one byte at a time is slow. If nothing else, it calls the fairly slow IOStreams library 10,000 times, when you could have done just a couple of calls.
