
How to write multiple text files from a text file in C++?

I have a txt file that has 500,000 lines, and each line has 5 columns. I want to read the data from this file and write it into 5000 different txt files of 100 lines each, going from the first line of the input file to the last. The filenames should also carry the order number: "1_Hello.txt" holds lines 1 to 100, "2_Hello.txt" holds lines 101 to 200, and so on, up to "5000_Hello.txt", which holds lines 499901 to 500000.

I have used the following code to write fewer than 10 files, but how can I do the same for 5000 text files? Any help would be appreciated.

#include <iostream>
#include <fstream>
#include <string>
#include <vector>
using namespace std;

int main() {

    vector<string> VecData;
    string data;

    ifstream in("mytext.txt");
    while (in >> data) {
        VecData.push_back(data);
    }
    in.close();

    ofstream mynewfile1;
    char filename[] = "0_Hello.txt";

    int i, k = 0, l = 0;
    while (l < VecData.size()) {
        for (int j = '1'; j <= '3'; j++) {
            filename[0] = j;
            mynewfile1.open(filename);
            for (i = k; i < k + ((int)VecData.size() / 3); i += 5) {
                mynewfile1 << VecData[i] << "\t";
                mynewfile1 << VecData[i + 1] << "\t";
                mynewfile1 << VecData[i + 2] << "\t";
                mynewfile1 << VecData[i + 3] << "\t";
                mynewfile1 << VecData[i + 4] << endl;
            }
            mynewfile1.close();
            l = i;
            k += (int)VecData.size() / 3;
        }
    }

    cout << "Done\n";
    return 0;
}

You're working too hard – you don't need to read the entire input first, and you don't need to care about the structure of each line.

Read and write line-by-line, a hundred lines at a time.
Stop when there is nothing more to read.

Something like this should do it:

#include <fstream>
#include <iostream>
#include <string>

int main()
{
    std::ifstream in("mytext.txt");
    int index = 1;      // output files are numbered starting at 1
    std::string line;
    // Peek first so we stop before creating an empty extra file when the
    // line count is an exact multiple of 100.
    while (in.peek() != std::ifstream::traits_type::eof())
    {
        std::string name(std::to_string(index) + "_Hello.txt");
        std::ofstream out(name);
        for (int i = 0; i < 100 && std::getline(in, line); i++)
        {
            out << line << '\n';
        }
        index += 1;
    }
    std::cout << "Done\n";
}
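
As a quick sanity check after running it, you could count the generated files. This is only a sketch, assuming C++17 and that the output files are written to the current directory:

#include <filesystem>
#include <iostream>
#include <string>

int main()
{
    namespace fs = std::filesystem;
    int count = 0;
    for (const auto& entry : fs::directory_iterator("."))
    {
        // Count regular files whose name ends in "_Hello.txt".
        const std::string name = entry.path().filename().string();
        if (entry.is_regular_file() &&
            name.size() > 10 &&
            name.compare(name.size() - 10, 10, "_Hello.txt") == 0)
        {
            ++count;
        }
    }
    // With a 500,000-line input split into 100-line chunks, this should print 5000.
    std::cout << count << " output files\n";
}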

You've already gotten an answer, but I'll give an alternative that uses std::copy_n, std::istream_iterator and std::ostream_iterator.

This copies 100 lines at a time to the current output file.

I've added a class wrapping a std::string so I can provide my own streaming operators that read and write one line at a time.

#include <algorithm>
#include <fstream>
#include <iostream>
#include <iterator>
#include <string>

struct Line {
    std::string str;
};

std::istream& operator>>(std::istream& is, Line& l) {
    return std::getline(is, l.str);
}

std::ostream& operator<<(std::ostream& os, const Line& l) {
    return os << l.str << '\n';
}

int main() {
    if(std::ifstream in("mytext.txt"); in) {
        for(unsigned count = 1;; ++count) {

            // Constructing an istream_iterator makes it read one value from the stream
            // and if that fails, it'll set the eofbit on the stream.

            std::istream_iterator<Line> in_it(in);
            if(!in) break; // EOF - nothing more in the input file

            // Open the output file and copy 100 lines into it:
            if(std::ofstream out(std::to_string(count) + "_Hello.txt"); out) {
                std::copy_n(in_it, 100, std::ostream_iterator<Line>(out));
            }
        }
    }
}
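
As a side note, the Line wrapper is not tied to copy_n; since its operator>> reads one line per extraction, it works with any iterator-based construct. Here is a small sketch (reusing the same Line definition, with the input file name just as an example) that reads a whole file into a vector of lines:

#include <fstream>
#include <iostream>
#include <iterator>
#include <string>
#include <vector>

struct Line {
    std::string str;
};

std::istream& operator>>(std::istream& is, Line& l) {
    return std::getline(is, l.str);
}

int main() {
    std::ifstream in("mytext.txt");
    // Range-construct a vector from the stream: one element per line.
    std::vector<Line> lines{std::istream_iterator<Line>(in),
                            std::istream_iterator<Line>()};
    std::cout << lines.size() << " lines read\n";
}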
