
Merging multiple txt files into CSV with Python

I'm very new to Python. I need to merge a large number of txt files from a single directory into one csv file. The text from each file needs to be converted into separate rows and columns (five columns in each file, N number of rows). I used the code from this question:

import os
import csv

dirpath = 'path_of_directory'
output = 'output_file.csv'
with open(output, 'w', newline='') as outfile:   # newline='' avoids blank lines in the csv on Windows
    csvout = csv.writer(outfile)
    csvout.writerow(['FileName', 'Content'])

    files = os.listdir(dirpath)

    for filename in files:
        with open(os.path.join(dirpath, filename)) as afile:
            csvout.writerow([filename, afile.read()])

It works for me, but it puts each file's entire content into a single table cell. I read through a lot of Q&As and couldn't figure out how to modify the code so that the content ends up in separate columns and rows.
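As a minimal sketch (the filename and sample content below are made up), csv.writer writes one cell per element of the list passed to writerow, so handing it the whole of afile.read() as a single element produces a single cell, while splitting each line produces one cell per value:

    import csv
    import io

    buf = io.StringIO()
    csvout = csv.writer(buf)

    content = "1 2 3 4 5\n6 7 8 9 10\n"        # stand-in for what afile.read() returns

    # Whole content as one list element -> everything lands in a single cell.
    csvout.writerow(['file1.txt', content])

    # One row per line, split into values -> one cell per value.
    for line in content.splitlines():
        csvout.writerow(['file1.txt'] + line.split())

    print(buf.getvalue())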

You have to read the five words on each line, looping over all the lines, and then write those five words along with the filename, as shown below. Hope that helps!

files = os.listdir(dirpath)

for filename in files:
    with open(os.path.join(dirpath, filename)) as afile:
        for line in afile:                       # iterate over the file line by line
            words = line.split()                 # split on whitespace; use line.split(',') for comma-separated values
            csvout.writerow([filename] + words)  # one csv row: the filename followed by the five values

Here line.split() returns the list of values on a line. With no argument it splits on whitespace; if the values are separated by commas instead, use line.split(',') to get the five values from each line.
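Putting it together, a complete version might look like the sketch below. It assumes the values are whitespace-separated and that dirpath and output point at your directory and output file; the Col1 to Col5 header names are placeholders, so rename them to match your data.

    import csv
    import os

    dirpath = 'path_of_directory'   # directory containing the txt files
    output = 'output_file.csv'      # csv file to create

    with open(output, 'w', newline='') as outfile:
        csvout = csv.writer(outfile)
        csvout.writerow(['FileName', 'Col1', 'Col2', 'Col3', 'Col4', 'Col5'])  # placeholder header

        for filename in os.listdir(dirpath):
            if not filename.endswith('.txt'):            # skip anything that is not a txt file
                continue
            with open(os.path.join(dirpath, filename)) as afile:
                for line in afile:
                    csvout.writerow([filename] + line.split())   # or line.split(',') for comma-separated input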
